Coroutines and async/await

Now that you’ve gotten a brief introduction to Rust’s async model, it’s time to take a look at how this fits in the context of everything else we’ve covered in this book so far.

Rust’s futures are an example of an asynchronous model based on stackless coroutines, and in this chapter, we’ll take a look at what that really means and how it differs from stackful coroutines (fibers/green threads).

We’ll center everything around an example based on a simplified model of futures and async/await and see how we can use that to create suspendable and resumable tasks just like we did when creating our own fibers.

The good news is that this is a lot easier than implementing our own fibers/green threads since we can stay in Rust, which is safer. The flip side is that it’s a little more abstract and ties into programming language theory as much as it does computer science.

In this chapter, we’ll...

Technical requirements

The examples in this chapter will all be cross-platform, so the only thing you need is Rust installed and the repository that belongs to the book downloaded locally. All the code in this chapter will be found in the ch07 folder.

We’ll use delayserver in this example as well, so you need to open a terminal, enter the delayserver folder at the root of the repository, and write cargo run so it’s ready and available for the examples going forward.

Remember to change the ports in the code if, for some reason, you have to change the port delayserver listens on.

Introduction to stackless coroutines

So, we’ve finally arrived at the point where we introduce the last method of modeling asynchronous operations in this book. You probably remember that we gave a high-level overview of stackful and stackless coroutines in Chapter 2. In Chapter 5, we implemented an example of stackful coroutines when writing our own fibers/green threads, so now it’s time to take a closer look at how stackless coroutines are implemented and used.

A stackless coroutine is a way of representing a task that can be interrupted and resumed. If you remember all the way back in Chapter 1, we mentioned that if we want tasks to run concurrently (be in progress at the same time) but not necessarily in parallel, we need to be able to pause and resume the task.

In its simplest form, a coroutine is just a task that can stop and resume by yielding control to either its caller, another coroutine, or a scheduler.
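
To make that idea concrete, here's a tiny, hand-rolled sketch (it's not from the book's repository, just an illustration) of a task that pauses by returning control to its caller and remembers where it left off in an enum:

// A hypothetical, minimal coroutine: an enum remembers where the task
// was paused, and `resume` runs it until the next yield point, then
// hands control back to the caller.
#[derive(Clone, Copy)]
enum Counter {
    Start,
    Yielded(i32),
    Done,
}

impl Counter {
    fn resume(&mut self) -> Option<i32> {
        match *self {
            Counter::Start => {
                *self = Counter::Yielded(1);
                Some(1) // yield the first value and pause
            }
            Counter::Yielded(n) if n < 3 => {
                *self = Counter::Yielded(n + 1);
                Some(n + 1) // yield the next value and pause
            }
            _ => {
                *self = Counter::Done;
                None // the task has run to completion
            }
        }
    }
}

fn main() {
    let mut task = Counter::Start;
    // The caller decides when the task gets to make progress.
    while let Some(n) = task.resume() {
        println!("task yielded {n}");
    }
}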

Many languages will have a coroutine implementation...

An example of hand-written coroutines

The example we’ll use going forward is a simplified version of Rust’s asynchronous model. We’ll create and implement the following:

  • Our own simplified Future trait (a rough sketch follows this list)
  • A simple HTTP client that can only make GET requests
  • A task we can pause and resume implemented as a state machine
  • Our own simplified async/await syntax called coroutine/wait
  • A homemade preprocessor to transform our coroutine/wait functions into state machines the same way async/await is transformed
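
To give you a rough idea of what the first item can look like, here's a minimal sketch of such a simplified trait. The names PollState, Ready, and NotReady are assumptions made for illustration, not necessarily the exact definitions you'll find in the book's repository:

// A minimal sketch of a simplified Future trait (the names are assumptions).
pub trait Future {
    type Output;
    // Drive the task one step forward; it either finishes with a value
    // or reports that it's not ready yet.
    fn poll(&mut self) -> PollState<Self::Output>;
}

pub enum PollState<T> {
    Ready(T),
    NotReady,
}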

So, to actually demystify coroutines, futures, and async/await, we will have to make some compromises. If we didn’t, we’d end up re-implementing everything that is async/await and futures in Rust today, which is too much for just understanding the underlying techniques and concepts.

Therefore, our example will do the following:

  • Avoid error handling. If anything fails, we panic.
  • Be specific and not generic. Creating...

async/await

The previous example could simply be written as the following using async/await keywords:

async fn async_main() {
    println!("Program starting");
    let txt = Http::get("/1000/HelloWorld").await;
    println!("{txt}");
    let txt2 = Http::get("/500/HelloWorld2").await;
    println!("{txt2}");
}

That’s seven lines of code, and it looks very similar to code you’d write in a normal subroutine/function.

It turns out that we can let the compiler write these state machines for us instead of writing them ourselves. Not only that, we could get very far just using simple macros to help us, which is exactly how the current async/await syntax was prototyped before it became a part of the language. You can see an example of that at https://github.com/alexcrichton/futures-await.
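
To get a rough feel for what such a state machine looks like, here's a simplified, self-contained sketch of how async_main above could be rewritten by hand. The Future trait, the PollState enum, and the stubbed-out Http client are assumptions made so the sketch compiles on its own; the book's actual example differs in the details:

// A rough sketch of the kind of state machine async_main can be rewritten
// into by hand. Everything here (trait, enum, and the stubbed Http client)
// is an assumption made so the sketch compiles on its own.
trait Future {
    type Output;
    fn poll(&mut self) -> PollState<Self::Output>;
}

enum PollState<T> {
    Ready(T),
    NotReady,
}

// Stand-in for the chapter's HTTP client: it resolves immediately with a
// canned string instead of performing a real GET request.
struct HttpGetFuture {
    path: String,
}

struct Http;

impl Http {
    fn get(path: &str) -> HttpGetFuture {
        HttpGetFuture { path: path.to_string() }
    }
}

impl Future for HttpGetFuture {
    type Output = String;
    fn poll(&mut self) -> PollState<String> {
        PollState::Ready(format!("response for {}", self.path))
    }
}

// The async function becomes an enum with one state per .await point,
// plus a starting state and a resolved state.
enum AsyncMain {
    Start,
    Wait1(HttpGetFuture),
    Wait2(HttpGetFuture),
    Resolved,
}

impl Future for AsyncMain {
    type Output = ();

    fn poll(&mut self) -> PollState<()> {
        loop {
            match self {
                AsyncMain::Start => {
                    println!("Program starting");
                    let fut = Http::get("/1000/HelloWorld");
                    *self = AsyncMain::Wait1(fut);
                }
                AsyncMain::Wait1(fut) => match fut.poll() {
                    PollState::Ready(txt) => {
                        println!("{txt}");
                        let next = Http::get("/500/HelloWorld2");
                        *self = AsyncMain::Wait2(next);
                    }
                    PollState::NotReady => return PollState::NotReady,
                },
                AsyncMain::Wait2(fut) => match fut.poll() {
                    PollState::Ready(txt) => {
                        println!("{txt}");
                        *self = AsyncMain::Resolved;
                        return PollState::Ready(());
                    }
                    PollState::NotReady => return PollState::NotReady,
                },
                AsyncMain::Resolved => panic!("polled a resolved future"),
            }
        }
    }
}

fn main() {
    let mut task = AsyncMain::Start;
    // A trivial "runtime": keep polling the task until it resolves.
    while let PollState::NotReady = task.poll() {}
}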

The downside is of course that these...

c-async-await—concurrent futures

Okay, so we’ll build on the last example and do just the same thing. Create a new project called c-async-await and copy Cargo.toml and everything in the src folder over.

The first thing we’ll do is go to future.rs and add a join_all function below our existing code:

ch07/c-async-await/src/future.rs

pub fn join_all<F: Future>(futures: Vec<F>) -> JoinAll<F> {
    let futures = futures.into_iter().map(|f| (false, f)).collect();
    JoinAll {
        futures,
        finished_count: 0,
    }
}

This function takes a collection of futures as an argument and returns a JoinAll<F> future.
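
The JoinAll type itself isn't shown above. A minimal sketch of what it and its Future implementation could look like, assuming the simplified Future trait and PollState enum that already live in future.rs, might be the following (the output type and the details are assumptions for illustration):

// A sketch of the JoinAll type and its Future implementation. It assumes
// the simplified Future trait and PollState enum defined earlier in
// future.rs; the output type and the details are assumptions.
pub struct JoinAll<F: Future> {
    // Each entry pairs a "finished" flag with the original future.
    futures: Vec<(bool, F)>,
    finished_count: usize,
}

impl<F: Future> Future for JoinAll<F> {
    type Output = ();

    fn poll(&mut self) -> PollState<Self::Output> {
        for (finished, fut) in self.futures.iter_mut() {
            if *finished {
                continue;
            }
            // Drive every unfinished child future one step forward.
            if let PollState::Ready(_) = fut.poll() {
                *finished = true;
                self.finished_count += 1;
            }
        }
        // JoinAll resolves only once every child future has resolved.
        if self.finished_count == self.futures.len() {
            PollState::Ready(())
        } else {
            PollState::NotReady
        }
    }
}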

The function simply creates a new collection. In this collection, we will have tuples consisting of the original futures we received and a bool value indicating whether the future...

Final thoughts

Before we round off this chapter, I want to point out that it should now be clear to us why stackless coroutines aren’t really pre-emptable. If you remember back in Chapter 2, we said that stackful coroutines (such as our fibers/green threads example) can be pre-empted and their execution can be paused at any point. That’s because they have a stack, and pausing a task is as simple as storing the current execution state to the stack and jumping to another task.

That’s not possible here. The only places we can stop and resume execution are at the pre-defined suspension points that we manually tagged with wait.

In theory, if you have a tightly integrated system where you control the compiler, the coroutine definition, the scheduler, and the I/O primitives, you could add additional states to the state machine and create additional points where the task could be suspended/resumed. These suspension points could be opaque to the user and treated differently...

Summary

Good job! In this chapter, we introduced quite a bit of code and set up an example that we’ll continue using in the following chapters.

So far, we’ve focused on futures and async/await to model and create tasks that can be paused and resumed at specific points. We know this is a prerequisite for having tasks that are in progress at the same time. We did this by introducing our own simplified Future trait and our own coroutine/wait syntax. They’re far more limited than Rust’s futures and async/await syntax, but they’re easier to understand and give a clearer mental model of how this works in contrast to fibers/green threads (at least I hope so).

We have also discussed the difference between eager and lazy coroutines and how they impact how you achieve concurrency. We took inspiration from Tokio’s join_all function and implemented our own version of it.

In this chapter, we simply created tasks that could be paused and resumed. There are no event...
