A while ago I realized that I am a visual learner, which can be frustrating at times: some concepts take me a bit longer to fully understand until I create the proper mental image(s) for them (or somebody else does it for me).

When I started wrapping my head around async programming in Rust, I felt like I was missing some of those images. What follows is my attempt to visualize the concepts around async programming.

Traditional threaded applications

When creating multi-threaded applications, if you want to execute multiple tasks at the same time you need as many threads as tasks, giving you a 1-to-1 mapping between tasks and threads. Let’s imagine, for example, that we want to execute 3 tasks:

use std::thread;

pub fn main() {
    let coffee_handle = thread::spawn(|| make_coffee_task());
    let time_handle = thread::spawn(|| waste_time_task());
    let laundry_handle = thread::spawn(|| do_laundry_task());
    // wait for all three threads to complete
    coffee_handle.join().unwrap();
    time_handle.join().unwrap();
    laundry_handle.join().unwrap();
}

fn make_coffee_task() {
    // Mocha time!
    println!("Coffee done");
}

fn do_laundry_task() {
    // Mostly gym clothing so it should be quick!
    println!("Laundry done");
}

fn waste_time_task() {
    // Checks twitter...
    println!("Done wasting time!");
}

It might look like this at the Operating System (OS) level 1:

During the execution of these threads, any of them can become blocked (unable to continue doing work), or it can voluntarily yield (the programmer knew the thread would have nothing to do for some time under a certain condition). At that point the OS has to save the state of the running thread (so that it can resume running later) and select another one to start or continue running. This is called a thread context switch.

For a variety of reasons2, this context switching carries overhead that penalizes performance. Additionally, the approach doesn’t scale well: the more tasks we want to run at the same time, the more threads we need, until we eventually run out of resources.

Async

With asynchronous code, instead of using one thread per task, we can run multiple tasks concurrently on the same OS thread.

Because multiple tasks run on the same OS thread, the cost of thread context switching drops considerably (there are fewer threads to switch between). Furthermore, since we’re reusing the same OS threads for multiple tasks, we end up using significantly fewer resources.

For both of those reasons our Async counterpart has the potential of being much faster:

use futures; // Using futures = "0.3.1"
use std::thread;

async fn async_main() {
    let coffee_future = make_coffee_task();
    let laundry_future = do_laundry_task();
    let waste_future = waste_time_task();

    // Running the futures concurrently within the same thread
    futures::join!(coffee_future, laundry_future, waste_future);
}

fn main() {
    futures::executor::block_on(async_main());
}

async fn make_coffee_task() {
    // Mocha time!
    println!("Using thread Id: {:?} ", thread::current().id());
    println!("Coffee done! \n");
}

async fn do_laundry_task() {
    // Mostly gym clothing so it should be quick!
    println!("Using thread Id: {:?} ", thread::current().id());
    println!("Laundry done! \n");
}

async fn waste_time_task() {
    // Checks twitter...
    println!("Using thread Id: {:?} ", thread::current().id());
    println!("Done wasting time! \n");
}

This code is meant to run cooperatively on a single thread: each task keeps the thread until it reaches an await point and hands control back to the executor.

In the next part we’ll be exploring the Future trait and the surrounding Async programming ecosystem further.

Footnotes: