
Python Tutorial: AsyncIO - Complete Guide to Asynchronous Programming with Animations

By Corey Schafer

Summary

Topics Covered

  • Async Doesn't Mean Faster
  • Create Tasks for Concurrency
  • Blocking Code Kills Concurrency
  • Task Groups Over Gather
  • Profile to Pick Async Strategy

Full Transcript

Hey there. How's it going, everybody? In this video, we're going to be learning all about asyncio in Python. asyncio is Python's built-in library for writing concurrent code, and it can seem a bit intimidating at first with all the different terminology and moving parts. But by the end of this tutorial, I'm hoping that you'll have a solid understanding of how it works and when to use it.

Now, in this video, I'm going to be covering a lot. We're going to be learning how asyncio actually works under the hood. We'll see visually what's happening with some animations that I've put together. We'll see how to update an existing codebase to use asyncio. We're going to be able to determine when and where to use asyncio with some simple profiling. And we'll discuss when to choose async versus threads versus multiprocessing. The code and the animations that I'm going to use in this video are going to be available on my GitHub and website after this is out. So, if you all want to follow along with those yourselves, then I'll leave links to those in the description section below.

Now, I'm going to get to some code as soon as I can, but there are a few basics that we have to get out of the way first.

asyncio is a Python library for writing concurrent code using the async/await syntax. And I want to mention that we're using the latest version of Python in this video, and I'll be teaching the current way to do things. asyncio has evolved quite a bit over the years. There have been different ways of running the event loop, of scheduling and running tasks, and all kinds of other changes. But we're going to focus on the modern ways that you should be using today.

Now, before we dive too deep, let's talk about what concurrency actually is. With synchronous code execution, which is what we normally write, one thing happens after another. So, it's kind of like going to a Subway restaurant, where you put in your order and they make your entire sandwich from start to finish before moving on to the next customer. But with concurrent code, it's more like going to a McDonald's, where someone just takes your order and then moves on to the next customer while your food is being made in the background. And that difference, when applied to code, can be really confusing at first for a lot of people.

So here's something that I think is important to understand pretty early on.

So asynchronous doesn't automatically mean faster. It just means that we can do other useful work instead of sitting idly by while waiting for things like network requests, database queries, and stuff like that. That's why asyncio excels at what are called I/O-bound tasks, which are any time your program is waiting for something external. Now, asyncio is single-threaded and runs in a single process. It uses what's called cooperative multitasking, where tasks voluntarily give up control. For CPU-bound tasks that need heavy computation, you'd want to use processes instead. And we'll see how to tell the difference between I/O-bound and CPU-bound here in a bit. But before I lose your attention with too much theory early on, let's go ahead and jump into some code, and we'll learn the basic terminology that we need to know as we go.

So, let me walk through this code and explain the terminology as we go. Now, the terminology is what really trips a lot of people up when they first start learning asyncio, because there's quite a bit to remember. Right now we have some asynchronous code here that we'll walk through and explain, but for now you can see that we have a simple synchronous function right here at the top, and that just sleeps for a bit and then returns a string. We're running this synchronous function inside of our main function here. Our main function is asynchronous, and you can see that it has this async keyword. Since it's an asynchronous function, we can't just call it directly. In order to run our main function, we have to start what is called an event loop. And we're doing this down here at the bottom with asyncio.run, passing in that main async function.

So the event loop is basically the engine that runs and manages asynchronous functions. Think of it a bit like a scheduler: it keeps track of all our tasks, and when a task is suspended because it's waiting for something else, control returns to the event loop, which then finds another task to either start or resume. So we have to be running an event loop for any of our asynchronous code to work. That's what this asyncio.run function is responsible for: it gets the event loop running, runs tasks until they're marked as complete, and then closes down the event loop when it's done. So let me go ahead and run the code that we have

right now. And we can see that we're just printing out that this is a synchronous function, and then we get that result. So we came in here, ran this event loop, this is an asynchronous main function, and we are just calling this synchronous code here within our async function for now. But we don't want to just run synchronous functions inside of our event loop. We want to use concurrency.

We're going to be seeing this await keyword a lot. So, let me uncomment the futures code here so that we can talk about awaitables. You can see here that we're using this await keyword. Awaitables are objects that implement a special __await__ method under the hood. You're going to see await everywhere in asynchronous code, and an object has to be awaitable for us to use that keyword on it. Now, why can't we await a synchronous function, like this sync function here, or something like time.sleep? Well, synchronous libraries don't have a mechanism to work with the event loop. They don't know how to yield control and resume later. So, basically, synchronous code like time.sleep or our synchronous function here doesn't have that underlying special __await__ method that it needs in order to pause its execution and start back up later. These things need to be coded to be compatible with asyncio. And that's why we can't await time.sleep, and we need to use something like asyncio.sleep instead.
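As a minimal sketch of what's being described (the function names here are just placeholders, not the exact code from the video), awaiting asyncio.sleep inside an async function works, while time.sleep returns None, which has no __await__ method and can't be awaited:

```python
import asyncio
import time


def sync_function():
    # A plain synchronous function: it blocks and cannot be awaited.
    time.sleep(1)
    return "sync result"


async def main():
    # asyncio.sleep returns an awaitable, so this suspends main()
    # and yields control back to the event loop for one second.
    await asyncio.sleep(1)

    # By contrast, `await time.sleep(1)` would sleep and then raise a
    # TypeError, because time.sleep returns None, which isn't awaitable.
    return sync_function()


result = asyncio.run(main())  # starts the event loop and runs main() to completion
print(result)
```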

And to use this await keyword, we also have to be within a function that has this async keyword. So if I remove async from our function there, now you can see that I'm getting a warning here. And if I hover over this... it's not letting me hover right now, but we can see that my Ruff warning here is telling me that await should only be used within an async function. So we have to be within an async function in order to use these await keywords.

Okay, so what does await actually do? When you await something, you're basically telling the event loop to pause the execution of the current function and yield control back to the event loop, which can then run another task. The current function stays suspended until the awaitable completes.

So in Python's asyncio, there are three main types of awaitable objects. First, there are coroutines, which are created when you call an async function. Second, there are tasks, which are wrappers around coroutines that are scheduled on the event loop. And third, there are futures, which are low-level objects representing eventual results. Now, if you're coming from somewhere like the JavaScript world, futures are a lot like promises in JavaScript: they're a promise of a result that will be available later. But unlike JavaScript, in Python we almost never work with futures directly. We write coroutines, and when we schedule them as tasks, asyncio uses futures under the hood to track those results, but we won't be seeing them much in this video. You're really only going to use futures directly if you're writing low-level asyncio code, like if you were building an async-compatible framework. But just to show you what they look like, let me run this futures example here really quick.
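The futures example being run is roughly along these lines (a sketch; the exact code isn't shown in the transcript, and in real code asyncio creates futures for you rather than you building them by hand):

```python
import asyncio


async def main():
    # Create a bare Future tied to the currently running event loop.
    loop = asyncio.get_running_loop()
    future = loop.create_future()
    print(future)  # shows the future in its pending state

    # Manually mark the future as finished by setting its result.
    future.set_result("future result test")

    # Awaiting the future hands back the result that was set.
    return await future


result = asyncio.run(main())
print(result)
```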

And just so we can see what's happening: a future's job is to hold a certain state and a result. The state can be pending, meaning the future doesn't have any result or exception yet. It can be cancelled, if it was cancelled using future.cancel(). Or it can be finished, either by a result being set with future.set_result() or by an exception being set with future.set_exception(). So you can see here, after we created this future and printed it out, it says that the future was pending. And then we set the result right here to 'future result test', got that result by awaiting it, and then printed it out. But like I said, this is lower-level stuff, and we won't be using it directly in this video. We're going to be working mostly with coroutines and tasks. So I'm going to delete this future example, and then we're going to look at coroutines and tasks.

So let me uncomment the coroutine example here. Coroutines are functions defined with the async def keywords. So main is a coroutine here; we have async def. If I go up here, we have this async function, and I have a comment here that says 'also known as a coroutine function'. So we have this async def async function, and within it, it's a lot like our synchronous function, but instead of using time.sleep, we are using asyncio.sleep, and we are awaiting that as well. So that's what this coroutine object is right here. So coroutines are basically functions whose execution we can pause, and there are actually two terms here that we need to understand.

There's the coroutine function, which is what we define with the async def keywords, and then there's the coroutine object, which is the awaitable that gets returned when you call that function. So this is the coroutine function here, and after we call that function, this is the coroutine object here. Coroutines are a bit like generators, in the sense that they can suspend execution and resume later, but they're designed to work with an event loop. They have extra features that asyncio needs to schedule them, await I/O, and coordinate multiple tasks. So, let me go ahead and run this here so that we can see what's happening.

Now, again, if you're a bit confused right now, I think all of this will make a lot more sense once we look at the animations that I've put together. But right now, we're just focusing on learning these terms so that we can understand a bit better what we're looking at once we get to those animations. So you can see here, when we executed this coroutine function, it doesn't run the body of that function. It didn't come in here and run that function's print statement before we got to this line here. It just created this coroutine object, and then we printed out that coroutine object, which is right here. So when we ran that coroutine function, we got this coroutine object, and to actually run this coroutine and get the result, we have to await it. When I await that coroutine object, we can see that that's when it runs the print statement from our asynchronous function, and then we got that result, 'async result test', and printed it out. Now, when we await a coroutine object directly like this, it's both scheduled on the event loop and run to completion at the same time.

Okay, so now let's look at tasks. I'm going to comment out this coroutine section here, and now let's look at tasks. Tasks are wrapped coroutines that can be executed independently. Tasks are how we actually run coroutines concurrently. When you wrap a coroutine in a task using asyncio.create_task, like we've done here, it's handed over to the event loop and scheduled to run whenever it gets a chance. The task will keep track of whether the coroutine finished successfully, raised an error, or got cancelled, just like a future would. And in fact, tasks are futures under the hood, but with extra logic to actually run the coroutine and do the work that we want to do. That's why we work with tasks instead of futures in most of our code. But unlike coroutine objects, tasks can be scheduled on the event loop and just sit there without being run until the loop gets control.
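That wrapping step looks roughly like this (a sketch with placeholder names and a placeholder result string):

```python
import asyncio


async def fetch_data():
    await asyncio.sleep(1)
    return "task result test"


async def main():
    # create_task wraps the coroutine in a Task and schedules it on the
    # event loop right away -- but it doesn't run until the loop gets control.
    task = asyncio.create_task(fetch_data())
    print(task)  # pending, showing the task's name and the wrapped coroutine

    # Awaiting the task lets the loop run it and gives us its return value.
    return await task


result = asyncio.run(main())
print(result)
```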

And this is the key to asyncio: you can queue up multiple tasks at once, and then the event loop will be able to run them whenever it's ready, letting them take turns while waiting on I/O. So let me go ahead and run this basic example here. You can see that when we printed out the task that we created, it shows that the task is pending, it shows us the name of the task, and it shows the coroutine that it's wrapping. And when we await that task, it runs that coroutine, we get those print statements, and we get that returned result.

Okay, so that does it for the terminology. Now let's see this in action with some specific examples and some animations. I don't know about you all, but I'm a very visual learner, so seeing this stuff in action helps me a lot more than just looking at the code. So first I'll show the Python code, and then we'll see what's happening under the hood with an animation.

So here is the Python code. Now, in this first example, we're not going to be using asyncio at all. This is just normal synchronous code, with no event loop or anything like that. So we should know exactly how this works. If we go down and look at the code here, we are setting the results equal to this main function here. And don't worry about the extra timing functionality here; that's just so we can see how long this takes. But then we're running that main function. The main function comes in here, and we are running this fetch data function with a value of one. Fetch data then comes in here, prints out that we're doing something with one, then it sleeps for that 1 second that we passed in, then it prints that we're done with one, and then returns the result of one. And then, after that returns, that return value gets set to that variable there. Then we print that fetch one is fully complete. And then we come here and set result two equal to fetch data two. That comes up here, says it's doing something with two, time-sleeps for two seconds, prints that it's done with two, then returns the result of two to this result two here. We print out that fetch two is fully complete, and then we return both of those results. So this is fully synchronous code; we should know how this works. Let me go ahead and run it.
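The synchronous version being walked through looks roughly like this (a sketch reconstructed from the narration; the exact print wording and the perf_counter timing are assumptions):

```python
import time


def fetch_data(param):
    print(f"Do something with {param}...")
    time.sleep(param)  # blocks the whole program for `param` seconds
    print(f"Done with {param}")
    return f"Result of {param}"


def main():
    result1 = fetch_data(1)
    print("Fetch 1 fully completed")
    result2 = fetch_data(2)
    print("Fetch 2 fully completed")
    return [result1, result2]


start = time.perf_counter()
results = main()
elapsed = time.perf_counter() - start
print(results)
print(f"Took {elapsed:.2f} seconds")  # about 3 seconds: 1s + 2s, one after the other
```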

And I added some timing code here that'll show us how long this took. So we can see it ran through that code synchronously, and it finished in 3 seconds. And that makes sense, because we're doing fetch data for 1 second and then we're doing fetch data for 2 seconds, so that's 3 seconds total. So now let me show this as an animation in the browser, so that we can get an idea of what these animations are going to look like. And I've taken the timing code out of these animation examples, so that we can just pay more attention to the actual code.

Okay, so let me go to example one here. Sorry about that. Now, hopefully this text is large enough for you to see; I made this as large as I could while everything can still fit on the screen. Now, if you're walking through these animations with me by using my website, or have downloaded them yourself, then I've set this up so that the right arrow key progresses to the next steps. And unfortunately, I didn't add any functionality to go backwards, so if you want the animation to run again, then you'll have to reload the page.

Okay, so like I said, all of this is synchronous code here, so we're just going to walk through this. It's going to see those functions there. Then we're going to run 'result equals main' there. It's going to go into the main function, and none of this is going to kick off anything on an event loop. We are going to run fetch data with the parameter of one. That's going to come in and run some print statements. We're going to do a time.sleep. Now, time.sleep is going to kick off some background I/O here and sleep for one second, and it's going to stay on this line for that entire second until that completes. Once that completes, then we can move forward, do our other print statements here, and return that. And then we're going to walk through 'result two is equal to fetch data two'. Same thing: we're printing out some text there, we're going to run this background I/O, it's going to sleep for two seconds now, and it's going to stay there until that completes. Once that is done, then we can come in and print out our other statements, and then we return our results one and two. And then we come down here and print the results that we got from main, which was just a list of those two results.

Okay, so that's synchronous code. That should be what we expect. But since it was synchronous, we were waiting around during those sleeps, when we could have been allowing other code to run. So now we might want to improve this performance and switch over to using asyncio. So let's move on to example two here and see how someone might go about doing this.

Now, in this example, we're going to see what a first attempt at converting our code to asynchronous might look like. But there's going to be a common mistake here. You can see that we've converted our functions into coroutines. So we have an async def fetch data here, our main is an async def coroutine here, and then we are running this in an event loop with asyncio.run.

So this is what someone's first attempt to go asynchronous might look like. We are setting our tasks here directly to our coroutine objects, fetch data 1 and fetch data 2, almost like we're just calling a function. This is very similar to example one, where we were setting the result equal to fetch data 1 and fetch data 2; that's what we're doing here with these 'tasks'. And then we're trying to get the result here by awaiting those coroutines directly. After we await one, we're printing out that task one is fully complete; after we await two, we're printing out that task two is fully complete. Now, we could have awaited these directly; I could have said that result one is equal to 'await fetch data one', like that. But I broke them up here so that we can see the coroutine object first and then await it separately.

Okay, so let's run this and see if it works. So we run this code, and we can see that it still takes 3 seconds. We're not getting any concurrency benefit here at all. So why is that? Well, some people have a misconception that when you call a coroutine function, like we did here, it creates a task and schedules it. But it doesn't. It just creates the coroutine object. So when we await that coroutine object, we're scheduling it and running it to completion at the same time. We get no concurrency here and no benefit from using asyncio. So, let me show you what this looks like in an animation that I put together, and I think this will make more sense.

So, here we have that same code that we had before. So, we're going to run through

before. So, we're going to run through this. Now, once we get to results equals

this. Now, once we get to results equals async io.run main that is going to

async io.run main that is going to create our event loop. So, now in our event loop, right now we have one co-ine. We have this main coine that is

co-ine. We have this main coine that is going to run. So now the code uh whenever I step forward here it's going to run from this co-outine here. So as I

go through this now we're saying task is equal to fetch data 1 that's going to create a co-outine object. Task two

that's going to be another co-outine object. And now when we do result one

object. And now when we do result one equal to await task one that is going to schedule that on our event loop and run it to completion at the same time. So if

I step forward here then our main co-outine is suspended. So await is what suspended our main co-ine and now our

event loop is looking for tasks that are ready. Now we just scheduled that fetch

ready. Now we just scheduled that fetch data one task uh whenever we ran this co- routine directly. So our event loop is going to see that and say okay I have

a ready task here. So, let me go in here and run this until I hit an await. So,

now it's going to come in here. It's

going to print that we're doing something with one and then we are going to await that sleep. And once we hit that, it's going to suspend our current

task. And it's going to uh be suspended

task. And it's going to uh be suspended until async.io.sleep

until async.io.sleep

is complete. So that's going to kick off our background IO here with our timer.

And then that's going to suspend. And

now that's going to stay suspended until our timer is complete. Once that timer is complete, then this is going to wake up this task and say, "Hey, this what

you were awaiting here is complete. So

now you're going to be ready to run again." So now that completed, now we're

again." So now that completed, now we're ready to run again. Our event loop is going to go through. see that we have a ready task here and then it's going to go back in and it's going to continue

running from where it left off. So then

we're going to go through here, print out these other statements and finally we're going to return here and once we return now this task is complete. this

fetch data one task is complete and that is going to wake up our main co-outine here because now this task one that we

were waiting for is now complete. So now

main is ready to run again. Our event

loop is going to see that. It's going to come in here and run it and now print out task one fully complete. And now

we're going to do the same thing with await task two. It's going to suspend our main co-outine. It's going to uh schedule our task two here onto our

event loop and also run it to completion. So, we're coming in here and

completion. So, we're coming in here and we are printing these out. This await

async io.sleep here is going to suspend this task until this async.io.sleep is

done. So, it kicks off that timer. It

suspends our task. That timer eventually is going to complete. Once that's

complete, then our task can wake up here. And now it's ready to run. The

here. And now it's ready to run. The

event loop is going to see that. Run it

where it left off at this await statement. And then we're going to walk

statement. And then we're going to walk through our other print statements here.

Hit our return statement. And that's

going to complete that task. And now

since task two is complete, and that's what we were awaiting there. Now our t now our main co-outine is ready to run again. So now our event loop's going to

again. So now our event loop's going to see that we have a ready task here, come in and finish printing these out. And

then finally it will return that main co- routine and close all of that down.

And then back here in our main Python code, uh, we get those results and we can print those out. So I hope that that made sense. Uh, let me reload this

made sense. Uh, let me reload this really quick. I won't go through the

really quick. I won't go through the entire animation again, but the one thing that I really want you to catch on to here is that whenever we created

these co-outines here, um they are not scheduling any tasks on our event loop.

They are just returning a co-outine object and then here when we await those co-outine objects directly, we are both scheduling them and running them to

completion at the same time. Uh so that is why we are not getting concurrency.
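That common first attempt, sketched out (placeholder names; the real code is in the video's linked repo):

```python
import asyncio
import time


async def fetch_data(param):
    print(f"Do something with {param}...")
    await asyncio.sleep(param)
    print(f"Done with {param}")
    return f"Result of {param}"


async def main():
    # The mistake: calling a coroutine function only creates coroutine
    # objects -- nothing is scheduled on the event loop yet.
    task1 = fetch_data(1)
    task2 = fetch_data(2)

    # Awaiting each coroutine object schedules it AND runs it to
    # completion, one after the other -- so this is still sequential.
    result1 = await task1
    print("Task 1 fully completed")
    result2 = await task2
    print("Task 2 fully completed")
    return [result1, result2]


start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results, f"in {elapsed:.2f}s")  # still about 3 seconds -- no concurrency
```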

We only have one task down here that is ever running at a time, so basically we have the same performance that we had when we ran our synchronous code.

So let me go back to our code here, and now let's look at example three. And this is going to be a look at one of the correct ways to run asynchronous code. Now, the only thing that we've changed here from the previous example is that now we are creating tasks from these coroutines using asyncio.create_task, instead of just calling those coroutines directly. When we create a task, it schedules the coroutine to run on the event loop. This is the part that we were missing from the previous example. So now, if I run this, we can see in our output that 'do something with one' and 'do something with two' ran one after another, without the first task finishing first. Before task one was even able to print that it was fully complete, it immediately came in here and said, okay, I'm going to do something with one, I'm going to do something with two. And then, since our number one task only slept for 1 second, it completed first, and our second task completed after that, since it's sleeping for 2 seconds. And we can see here that our script took 2 seconds in total. And that is because it ran both of those at the same time, and the total time is simply how long the longest-running task took, which was two seconds. So we did get concurrency here. So let's see this in an animation, so that we can see exactly what that looks like.

Okay. So now, here we can see that I have the code that creates the tasks. So let me walk through this. Again, we're going to get to this line here where we are running our event loop, and we're going to run this main coroutine. So that main coroutine is now running on our event loop. And then, when we get to this point here, where we create this task with fetch data one, that is going to schedule that task on the event loop. So now we can see that that task is scheduled and ready on the event loop. And now, in our main coroutine here, we're still going forward, and with task two, with asyncio.create_task, that is also going to schedule fetch data two on our event loop. So now we can see that that gets scheduled as well, and both of those are ready.

And now, when we get to this line here, 'result one equals await task one', this await is going to yield control over to the event loop, and it's going to suspend our main coroutine until this task one is complete. So if we go forward with that, our main coroutine suspends here, and now our event loop is going to look for any ready task. It's going to see that we have fetch data one ready, so it's going to come in, run it, and then hit an await statement here. And now it's going to suspend this task until asyncio.sleep is done. So it's going to kick off that asyncio.sleep here in the background, and it's going to suspend that task.

Now, this is where we get concurrency, since we had both of these scheduled. Our event loop is going to keep looking for tasks that are ready. It's going to see that we have this fetch data two here, and now it's going to come in and run it. So we're going to do our print statements here, and we're going to hit our await, which is going to suspend our fetch data two coroutine, and it's going to be suspended until the asyncio.sleep is done. And that's the one for two seconds. So it's going to kick that off, and then it's going to suspend. And now you can see the concurrency here: we have both of these timers running in the background. These are all going to stay suspended until something gets finished.

So, our first timer completes, and it's going to wake up our first task here. And now that that first task is ready, our event loop is going to find a ready task, pick up where it left off, print that we are done with one, and then return that value. And once that is complete, remember that our main coroutine is awaiting that task one. So as soon as this task one is complete, our main coroutine is ready to run again. So it's going to come in here and print out that task one is fully complete. We're going to await our task two. Now, task two is already suspended, so there's nothing really to do here other than to wait for this timer to finish. Once that timer is complete, it's going to wake up this fetch data two task. Our event loop is going to see that that's ready, come in where it left off, print out that we're done with two, and return the result. And now that task two is complete, it's going to tell our main coroutine that it's ready to run. Our event loop is going to see that and run from where it left off, print that task two is fully complete, and return a list of those results. And then that event loop is going to close down.

We have those results there. We can

print those out and we have everything there. So I hope that that example makes

there. So I hope that that example makes sense and now it makes sense why uh this code worked here by scheduling these tasks ahead of time uh instead of

whenever we awaited these co-outines directly because when we awaited these co-outines directly it didn't get scheduled until we hit this await statement. So we only had one task

statement. So we only had one task scheduled and run fully to completion here. Uh in our second example, we had

here. Uh in our second example, we had both of these tasks scheduled. And then

when we awaited, it was able to uh run our task one until it hit an await. And

then once we hit that await, then it was able to go through and see that we had another task that was ready and scheduled and it could run that as well.

Okay, so I hope that that makes sense.
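As a rough reconstruction of the pattern being walked through (the fetch_data name and the one- and two-second sleeps follow the video's example; the exact code is the author's, linked in the description):

```python
import asyncio

async def fetch_data(n: int, delay: float):
    print(f"doing something with {n}")
    await asyncio.sleep(delay)  # suspends this task so the event loop can run others
    print(f"done with {n}")
    return {"id": n}

async def main():
    # Scheduling both tasks up front is what makes them run concurrently
    task1 = asyncio.create_task(fetch_data(1, 1))
    task2 = asyncio.create_task(fetch_data(2, 2))
    result1 = await task1
    print("task one fully complete")
    result2 = await task2
    print("task two fully complete")
    return [result1, result2]

results = asyncio.run(main())
print(results)
```

Because both tasks are scheduled before the first await, the total runtime is about two seconds, the length of the longest sleep, rather than three.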

The more examples that we see, the clearer I think this is going to be. Now, in our fourth example, I want to show you something important about awaiting tasks and how things are actually run on the event loop. I haven't changed much with this code; it's basically all the same. All I changed from example three is the order of the awaits. In example three, I'm awaiting task one first and printing that task one is fully complete. In example four, I'm setting result two by awaiting task two first and printing that task two is fully complete, and then I'm awaiting task one and saying that task one is fully complete. Now, what do you think is going to happen when I run this? Some people might think that we're going to run task two first and then task one, or maybe run them both at the same time but have task two complete first. But let's see. Let's go ahead and run this and see what happens.

So, when I run this, the results might be a little confusing to some people. We still finished in 2 seconds, so it's still running concurrently. But if you look at the output, task one still runs first; there's no change there. The only difference is that it didn't print our "task two fully completed" statement until task two was completely done. So that's what I want you to take away from this specific example: when we await something, we're not guaranteeing that that particular part runs right at that moment. The event loop is going to run whatever is ready. What we are guaranteeing is that we're going to be done with what we awaited before moving on.

And actually, it doesn't even need to be one of these tasks that we await. For example, I could await asyncio.sleep instead, and that would also yield control to our event loop, and those tasks would still run in the same order; it just wouldn't move on until our asyncio.sleep is done. So let me do that and show you what I mean. I'm going to await asyncio.sleep. Since our longest sleep is 2 seconds, I'll just sleep here for 2.5 seconds. If I run this, it's going to be pretty much the same output, except now we don't have a second result, because await asyncio.sleep just returns None. So our result two there is None, and it finished in 2.5 seconds, since that's now the longest sleep that we have. But we can see that the order of execution basically remains the same. It still did something with one first, then two, got done with one, got done with two, and then once this awaited asyncio.sleep was finished, that's when we moved on to printing that task two is fully complete. So

that is what I wanted to show you with that example. Let me also show you this in an animation, just to really hammer that point home. I'll step through the first part of this pretty quickly now. We're going to get down to where we run our event loop with that main coroutine, and that is going to create our first task with task one and schedule it. Our task two, with create task, is going to get scheduled as well. And this time we are awaiting task two first. When we hit await, it's going to suspend this coroutine until task two is done. So we're suspending and yielding control back over to the event loop. The event loop is going to go and see what tasks are ready. Now, this uses a FIFO queue in the background, which is first in, first out. That's not super important, but what is important is knowing that what you're awaiting isn't always going to be the first thing that gets run. It's just going to be whatever the event loop has ready.

So right now, that is this fetch data one task. That is going to run until it hits its await statement, suspends itself, and kicks off that background sleep. And now we have another task ready. The event loop is going to see that and come in until that hits its await statement. It's going to kick off that background sleep and suspend itself until one of these timers is done. Then this timer is done here. It's going to wake up our first task. And now this is where something a little different happens from our previous example. The event loop is going to see that this is ready, come in, and pick up where it left off. We're going to print that we're done with one and return that result, and that task is going to complete. Before, we were awaiting task one here, so once task one was done, our main coroutine was going to say, okay, I'm ready to run again. But that's not what we awaited. We awaited task two. So what this is going to do is just save that result in memory for now. And we still have two suspended coroutines here. These are going to stay suspended until this timer completes. When it completes, it's going to wake up this second task, and our event loop is going to see that it's ready. It's going to come in, do its print statements that it's done with two, and return that result. Now our main coroutine gets woken up, since this task two is done. So now it's saying that it's ready, and we move forward with our print statements. We're going to print that task two was fully completed. When we do this await task one here, that's already been completed, so there's nothing left to do there. All it's going to do is pull that result from memory that it has saved. So it's just going to set that variable equal to that return value. We're going to print out that task one is fully completed, return a list of those results, close down the event loop, and move through and print out all of those. Okay, so I hope that this is making more and more sense as we're seeing more and more examples here.
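To make that ordering concrete, here's a rough reconstruction of example four (with shortened sleeps; the events list is added here just to record the order things happen):

```python
import asyncio

events = []

async def fetch_data(n: int, delay: float):
    events.append(f"start {n}")
    await asyncio.sleep(delay)
    events.append(f"done {n}")
    return n

async def main():
    task1 = asyncio.create_task(fetch_data(1, 0.1))
    task2 = asyncio.create_task(fetch_data(2, 0.2))
    result2 = await task2          # awaiting task2 first...
    events.append("task two fully complete")
    result1 = await task1          # ...but task1 already finished; its result is just retrieved
    events.append("task one fully complete")
    return [result1, result2]

asyncio.run(main())
# Task one still starts and finishes first, even though we awaited task two first
print(events)
```

The await only controls when main is allowed to move on, not which task the event loop runs first.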

Okay, so moving on to our next example. This one is going to be really important, so let me pull it up. In this example, we're going to see what happens if we block the event loop with synchronous blocking code. This is pretty much the same as the examples we've been looking at, where we are creating tasks, scheduling them on the event loop, and then awaiting them. But in our fetch data coroutine, instead of using asyncio.sleep, I'm using time.sleep. Now, time.sleep itself isn't awaitable, so I can't await it here. If I put in await, that's just going to throw an error. What we can do is run it inside of an asynchronous function like fetch data, schedule that on our event loop, and await that coroutine. But this is bad practice, and we're going to see exactly why. Like I was saying before, time.sleep isn't awaitable, and it wasn't coded to know how to suspend itself and yield control over to the event loop. So what happens if we put that blocking call there and then schedule and run that coroutine? Well, let's go ahead and see. I'm going to run this. We can see that it did something with one, did something with two, task one fully complete, task two fully complete, and we finished in 3 seconds. Since we finished in 3 seconds, we know that those didn't run concurrently. We can also see that it didn't start both of them at the same time either. We came in and did something with one, and it wasn't until we were done with one that it moved on to doing something with two. And that's because time.sleep blocks the event loop.
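A rough reconstruction of what example five looks like (the shape is assumed from the description):

```python
import asyncio
import time

async def fetch_data(n: int, delay: float):
    print(f"doing something with {n}")
    time.sleep(delay)  # blocking call: it never yields control back to the event loop
    print(f"done with {n}")
    return n

async def main():
    task1 = asyncio.create_task(fetch_data(1, 1))
    task2 = asyncio.create_task(fetch_data(2, 2))
    return [await task1, await task2]

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(f"finished in {elapsed:.1f} seconds")  # ~3 s: the sleeps ran back to back
```

Even though both tasks are scheduled up front, each blocking time.sleep holds the event loop hostage, so the total time is the sum of the sleeps.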

Now, that might not be obvious, and some people might think that just because that synchronous code was being run inside of a task, it would have somehow worked. But let me show you what's actually going on here and why this blocks. So

I'm going to pull up our fifth example here, and let's run through it to see what happens. I'm just going to go down to the part where we start our event loop and run that main coroutine. Within the event loop, we are creating and scheduling our tasks. And now we're getting to the point where we are awaiting task one. That's going to suspend our main coroutine, and it's not going to pick back up on the main coroutine until task one is complete. So I'll go ahead and move forward here. That's going to suspend. Our event loop is going to find a task that is ready to run. It's going to come in here into fetch data one and run down through it. We're going to print that we're doing something with one, and now we're going to get to this time.sleep. Here, it's going to kick off that background IO, but we never awaited here, so this task never got suspended. It's just going to sit here until this blocking code, this background IO, is done. So what we have here is a blocked event loop. Eventually that sleep is going to complete. There's nothing to wake up here, because we're still just running synchronous code. Now that that's complete, we can run forward with our task, say that we're done with one, and return that result.

Our main coroutine is going to wake up here. Now, again, like I was saying before, even though our main coroutine is now ready because task one was complete, that doesn't necessarily mean that it's the next thing the event loop is going to run. The event loop uses that FIFO, first-in-first-out, queue in the background. And since task two has been ready for a while, and was ready before main became ready again, the event loop is actually going to find task two first. This is where my animation is lacking a bit, because it doesn't show the FIFO order of the ready queue, but that's what's going on there. So it's going to see that our task two is ready here. We're

going to come in and run this. We're going to print out a couple of things, and we're going to get to this time.sleep. That time.sleep is going to kick off our background IO, but it does not know how to suspend itself, so it's just going to hang here until the sleep is complete. Once that sleep is complete, this can continue on, print that we're done with that, and return that result. Now, our main coroutine here has been ready for a while. Our event loop is going to see that, come in, print out the rest of these print statements, close down the event loop, and then we will print out our results there.

Now, let me restart this animation really quick and go back to a certain part of it. It's where we completed our first task, and now both of these tasks are ready. Like I said, this is a FIFO queue, so it's going to find this task number two first and run it. Now, you might be thinking that I should be emphasizing the order in which these run a bit more, but to be completely honest, you don't really want to get bogged down in what exactly is running on the event loop at any one time, or what's going to run next. Asyncio is really meant for running tens, possibly even hundreds, of things concurrently, and the event loop is going to handle everything that is ready. When we run real asynchronous code, it's not going to be cut-and-dried examples like we have here, where we know exactly when most tasks will be finished and when others will be ready. We don't have control over that, and we shouldn't want control over that. The event loop is just going to do its job. What we do have control over is not moving forward until something is done. So if I didn't want this to move forward until task two was done, then I would put an await task two there. But in terms of whether this goes back and runs the main coroutine or fetch data first, that shouldn't really matter that much to us. Let the event loop run whatever tasks are ready. And if we really want to enforce anything, it'll be that we enforce exactly when something is done, not exactly when it gets its turn

in the event loop. So, I hope that that makes sense. I just kind of thought of that as I was walking through that example last time. But with that said, let me go back to our other examples. This example five, the one that we just saw, where we have this time.sleep blocking our event loop: this is exactly what happens when we run any blocking code in our asynchronous functions. While we're using time.sleep in this example, this could easily be any other code. It could be requests.get making a web request, or any other synchronous code, because the requests library isn't asynchronous. To do web requests asynchronously, we would need to use an asynchronous library like HTTPX or aiohttp. But if we do have some blocking synchronous code that doesn't have an asyncio alternative, then we can also use asyncio to pass it off to threads or processes, and the event loop will manage those threads and processes for us. That's what we're going to see in the next example. So let

me open up example six here. Now, this example is going to be a bit more advanced, but I want to show it, since it might be something that you'll see or even need to do when working with asynchronous code. So, right off the bat, let me explain a couple of the changes that I made that are specific to this example, since we're using threads and processes. First, instead of running our synchronous blocking code inside of an asynchronous function, which we don't want to do, I've turned our fetch data function back into a regular, non-async function. So it's just a regular synchronous function. We'll use asyncio to pass this regular function to a thread, and then we'll also see an example of how to pass it to a process.

Now, you'll notice that with these print statements, I put a flush=True argument in there. That's just to make sure that our print statements come out in the order that we expect. Sometimes, when running these outside of our current thread, print statements can get buffered and come back in a seemingly weird order. So that's more just for the tutorial. Another thing, down here at the bottom where I'm starting up our event loop: you'll notice that I'm also using this if __name__ == "__main__" conditional, and this is for our multiprocessing example. When Python spawns multiple processes, it needs to rerun our script in that new process, so this check makes sure that we don't end up in an infinite loop whenever it runs our code and spawns that new process. Okay, so with that said, let's

process. Okay. So with that said, let's see how we can do this here. So we still have our asynchronous uh main function here. And this is going to look very

here. And this is going to look very similar to what we were doing before. Uh

except when we're creating our task, we're simply wrapping our fetch data function here inside of this async io.2thread

io.2thread function. Uh this will wrap our synch

function. Uh this will wrap our synch synchronous function with a future and make it awaitable. Now you'll notice that I didn't execute the synchronous

function. I didn't say you know fetch

function. I didn't say you know fetch data with parenthesis here and pass in that one. I don't want to do that. You

that one. I don't want to do that. You

want to pass in the function itself and its arguments separately uh to this async io.2thread function so that it can

async io.2thread function so that it can execute later when it's ready. Then

we're just awaiting this just like any other task. Now for processes here this

other task. Now for processes here this is a little bit more complicated. So

first we have to import this process pool executor up here from concurrent.futures

concurrent.futures and then we have to get the running loop uh because we are using this loop.runinexecutor

loop.runinexecutor method here. So here we're just saying

method here. So here we're just saying loop is equal to async.io.getrunning

loop and then within this process pool executor we are creating these tasks with loop.runinexecutor run an executor

with loop.runinexecutor run an executor passing in this process pull executor here. And again, just like before, we're

here. And again, just like before, we're passing in the function that we want to run in a process and the arguments separately. And just like with our

separately. And just like with our threads, that's going to wrap that new process in a future that we can then await. And once we've done that, uh,

await. And once we've done that, uh, we're simply awaiting those like we did before. So let me go ahead and run this

before. So let me go ahead and run this and see what we get.

So we can see that that works and that these ran concurrently. It took 4 seconds, actually a little bit over, because threads and processes have a bit of overhead to spin up and tear down. The reason it took 4 seconds and not 2 is that we ran these in two different groups of two. We ran both of our tasks in threads, and then we ran both of our tasks in processes. Since the longest task is 2 seconds, our threads took 2 seconds running concurrently, and then our processes took 2 seconds running concurrently as well. So, just like we've been doing so far, let me pull this up in the browser and run through it in an animation, just to really knock this point home. This code is a little

bit longer, so some of it gets cut off, but let me go ahead and run through it. I'll scroll down to where we're starting up this event loop with that main coroutine. Okay. And now we come in here and we are creating this task. We're passing off fetch data with an argument of one to a thread, and we're creating a task out of that. So that gets created and scheduled. We're doing the same thing with fetch data and an argument of two; that gets scheduled on our event loop. And now, when we await task one, it's going to suspend our main coroutine until that task one is complete. So now it's going to find our thread here. I don't have the code here for our thread, and the reason I'm not showing our synchronous code in this task is that that code isn't running in our current thread anymore. It's going to go off and run in its own thread. Eventually, that task is going to hit an await that this asyncio.to_thread puts into place for us, and then it's going to kick off that background thread and run our synchronous code for us.

So this thread gets started here, running that synchronous fetch data code. It's going to suspend that task, and then we might see some print statements coming in while that thread is running that synchronous code. But now our event loop is free to move on to our other task. It's going to run our other bit of synchronous code in another thread and suspend this second task. And now both of these threads are going to be running in the background. Eventually, this thread is going to be done. It's going to notify our to_thread task, and that's now going to be ready. Once that is ready, it's going to return and complete, and then it's going to wake up our main coroutine. Since we were awaiting that task one, it's going to print that that's fully completed. We're going to move on to waiting for that task two thread to be done. So again, we've seen all this before. That completes; that's ready. That completes; that's ready.

We're doing a lot of the same stuff here, so I'm going to keep going through this a little bit faster now, since we should be getting used to this as we go. Now we're awaiting task one. Task one's going to come in here and run. That's going to be a process in the background IO. It's still printing stuff out here. We're running another process in the background IO. Eventually, that completes, and that completes there. Our main coroutine is going to pick back up, do some printouts, and wait for that second task to finish. Once it finishes, it wraps up and completes, and then we move on with printing out all of our results. Now, I know that that example was a bit more complicated, but I wanted to show it because, if we're using asyncio, there might be times when we don't have an asynchronous option to get concurrency, and we'll need to run some blocking code in threads or processes.

But now let's go back to our more standard use cases. This is our last example before I get to a real-world example, where we'll see how to update a real codebase to be asynchronous. In example seven, we're going to see other ways that we can schedule and await tasks. So far, we've been creating tasks one at a time and awaiting them manually. But a lot of times we might want to create a bunch of tasks and run them all at once. We can do this with either gather or with task groups.

In our first section here, I've just taken out some print statements along the way, but it's basically what we've been doing: we're creating these tasks manually, they get scheduled, then we await them manually and get those results. In our next example, we are creating a bunch of coroutine objects in a list and then awaiting those with asyncio.gather. What I have here is just a list comprehension. We're saying that we want fetch data for i in range of 1 to 3, so that's still only going to give us two: fetch data one and fetch data two. Then we are passing those in to asyncio.gather and awaiting that gather. I'm going to go over these in more detail in just a second, but let's move on to look at this task group as well. Oh, I'm sorry, this isn't the task group; this is another gather. In this one, instead of creating a list of coroutines like we did up here, we are creating a list of tasks, and those tasks are just wrapping those coroutines. We can gather those as well. So here I have a list of coroutines that I'm passing in to gather, and here I have a list of tasks from those coroutines that I'm passing in to gather. Down here with the task group, I'm just saying task group as tg, and then within the task group, we're doing a list comprehension of tg.create_task on each coroutine, and we're getting those results.

Now, we're going to look at all of these more in depth in just a second, but let me go ahead and run them, and we can see the output before we go over this code a little further. Okay, so this took 8 seconds total. Now, these did all run concurrently. We ran them in four different groups, and each group took 2 seconds. So that's why it took 8 seconds total: 2 seconds for each of four groups. Now, we've already seen our manual task creation plenty of times so far, so let's skip over that and go to our asyncio.gather here.

Now, when it comes to gather, you'll notice these asterisks that I'm using before our lists. What this is doing is unpacking the list. Gather doesn't take a list as an argument, so unpacking it with this asterisk is basically the same as passing all of the items from the list in individually. Now, for these two gather examples, in one of them I'm passing in coroutines directly, and in the other I'm passing in tasks. You can pass in coroutines directly if you just want to get the results. But remember that tasks add some extra functionality, so if you want to monitor or interact with the tasks in any way before they complete, then you'd want to use tasks. If you just want the results, it's fine to pass in that list of coroutines.

Now, for our task group, this is the first time that we've seen an async context manager. Just like functions, context managers can be async when they need to do IO operations during setup or teardown. That's why we have async with here. TaskGroup does a lot of async work for us when entering and when exiting: it tracks tasks, waits for completions, handles cancellations, handles errors, and so on. The main thing that you'll notice is that we're not awaiting anything with the task group. We're not awaiting these results anywhere, and we're not awaiting them once we get out of that context manager either. It awaits all the tasks that we create for that task group when it exits the context manager. And we'll see another example of an async context manager in a bit that isn't a task group. They can be used for things that need to set up and tear down: network requests, file access, database operations, all kinds of stuff.
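For instance, a non-TaskGroup async context manager can be built with contextlib.asynccontextmanager; the db_connection resource here is hypothetical, just to show the setup and teardown shape:

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def db_connection():
    await asyncio.sleep(0.1)      # async setup (e.g., opening a connection)
    conn = {"connected": True}
    try:
        yield conn
    finally:
        await asyncio.sleep(0.1)  # async teardown (closing it again)

async def main():
    async with db_connection() as conn:
        return conn["connected"]

result = asyncio.run(main())
print(result)
```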

Now you might be wondering which one to use here. Should you use gather for a bunch of different tasks, or should you use a task group? I tend to use task groups a lot of the time, but basically the key difference between gather and task groups is how they handle errors. With asyncio.gather, you can use the default of return_exceptions=False, which I honestly wouldn't really recommend. And as a matter of fact, I didn't realize that I didn't have return_exceptions=True here in gather. So I would always recommend, if you're going to use gather, to use return_exceptions=True. The default is False. But with that set to False, if one task fails, then it raises the first exception that it saw. You don't get a bundle of errors or any of the successful tasks. And if it fails, other tasks won't be cancelled, so you risk having orphaned tasks. A task group also fails quickly, but it gives better errors and handles cleanup a bit better. So I wouldn't really use gather with its default argument of return_exceptions=False. But gather does have a good use case with return_exceptions=True, because if any task does fail, it still runs the other tasks for you.

Every awaitable in that gather finishes, whether it succeeds or fails. So then your result is just a list where each position is either the result of a success or the exception of a failure. This is what you want to use if you want to run all the tasks even if some error out. Like maybe you have a bunch of URLs that you want to crawl, but you don't want all of them to fail if only one of those URLs gets hung up or doesn't exist or something like that. For that, you would use asyncio.gather.

Now, with an asyncio task group, on the first failure it cancels all the other tasks. So if it fails, it raises an exception group containing all exceptions from the failed tasks, and that would include exceptions from cancelled tasks as well. There's no option to keep running other tasks after one fails. So we use this when we want all our tasks to run successfully. So basically, to sum all that up: if you want tasks to continue running even if some fail, then use gather with return_exceptions=True. But if you want all of the tasks to either fail together or succeed together, then use a task group. And I would almost never recommend using gather with the default of return_exceptions=False. If somebody knows of some edge cases that I'm not thinking of, then feel free to leave me a comment. But if you want it to fail fast, then I almost always use a task group instead.

Okay, so just like with the other examples, let's look at this example in the browser to see what's going on under the hood. But I'm going to run through this animation a little faster, because it's going to be very similar to the examples that we've already seen. The tasks are going to be scheduled and awaited in different ways than we've seen before, but the behaviors are basically the same. So let's step through this pretty quickly. We're creating a task; it gets scheduled. Creating another task; that gets scheduled. We are awaiting our main coroutine here and suspending it. Running our first task, which suspends. Our second task suspends after we kick off both of these background I/O tasks. And actually, so I'm not scrolling here, let me make the screen a little bit smaller. I know that's going to be harder to read, but basically, we just want to see these animations anyway. The code is the same as it was in our example here, but these are the examples that we've seen already where we're creating these manually. That's why I'm stepping through these pretty quickly. Okay, so

now we are getting to our gather examples. First we're creating this list of coroutines here, and that's not going to schedule anything on our event loop. And now we are here where we are gathering all of those and awaiting that gather. Since these are coroutines, they're going to get scheduled and run to completion at the same time. So that is going to suspend our main coroutine and schedule all of those coroutines that were in this list. So now, just like we saw before (whoops, let me get that there), it's going to run through, since we have both of these scheduled on our event loop. It's going to run these concurrently until they complete and return, and complete and return. And once all of those are done, it's going to tell our main coroutine here that it's ready, and it's going to go ahead and move forward. Now, with our gather task, this is a little bit different, because we are creating tasks in a list instead of creating coroutines in a list. Remember, when we create a task, it's going to go ahead and get scheduled. So those get scheduled when we create those tasks, and then when we await those, that is when we suspend our main coroutine and our event loop can actually get around to running them. But those are going to run concurrently, like we've seen before.

And once those are done, our await is going to wake up our main coroutine here and move forward. And then with our task group, we're going to come in here to our task group. We are going to create these tasks in our task group, so those get scheduled onto our event loop. Now, there is no await statement here. This automatically awaits whenever it exits our context manager. So once that context manager ends, that's when we suspend our coroutine there, and then our event loop can find these ready tasks and run them. So just like before, these are run concurrently. And once those are done, that's when it can fully exit that task group context manager and move on to our print statement that prints those task group results. So then we are finished up with our main coroutine here, which completes down here at the bottom. We are free to move on and print out our results.

Okay, so that is the last animation, and that is the last of our code examples. Now we can look at a real-world example. I hope that those animations really helped visualize what's going on in the background and made it easier to understand how asyncio works. With my brain, I can kind of understand stuff better when it's visual and interactive like that. But now let's look at a real-world example, and we'll see how to convert some synchronous code over to using asyncio. So, let me open up this real-world example here, and first, we'll just look over what it does.

So, I have a synchronous script here that downloads some images and then processes those images. Let me walk through this and explain what's going on. I have my imports up here. I have my image URLs; this is just 12 high-definition pictures. And then I have a download-single-image function, and a download-images function that loops over those URLs and downloads the single image for each of them. We have a process-single-image here, and I just grabbed this offline; it's an algorithm that does edge detection on photos, so it's just some photo processing there. We have a process-images function that loops over all of the images that we downloaded and processes them one at a time. And then we have our main function. In our main function, just ignore that I have a bunch of timing stuff here to see how long this script takes to run. The big part is that we have our image paths, where we are downloading all of our images, and then we have our processed paths, where we are processing all of those images that we downloaded. Then I have some final wrapping up with the timing, and I'm printing out all the images that we downloaded and how long everything took. And finally, down here at the bottom, I am running that main function. So, let me go ahead and run this, and we'll see how long this synchronous script takes.

So, we can see that we are downloading some images here, and I might fast-forward a little bit until this is done. Okay, so I fast-forwarded a bit there, just so we didn't have to watch all of these images download and process. But at the bottom, we can see how long this took. I'm actually going to copy this so that we can reference it later; I'm going to paste it into some of my notes here off screen. But we can see that the total execution time here was about 23 seconds. The time that it took to download the images was 13 seconds, which was about 55% of the total time, and we processed our images in about 10 to 11 seconds, which was about 44% of the time. So, we're going to be able to speed this up by a lot.

But I don't know about you: when I was first learning asyncio, I didn't even really know where to start in terms of where to make changes to a synchronous script. Which parts do I make asynchronous? Do I use threads or do I use processes? How do I know? Well, here are some tips to give you an idea of where to start. First, we need to determine what's I/O-bound and what's CPU-bound. Like I said before, I/O-bound work is where we're just waiting around on external things to be done, like web requests, database access, file access, things like that. It's often easy to guess what's I/O-bound in your code if you know what you're looking for. You can even look for certain words: words like fetch or get or web request or database usually lean towards I/O-bound kinds of things, while words like compute or calculate or process usually lean towards CPU-bound kinds of tasks. Now, once you work with this stuff more and more, that's going to become more intuitive.

But if you want to be absolutely sure, we can use actual profiling tools to see where we're spending most of our time in our code. So I'm going to show a profiling example using Scalene, which is a really nice profiler that I like, and we're not even going to need to add anything extra to our code. First, I'm going to install Scalene, and you can install it with pip or uv. So you could do a pip install scalene; I'm going to use uv and do a uv add scalene. I think I spelled that right. And once that's installed, I'm going to run a command that will profile our code. Now, I have some snippets open here that I'm going to use to copy this command in, just so I don't mistype anything. So, let me copy this and paste it in. Since I'm using uv here, I'm doing a uv run -m; you can also use python -m there. So I'm running the scalene module, I'm creating an HTML report here called profile report, and we are going to profile my script, which is called real world example sync v1. So, I'm going to run that, and it's going to rerun our script, downloading those images and processing them again, but it's going to profile it as it goes and give us an idea of where we're spending most of that time. And once this is done, I will go ahead and open this profile_report.html in my browser. Okay, so that script finished.

It actually took a good bit longer that time: 29 seconds, where the last time it only took 23 to 24 seconds. But let me go ahead and open this profile report that it gave us here in the browser. I will allow that. Okay, so here's our report. Let me make this a lot larger so that you can read it. We can see here that it breaks up the time between Python time, native time, and system time. Now, the system time is likely a lot of waiting on I/O, and we can see that most of our system time is in this download-single-image function. So that is a great indicator that we can speed up download-single-image with asyncio, or with threads if you want to use threads. A lot of the time being spent in Python is within our process-single-image function, so that likely means it's CPU-bound, and we'd get more of a speedup using multiple processes on that than we would from asyncio or threads. So now that we've profiled this, we have some actionable knowledge of where we can speed up our code. So let's go change these specific

code. So let's go change these specific functions. So I'm going to go back to

functions. So I'm going to go back to our Python code here. I'm going to close down our output there. And now let's go

to the top and let's import async IO.

Um, now for this first example, let's assume that I don't know about any asynchronous libraries that we can use to do these web requests. So, there are

asynchronous libraries out there that make this easier like HTTPX or AIO HTTP.

But let's say that for now I just wanted to keep using uh requests. We can see here that I'm using um a request session here and I'm using a session.get with

that request library. Now, like I've been saying, request isn't asynchronous.

So, to continue using requests, we really don't have any choice other than to use threads. Uh we saw how to do this earlier. So, let's go ahead and change

earlier. So, let's go ahead and change this to use threads and have Async.io manage those threads. So, I'm going to go up to our download single image here.

Now, the first thing I'm going to do, I don't know if these sessions are thread safe, so I'm going to remove this session as an argument here, and I'm

just going to use request.git instead of session.git there. Um, that'll have to

session.git there. Um, that'll have to create a new session every time for each of those URLs. Uh, but that's okay. But

other than that one small little change, since we're going to use threads for now, I'm not going to touch this function anymore. I'm going to leave

function anymore. I'm going to leave this as a regular synchronous function.

I'm going to go to where this function is called to make the changes that we need, and it's called in our download-images function. So within download-images, this is where we'll use asyncio to send this off to threads. That means we're going to convert this to a coroutine, and to do that, the first thing we need to do is add async before the function. I'm no longer using sessions, since I wasn't sure if that was thread-safe, so I'm going to take out that context manager, unindent those lines, and stop passing in that session as an argument. Okay. So now we're running this download-single-image function for every URL in our URLs list. Instead, let's send that off to a thread. To do this, we can use gather or we can use a task group; I'm going to use a task group. And I'm actually going to grab a little bit of code from my snippets file and paste it in above what we have now so that we can see the difference. I just don't want to accidentally mistype anything and have to do a bunch of debugging in the middle of a tutorial. So let me copy this from my snippets. This is what we had before, and this is what we have now with the task group. You can see that we are creating our task group, and within it we're creating a bunch of tasks by calling create_task on the task group. And now, instead of running download-single-image directly, we're sending it off to a thread using asyncio.to_thread. And this create_task method is going to create tasks for us, for all those threads, that we can await. Now, like we saw before, this task group is going to await on its own after it exits the context manager. So outside of the context manager, to get the results from all of those tasks, I can just create a new list here and say task.result() for task in tasks. Now that we have that, I'm going to remove what we had before, use this new way of doing it, and save that.

And now, just to show that it's important to know when to use asyncio versus threads versus processes correctly, let me also shove our image processing off onto threads. Judging from the profiling that we saw earlier, our processing should be CPU-bound, so we shouldn't get any speedup by shoving it onto threads, and I want to be able to show that. So, let me go ahead and put those in threads just so we can show that we don't get a speedup there. I'm going to go down to our process-images function, and I'm also going to make this a coroutine as well. And just like we did for our downloads, let me grab a small snippet here. This is going to be very similar to our downloads code. Again, I'll put it right above what we had before, but we are basically doing the same thing: we're creating a task group, and within that task group, we're creating a bunch of tasks. And what those tasks are is we're passing this process-single-image off to a thread, and it's going to do that for every image that we have in our list of images. And then once we exit that context manager, we are just using another list comprehension, getting the task result for every task in that task list. Okay. So now I'm going to remove the old way of doing that there.

Now, I can't run this quite yet. Since we converted these to coroutines, we now need to await them and also run an event loop. We're calling process-images and download-images within our main function. So within our main function, I'm also going to convert this to an async function, because I can't await those if I'm not in an async function. And now that we are in an async function, instead of setting our image paths equal to download-images, I'm going to await download-images, since that's now a coroutine. And I'm going to await process-images as well. And lastly, where we are running our main function down here: this main function is now the main entry point for our asynchronous code, so we need to run it in an event loop. To do that, I'm going to say asyncio.run and pass our main coroutine in to run.

Okay. So hopefully I typed everything correctly there. Now let me rerun this, and let's see if we're able to get a speedup by passing all of those off to threads. So we can see that it kicked off a lot of downloads really quickly and downloaded those, but the processed images are still taking a while here. And you can see that with threads, some of this output gets buffered a little weird and puts things on the same lines. But that's okay; we're not too concerned about this weird output up here. That's just showing us what those threads were printing out while processing the images and such. What we're mainly concerned with is how long it took. So, right off the bat, you can see that it sped up these downloads by a lot; they all ran in their own threads concurrently. Before (I wrote this down), our download of 12 images took 13 seconds, and now it's taking 2 seconds. And our process images last time took 10.49 seconds; now it took 10.63. So it sped up our downloads by a lot, but it didn't speed up our processing, which we expected. So that's good. It's showing us that we sped up our I/O-bound code but not our CPU-bound code.

Okay. So now let's take another look at how we can improve this code. Right now, we've just shoved everything off to threads, but you really only want to do that when there's no asynchronous alternative. For requests, we do have asyncio-compatible libraries; the two most popular are HTTPX and aiohttp. I'm going to use HTTPX in this example because it's more similar to requests. And we've also got some file reading and writing that we can pass off to an asynchronous library as well, and for that we can use the aiofiles package. So let me install these. I'll open up our terminal again here. Just like before, you can use pip install, or, since I'm using uv, I'm going to add these with uv. So, I'm going to use httpx for the web requests, and I'm going to use aiofiles for the file I/O.

So, now that we have those installed, let's go up here and update our imports. I'm going to import httpx. I'm also going to import aiofiles, and I'm no longer going to use requests, so I'm going to get rid of requests there. Also, for our image processing, we saw that the threads didn't speed that up, which we knew was going to happen. I wanted to show how someone might try that anyway, but instead, let's pass those off to processes. Adding additional processes for that work is how we speed up CPU-bound tasks. To do this, I'm going to import the ProcessPoolExecutor from concurrent.futures. So that's from concurrent.futures import ProcessPoolExecutor. Okay, it automatically sorts my imports there, so if those jump around, don't worry about that.

Okay. So now let's turn our download-single-image function into an asynchronous function that uses httpx. But first, I'm going to update the download-images function. Again, I'm going to grab this from my snippets file and explain what we changed, just so I don't mistype anything. Let me grab this from my snippets and paste it in, and then I'll explain what we changed. Now, if you remember, before I passed this off to threads with the requests library, I was doing something like this with requests: I was using a requests session, which basically allowed me to reuse the same session for every download. It's the same thing here with HTTPX and this async client; I'm going to reuse the same client for every download. So we are using this, and this is a context manager here. We are opening this up, and we have this asynchronous-compatible client, and then I'm creating my task group, and then we are creating a bunch of these tasks. We're no longer sending them off to a thread; now we're just passing this in directly, and we're also accepting this client here, which I took out in the threading example. So let me go back up here and add that back in. So that was client, and that is going to be an httpx async client there. And now, instead of doing requests.get, that's going to be a client.get. Okay. So now we are making this download-single-image function asynchronous. We're going to make that asynchronous there, and let me clean this up a little bit here. Okay.

And now, again, I'm going to grab some code from my snippets file. I know I'm doing a lot of copying and pasting here, but this tutorial is getting pretty long already as it is, so I'm trying to use time here as effectively as I can. I'm just copying and pasting some code here because, while this is very similar to requests, it's not exactly like requests. So I want to paste this in and explain the differences. Okay. The first big difference: before, where we had this response equals, it was a requests.get; now what we are doing instead is an await on this client.get with that async client. Also, one other small change is that with the requests library it's allow_redirects, while this one is follow_redirects. And down here at the bottom, we were using normal file writes, but now we're using aiofiles to write asynchronously. And I believe this is the first time in this tutorial that we've seen an asynchronous iterator, here where we're looping over these chunks from our response. Now, normally with a regular iterator, the for keyword just pulls the next value immediately. But here, every next chunk may involve waiting for I/O, like network data arriving. So Python gives us this async for to handle that gracefully. And this response.aiter_bytes() here gives us an asynchronous iterator; under the hood, each loop step does an await to get the next chunk. That's why you have to write async for chunk in response.aiter_bytes(). And inside that loop, instead of using f.write, we are awaiting that f.write, because now that chunk and this file here, everything is asynchronous.

Okay. So, let me go to where we're processing our images, and I'll convert that to using processes instead of threads. Down here is where we were sending this off to threads. Now, if you remember from our examples earlier, using processes is a little more complicated. Let me grab that snippet so we can see what this looks like; this is the last snippet that I have in my file, so it's the last one that I'm going to be grabbing. Let me paste this in. We saw this in one of our earlier examples, but we have to get the running loop using asyncio.get_running_loop(). Then we are using this ProcessPoolExecutor. We're creating a bunch of different tasks: we're doing loop.run_in_executor with this process pool executor, passing in that process-single-image with its argument, and we're doing that for all of the paths in our list of paths to process. And then we are awaiting those with asyncio.gather, passing in all those tasks. Oh, and since I explicitly gave that hint earlier, I'm also going to do return_exceptions=True and follow my own advice there. Okay, so now we're passing this image processing off to processes instead of threads, and our downloads are running asynchronously on our current thread using asyncio-compatible libraries.
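The shape of that snippet looks roughly like this sketch. Two assumptions to note: the CPU-bound function is a made-up arithmetic stand-in, and the sketch uses a ThreadPoolExecutor so it runs anywhere; for real CPU-bound work you'd swap in ProcessPoolExecutor as in the video, which additionally needs a picklable, module-level worker function and an `if __name__ == "__main__":` guard:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor


def process_single_image(n):
    # Made-up stand-in for the CPU-bound edge-detection step.
    return sum(i * i for i in range(n))


async def process_images(sizes):
    loop = asyncio.get_running_loop()
    # The video uses ProcessPoolExecutor here; a thread pool keeps this
    # sketch runnable anywhere, and the run_in_executor call is identical.
    with ThreadPoolExecutor() as pool:
        tasks = [loop.run_in_executor(pool, process_single_image, n)
                 for n in sizes]
        # Await inside the `with` so the pool stays alive until work finishes.
        return await asyncio.gather(*tasks, return_exceptions=True)


processed = asyncio.run(process_images([10, 100, 1000]))
```

run_in_executor returns futures that the event loop can await directly, which is why gather works on them just like on tasks.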

Nothing else should need to change in our main function. So let's run this and see what our speedup is now. This should go much, much faster. Okay, so that was obviously much faster. You can see here that we ran this in basically 4.9 to 5 seconds in total. Compared to earlier, where I wrote down that our total execution time was 23.54 seconds, our downloads are now down to 1.66 seconds, when before they took 13 seconds, and our processing only took 3.25 seconds, when before it took 10 to 12 seconds. So, a big speedup on everything there.

Now, like I said, I know that this video is getting long, but one more thing before we end. I want to show you some very quick things that we can add in here that you'll likely use at some point in your actual code. It's nice that we've sped this up so drastically, but we're being a little greedy here in our script. Right now, we're only grabbing 12 URLs, so it's not a huge deal to run all of these concurrently at once. But you might get into a situation where you're downloading thousands of URLs or trying to do thousands of things concurrently at the same time. If that's the case, then you might want to add some limits in place so that you're not kicking off all that stuff at the same time; it can bog down our own machine, and it can also hammer servers. One easy way that we can do this is to use something called a semaphore. And for our processes, we can easily get the CPU count of our current machine to limit the number of processes that can be spun up at a time.

Uh so, let me add both these to our code. Uh, first let me go up here. Let

code. Uh, first let me go up here. Let

me go up to the top here. I'm just going to import OS here so that we can get our CPU count. And now at the very top of

CPU count. And now at the very top of our code, I'm going to add in a couple of of constants here. I'm just going to call one of these download limit. And

I'll set this equal to four. So we can do four concurrent downloads at the same time. I'll create another one here

time. I'll create another one here called CPU workers. And I'll set this equal to OS. CPU count. Now to use a semaphore, I'm going to go update our

download images function here. Right before we open up our async client, I'm just going to say that our download semaphore is equal to an asyncio.Semaphore, and we're going to pass in that DOWNLOAD_LIMIT, which was a constant of four. Oh, and just so I don't screw up later, let me fix that typo. That is a Semaphore there. And now I'm going to pass this in as an argument to our download single image function, so I'll just pass this in as the last argument there. It's giving me an error because we're not accepting that right now. Let me go up here and accept that. I'll just call this a semaphore, and to type hint that, I'll say that it is an asyncio.Semaphore. Okay, clean that up. And now, to limit this to four downloads, I'm just going to take everything in this function and say async with that semaphore argument there, and I'm going to put everything inside of that. And that's going to use that semaphore, which will limit the maximum concurrent downloads to four at a time.
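Stripped down to just the semaphore pattern, the idea looks like this. This is a minimal sketch, not the video's actual script: the counters and the asyncio.sleep call are stand-ins for the real httpx download, purely to make the throttling visible.

```python
import asyncio

DOWNLOAD_LIMIT = 4  # max concurrent downloads, as in the video

active = 0   # how many "downloads" are running right now
peak = 0     # the most that ever ran at once

async def download_one(i: int, semaphore: asyncio.Semaphore) -> int:
    global active, peak
    async with semaphore:  # only DOWNLOAD_LIMIT coroutines get past this line at once
        active += 1
        peak = max(peak, active)
        await asyncio.sleep(0.01)  # stand-in for the real network request
        active -= 1
        return i

async def main() -> list[int]:
    semaphore = asyncio.Semaphore(DOWNLOAD_LIMIT)
    # kick off all 12 at once; the semaphore does the throttling
    return await asyncio.gather(*(download_one(i, semaphore) for i in range(12)))

results = asyncio.run(main())
print(peak)  # → 4
```

Even though all 12 coroutines start immediately, only four ever run their bodies at the same time; the rest wait at the async with line until a slot frees up.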

Now, for our processes, this one is actually the easiest part for once. All I have to do is pass in our maximum number of workers to this ProcessPoolExecutor. So I'll say max_workers, and I'll set that equal to the CPU_WORKERS constant that we set at the top, which was equal to the CPU count on our machine. With those changes, we're not being so greedy anymore. Even if we had thousands of images to download and process, we're still doing them quickly, but not trying to spin everything up all at once. So now if I save this and run it... Oops, I have a mistake here.

Let me go see what I spelled incorrectly here. Oh, you guys probably saw that I need an equal sign there instead of a minus sign. I hit a typo there. Okay, so

now let me rerun this again. And now we can see that whenever we first kicked that off, all four of these started at once, and then it waited for one of these to come back before it started another one. So we have four concurrent downloads going on at a time, but that is the max; we're not sending off more requests until one of those four is done. So overall, our speed is going to be slightly slower than before, but you're definitely going to want to use limits like this once you get into doing a lot of work at once, just to be easy on your own machine and also to be easy on other servers. We still have a huge speed up here, not as much of a speed up as just blasting everything off all at once, but we're using some smart limitations and still getting a big speed up. So, it's the best of both worlds. Okay, so we're just about finished up here. We covered a lot in this video, so let's start to wrap things up a bit. First, I'd like to reiterate some common pitfalls that

you'll run into when you first start working with asyncio. One mistake that many people make, and that you can see from time to time, is forgetting to await their tasks or their coroutines. Just to show a very quick example of this, let me go back to one of these examples here. So, for example, let's say that we don't await these tasks. Okay. Whoops, let me delete that. Let me run this code here really quick.

Now we can see that we got some print statements and stuff like that, but we're not getting any actual errors. But if you look closer, you'll notice that these tasks didn't actually run. You can see, where they are printing out when we got the results, that these tasks were canceled. And if we didn't have a return statement here and I were to run this, now we have a bunch of print statements with no errors at all, and it looks like things were kind of successful. But if we look closer, we'll see that these tasks never actually ran; it just got to the start of those, and they never finished. So, let me undo the changes that we made. Now, another issue that you might run into that's pretty common is that a script can end before

tasks are complete, which you likely don't want. So, here where we are awaiting our task two, let me change this, and instead of awaiting task two, I'll just await an asyncio.sleep, and I'll just do this for 0.1 seconds. Now, that task two normally takes 2 seconds to complete. If I run this here, then we can see that we didn't get any errors, but we got done with our first task and never got done with our second task. It says that task two was fully complete, but that's just because I still have this print statement here; what we were actually awaiting was this asyncio.sleep. If we look at our results, we can see that we have a result of one, but we do not have a result of two. So you want to make sure that all of your tasks are awaited and that you don't have any still running before your script is completely done.
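This pitfall is easy to reproduce in isolation. Here is a minimal sketch, with the worker function and the timings invented just for the demo: a task that is never awaited is still pending when main() returns, and asyncio.run cancels it on shutdown, so its work silently never finishes.

```python
import asyncio

results = []

async def worker(name: str, seconds: float) -> None:
    await asyncio.sleep(seconds)
    results.append(name)  # only runs if the task gets enough time to finish

async def main() -> bool:
    task1 = asyncio.create_task(worker("one", 0.01))
    task2 = asyncio.create_task(worker("two", 0.5))
    await task1                # properly awaited: guaranteed to complete
    await asyncio.sleep(0.02)  # oops: we sleep briefly instead of awaiting task2
    return task2.done()       # still False when main() returns

task2_done = asyncio.run(main())
# asyncio.run shuts down the loop here, cancelling the still-pending task2
print(results, task2_done)  # → ['one'] False
```

There are no errors anywhere, which is exactly what makes this mistake sneaky: the only evidence is that "two" never shows up in the results.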

Now, of course, another one of the most common mistakes is accidentally using blocking I/O calls within async code; we saw this several times throughout the video. So how do we avoid some of these issues? Well, one thing that helps is a good linter. My Ruff linter settings point out a lot of asyncio mistakes for me. If you haven't watched my video on Ruff and want to see how I have mine set up, then I'll leave a link to that video in the description section below. And also, if you set debug equal to true with this asyncio.run, then that will help you find a lot of things that might be wrong in your async code as well. You just want to make sure that you have that debug set to true only whenever you are working in development. And lastly, just to reiterate when to use asyncio versus threads versus processes: you'll

usually want to use asyncio when you have some I/O-bound work that you want to run concurrently. As long as there are asynchronous libraries to do that work, then I would recommend doing it that way. If there aren't asynchronous libraries available and you have to use synchronous code for your I/O-bound work, that's when you'd want to use threads. And multiprocessing is what you'd reach for in order to speed up CPU-bound work. If you're unsure of which is which, then you can reach for a profiler, like we did earlier, to see what's going on under the hood, where your code is spending much of its time, and what that time consists of.

So there are more and more asyncio-compatible libraries popping up all the time, and the more libraries that are built on top of it, the more it'll allow us to use it in our own code. We have web frameworks like FastAPI, and HTTP clients like HTTPX and aiohttp. There are database drivers too: SQLAlchemy supports asynchronous usage now, and there are asynchronous drivers for Postgres, MySQL, and others. There's also aiofiles for file I/O, and many more out there. So it's definitely a growing ecosystem. But with

that said, I think that's going to do it for this video. Hopefully now you have a good idea of how to use asyncio in Python. Even if we didn't get into a specific example of what you were hoping to see, like maybe asynchronously connecting to a specific database that you use or something like that, I hope that you now feel like you at least have more of a grasp on how this works under the hood, and also have a mental checklist for how you'd approach using asyncio and some ideas for how to use it in your own projects. I'm going to be putting all of the code and animations from this video on GitHub, and I'll also add the animations to my website, and I'll put links to those pages in the description section below. In a future video, I'm going to be showing how to use FastAPI, which is a really fast web framework that leverages asyncio and lets you use the async/await patterns that we've learned here to build web APIs that can handle many requests concurrently. So hopefully this tutorial will allow you to fully understand what's going on there when we see that in action. But if anyone has any questions about what we covered in this video, then feel free to ask in the

comment section below, and I'll do my best to answer those. And if you enjoy these tutorials and would like to support them, then there are several ways you can do that. The easiest way is to simply like the video and give it a thumbs up. Also, it's a huge help to share these videos with anyone who you think would find them useful. And if you have the means, you can contribute through Patreon or YouTube; there are links to those pages in the description section below. Be sure to subscribe for future videos, and thank you all for watching.
