I came across a `for` loop with a somewhat different syntax. I tried to figure out how it worked but couldn't; it gave me an error. I started to investigate but did not find much information. Judging by its syntax it is an "asynchronous" `for` loop or something like that. This is the loop I am talking about:
async for target in iter:
block
Well, my question is: what is this `for` loop for? How does it work? And under what circumstances should it be used?
I also realized that it is not only the `for` loop that has this different syntax (`async`); there are also functions, `with` statements, and I don't know if there are more.
async def funcname(parameter_list):
pass
async with expr as var:
block
So why put `async` before declaring a `for` loop, a function, etc.?
Thanks in advance for your answers!
The subject is very complex and extensive, but I will try to give an answer.
`async` is a keyword introduced together with `await` in Python 3.5 that allows you to define native coroutines.

Let's imagine an application that requests data from several different servers; each request can take an indeterminate time to complete. Under normal conditions the interpreter executes instructions sequentially, waiting for the current one to complete before moving on to the next. In our case this would mean waiting for one server to respond before sending the request to the next one: wasted waiting time.
Instead of waiting for the response, we could move on and send the request to the next server, or process another server's response in the meantime, and handle the first response once it arrives. This way the thread is not sitting idle while a blocking I/O task completes. This is the essence of an asynchronous program.
Asynchrony is classically achieved with multiprocessing or multithreading. Simplifying a lot: when you have multiple threads running, each CPU core can run only one thread at a time. To allow all threads/processes to share resources, the CPU performs an operation called a context switch: at some interval it saves all the context information of one thread, switches to another thread, and later loads the saved context and continues with the previous thread...
Threads and processes have certain problems; two of them are that they can be heavyweight and expensive to create, and that they are susceptible to race conditions, since if they are not synchronized correctly, the switch from one thread to another is nondeterministic.
One way to do asynchronous programming without resorting to threads or processes is to implement an event loop. Basically we have an event/job queue and a loop that simply keeps pulling jobs from the queue and running them. We will call these jobs coroutines: small sets of instructions that can, in turn, enqueue new events. In this scheme the context switches are controlled by the application: while waiting for some blocking I/O to complete, there are no CPU-level context switches. Everything runs in the same thread, only one coroutine executes at a time, and it switches context only at points that are explicitly defined (e.g. `await`). Coroutines therefore allow cooperative multitasking, in which each coroutine relinquishes control voluntarily and only at certain points.

A few key concepts are:
- Coroutine: initially implemented with `yield`/`yield from` (generator-based coroutines). A coroutine is essentially a consumer, an extension of a generator function that can both produce values and accept values from outside. The advantage of a coroutine is that we can pause the execution of a function and resume it later, since it maintains its context. When a coroutine is called, it does not actually run; it returns a coroutine object that can be passed to the event loop to run immediately or later.

- Future: a placeholder for a value that will materialize in the future but does not exist right now. It is similar to what in JavaScript is known as a promise. That terminology is perhaps clearer: it is essentially a promise that we will get a return value when the asynchronous operation completes. We keep this future object and, when it is fulfilled, we can call a method to retrieve the actual result.

- Task: a scheduler for coroutines, a wrapper around a coroutine. It is a subclass of `Future`. The gist is that tasks let you keep track of when a coroutine finishes processing, other coroutines can wait for a task, and you can also grab the result of a task once it has finished. Tasks are therefore used to schedule coroutines concurrently.

- Event loop: the central core of `asyncio`. None of the above makes sense without it. There are several configurations and types of event loops that can be used within the `asyncio` module.

A basic way of proceeding would be:
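A minimal, self-contained sketch of that basic way of proceeding (the server names and delays are made up for illustration; `asyncio.run` requires Python 3.7+):

```python
import asyncio

async def fetch(server, delay):
    # Simulated blocking I/O (e.g. a network request); await suspends
    # this coroutine and yields control back to the event loop.
    await asyncio.sleep(delay)
    return f"response from {server}"

async def main():
    # Wrapping the coroutines in tasks schedules them concurrently:
    # the total wait is ~0.2 s, not 0.2 s + 0.1 s.
    tasks = [asyncio.ensure_future(fetch(s, d))
             for s, d in (("server_a", 0.2), ("server_b", 0.1))]
    # gather waits for all tasks and returns their results in order.
    return await asyncio.gather(*tasks)

# asyncio.run creates the event loop, runs main() to completion
# and closes the loop.
results = asyncio.run(main())
print(results)
```

While `server_a`'s request is "in flight", the loop runs `server_b`'s coroutine; neither thread-based context switches nor locks are involved.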
When a coroutine awaits another coroutine (e.g. `await co-rutina_2`), the current coroutine is suspended and a context switch occurs: the context of the current coroutine (variables, state, ...) is saved and the context of the called coroutine is loaded.

As for your three usage examples of `async`:

`async def funcname():`
It is the way to define a native coroutine since Python 3.5 (PEP 492), when native coroutines with `async`/`await` were introduced. This syntax replaces generator-based coroutines (`@asyncio.coroutine`, `yield from`, `yield`), which are scheduled for removal in Python 3.10.

As already mentioned, calling a coroutine does not execute its body, just as calling a generator function does not:
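A short sketch of this behaviour (the `greet` coroutine is a made-up example):

```python
import asyncio

async def greet():
    return "hello"

coro = greet()                 # this does NOT run the body...
print(type(coro).__name__)     # -> coroutine: we just got an object
coro.close()                   # discard it (avoids a "never awaited" warning)

# ...the body only runs when the event loop drives the coroutine:
result = asyncio.run(greet())
print(result)                  # -> hello
```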
`async with expr as var:`

It is the expression for using an object's asynchronous context manager, that is, an object that can suspend execution in its `__aenter__` and `__aexit__` methods.
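A minimal asynchronous context manager might look like this (the `Connection` class and its sleeps are hypothetical stand-ins for real I/O):

```python
import asyncio

class Connection:
    # __aenter__/__aexit__ are coroutines, so both may await.
    async def __aenter__(self):
        await asyncio.sleep(0.01)   # e.g. open a network connection
        return self

    async def __aexit__(self, exc_type, exc, tb):
        await asyncio.sleep(0.01)   # e.g. close the connection
        return False                # do not swallow exceptions

async def main():
    # async with may only appear inside a coroutine body.
    async with Connection() as conn:
        return type(conn).__name__

print(asyncio.run(main()))          # -> Connection
```

While the connection is being opened or closed, the event loop is free to run other coroutines.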
`async for target in iter:`

It is the expression for iterating over an asynchronous iterable (an object that can run asynchronous code in its `__aiter__` method) and over an asynchronous iterator (one that can run asynchronous code in its `__anext__` method).

`async for` and `async with` can only be used within the body of a coroutine.

It should be made clear that coroutines execute concurrently, but never in parallel. In CPython, threads cannot execute in parallel either because of the GIL, so coroutines are an alternative to consider in many cases where we might use threads: situations with blocking input/output operations (typically a network request). They provide a very high level of concurrency with very little overhead and memory usage.
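To illustrate the `async for` protocol described earlier, a minimal asynchronous iterator could look like this (the `Countdown` class is a made-up example):

```python
import asyncio

class Countdown:
    # __anext__ is a coroutine, so each step may await I/O.
    def __init__(self, start):
        self.n = start

    def __aiter__(self):
        return self

    async def __anext__(self):
        if self.n == 0:
            raise StopAsyncIteration    # ends the async for loop
        await asyncio.sleep(0.01)       # e.g. wait for the next chunk of data
        self.n -= 1
        return self.n + 1

async def main():
    collected = []
    # async for may only appear inside a coroutine body.
    async for value in Countdown(3):
        collected.append(value)
    return collected

print(asyncio.run(main()))   # -> [3, 2, 1]
```

Between one item and the next, while `__anext__` is awaiting, the event loop can run other coroutines.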