Understanding Python's async/await: A Practical 10-Minute Guide
You've seen `async def` and `await` scattered through Python codebases, but every time you try to use them something breaks: a coroutine never runs, a blocking call freezes everything, or you get a cryptic error about running a loop in an already-running loop. Async Python is genuinely confusing at first, but the underlying model is simple once you see it clearly.
This guide skips the theory-first approach and gets you writing working async code fast.
What you'll learn
- What the event loop actually does and why it matters
- How coroutines differ from regular functions
- Running multiple async tasks concurrently with `asyncio.gather`
- When async helps, and when it hurts more than it helps
- The most common mistakes and how to avoid them
Prerequisites
You need Python 3.7 or later and basic comfort with functions and modules. No prior async experience is required. The core examples use only the standard library; the later `httpx` and FastAPI snippets need those third-party packages installed.
The problem async solves
Consider a script that fetches data from three different APIs. With normal synchronous code, each request waits for the previous one to finish before starting. If each request takes two seconds, your total wait is six seconds.
The insight behind async is that while you're waiting for a network response, your CPU is doing absolutely nothing useful. Async programming lets Python hand control back to the event loop during that wait, so it can start the next request immediately. You're not doing three things at once; you're waiting for three things at once, which is a completely different situation.
This is the key distinction: async is about I/O-bound concurrency, not CPU parallelism. If your bottleneck is disk reads, network calls, or database queries, async can help significantly. If your bottleneck is number-crunching, you want multiprocessing instead.
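To make the timing difference concrete, here is a minimal sketch that simulates three "requests" with `asyncio.sleep` (shortened to 0.2 seconds each so it runs quickly); `asyncio.gather`, which runs the waits concurrently, is covered in detail below:

```python
import asyncio
import time

async def fake_request(delay):
    # Stands in for a network call; real code would await an HTTP client here
    await asyncio.sleep(delay)

async def sequential():
    for _ in range(3):
        await fake_request(0.2)   # one at a time: roughly 0.6s total

async def concurrent():
    # All three waits overlap: roughly 0.2s total
    await asyncio.gather(*(fake_request(0.2) for _ in range(3)))

start = time.perf_counter()
asyncio.run(sequential())
seq_time = time.perf_counter() - start

start = time.perf_counter()
asyncio.run(concurrent())
conc_time = time.perf_counter() - start

print(f"sequential: {seq_time:.2f}s, concurrent: {conc_time:.2f}s")
```

The absolute numbers will vary by machine, but the sequential version always takes roughly the sum of the delays and the concurrent version roughly the longest single delay.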
Coroutines: async functions are not regular functions
When you prefix a function with `async def`, Python turns it into a coroutine function. Calling it does not execute the body. It returns a coroutine object that needs to be awaited or scheduled.
```python
import asyncio

async def greet(name):
    print(f"Hello, {name}")

# This does NOT print anything:
greet("Alice")  # returns a coroutine object

# This runs it:
asyncio.run(greet("Alice"))
```
The `asyncio.run()` call is the entry point for most async programs. It creates an event loop, runs the coroutine until it finishes, and then closes the loop. You should call it once, at the top level of your program, not inside another async function.
What await actually means
The `await` keyword does two things: it tells the current coroutine to pause until the awaited thing completes, and it hands control back to the event loop so other tasks can run in the meantime.
```python
import asyncio

async def fetch_data(label, delay):
    print(f"{label}: starting")
    await asyncio.sleep(delay)  # simulates a network call
    print(f"{label}: done after {delay}s")
    return f"{label} result"

async def main():
    result = await fetch_data("Task A", 2)
    print(result)

asyncio.run(main())
```
Here `asyncio.sleep()` is an async-aware sleep that yields control back to the event loop during the wait. This is what separates it from `time.sleep()`, which blocks the entire thread. Always use `asyncio.sleep()` inside async code; `time.sleep()` will freeze your event loop.
Running tasks concurrently with asyncio.gather
Running one coroutine at a time gives you clean code, but it doesn't save you any time. The real benefit appears when you run multiple coroutines concurrently using `asyncio.gather()`.
```python
import asyncio
import time

async def fetch_data(label, delay):
    print(f"{label}: starting")
    await asyncio.sleep(delay)
    print(f"{label}: done")
    return f"{label} result"

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(
        fetch_data("API one", 2),
        fetch_data("API two", 1),
        fetch_data("API three", 3),
    )
    elapsed = time.perf_counter() - start
    print(f"All done in {elapsed:.2f}s")
    print(results)

asyncio.run(main())
```
This completes in roughly three seconds (the slowest task), not six (the sum). `asyncio.gather()` schedules all the coroutines as concurrent tasks, collects their return values in order, and returns them as a list. If any coroutine raises an exception, `gather()` propagates the first exception immediately; note that by default the remaining tasks are *not* cancelled and keep running in the background, which is one reason `asyncio.TaskGroup` (Python 3.11+) is often preferred for error handling.
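If you instead want every result back regardless of failures, pass `return_exceptions=True` and exceptions are returned in the results list rather than raised. A minimal sketch:

```python
import asyncio

async def succeed():
    return "fine"

async def fail():
    raise ValueError("bad input")

async def main():
    # With return_exceptions=True, the exception object appears
    # in the results list instead of propagating out of gather()
    return await asyncio.gather(succeed(), fail(), return_exceptions=True)

results = asyncio.run(main())
print(results)  # ['fine', ValueError('bad input')]
```

You then check each element with `isinstance(result, Exception)` before using it.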
Using asyncio.create_task for more control
When you need to start a task and continue doing other work before collecting results, `asyncio.create_task()` gives you more flexibility than `gather`.
```python
import asyncio

# fetch_data is the same coroutine defined in the previous example

async def main():
    task_a = asyncio.create_task(fetch_data("API one", 2))
    task_b = asyncio.create_task(fetch_data("API two", 1))
    # You can do other work here while tasks run
    print("Tasks are running...")
    result_a = await task_a
    result_b = await task_b
    print(result_a, result_b)

asyncio.run(main())
```
A task created with `create_task()` starts running as soon as the event loop gets a chance (on its next iteration). Awaiting it later just waits for it to finish and retrieves the result.
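Tasks can also be cancelled before they finish, which is useful for timeouts and cleanup. A small sketch of the pattern:

```python
import asyncio

async def slow_job():
    try:
        await asyncio.sleep(10)   # pretend this is a long download
        return "finished"
    except asyncio.CancelledError:
        print("slow_job noticed the cancellation")
        raise  # re-raise so the task is properly marked as cancelled

async def main():
    task = asyncio.create_task(slow_job())
    await asyncio.sleep(0.1)      # give the task a moment to start
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        print("task was cancelled")
    return task.cancelled()

cancelled = asyncio.run(main())
print(cancelled)
```

Re-raising `CancelledError` after cleanup is the convention; swallowing it leaves the task looking as if it finished normally.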
Writing your own async functions correctly
A common misconception is that making a function async automatically makes it non-blocking. It doesn't. If the body of your async function does synchronous work, it blocks the event loop just like any other code.
```python
import asyncio
import time

async def bad_example():
    time.sleep(3)            # This blocks the ENTIRE event loop for 3 seconds
    return "done"

async def good_example():
    await asyncio.sleep(3)   # This yields control to the event loop
    return "done"
```
If you need to call a blocking function (like a CPU-heavy computation or a synchronous file operation), run it in a thread pool using `asyncio.to_thread()` (Python 3.9+) or `loop.run_in_executor()`. This offloads the blocking call to a worker thread so the event loop stays responsive.
```python
import asyncio

def blocking_task(n):
    # Simulate something CPU-heavy or a legacy blocking library
    total = sum(range(n))
    return total

async def main():
    result = await asyncio.to_thread(blocking_task, 10_000_000)
    print(result)

asyncio.run(main())
```
Common pitfalls
Forgetting to await a coroutine
If you call an async function without `await`, you get a coroutine object, not a result. Python will usually warn you with a `RuntimeWarning: coroutine '...' was never awaited` message, but the bug can be subtle if you don't notice the warning.
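A quick sketch of what the bug looks like next to the fix:

```python
import asyncio

async def compute():
    return 42

async def main():
    missing = compute()            # bug: no await, this is a coroutine object
    print(type(missing).__name__)  # 'coroutine', not 'int'
    missing.close()                # close it to silence the warning
    result = await compute()       # correct: actually runs the body
    return result

value = asyncio.run(main())
print(value)
```

If you ever see a variable holding something of type `coroutine` where you expected a value, a missing `await` is almost always the cause.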
Mixing sync and async code carelessly
Libraries like `requests` are synchronous. Calling `requests.get()` inside an async function blocks the event loop. Use an async-native HTTP library such as `httpx` or `aiohttp` instead; both provide proper async interfaces.
```python
import asyncio
import httpx

async def fetch_url(url):
    async with httpx.AsyncClient() as client:
        response = await client.get(url)
        return response.status_code

async def main():
    status = await fetch_url("https://httpbin.org/get")
    print(status)

asyncio.run(main())
```
Calling asyncio.run() inside a running loop
Jupyter notebooks run their own event loop. Calling `asyncio.run()` inside one raises a `RuntimeError`. In Jupyter, you can await coroutines directly at the top level, or use the `nest_asyncio` package as a workaround. In regular scripts, stick to a single `asyncio.run()` at the entry point.
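You can reproduce the error in a plain script by deliberately nesting `asyncio.run()`; a sketch:

```python
import asyncio

async def outer():
    coro = asyncio.sleep(0)
    try:
        asyncio.run(coro)   # illegal: a loop is already running here
    except RuntimeError as exc:
        return str(exc)
    finally:
        coro.close()        # avoid a "never awaited" warning

message = asyncio.run(outer())
print(message)
```

Inside async code the fix is always the same: `await` the coroutine instead of calling `asyncio.run()` again.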
Using async where it adds no value
If your function does purely CPU-bound work with no I/O at all, making it async provides zero benefit and adds overhead. Keep pure computation functions as regular `def` functions and call them normally (or offload them to a process pool if they're slow).
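For example, a slow pure-computation function can stay synchronous and be offloaded to a process pool from async code via `loop.run_in_executor()`; a minimal sketch:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def crunch(n):
    # Pure CPU work: keep it a regular function
    return sum(range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # Runs in a separate process, so neither the GIL
        # nor the event loop is blocked by the computation
        return await loop.run_in_executor(pool, crunch, 1_000_000)

if __name__ == "__main__":
    print(asyncio.run(main()))
```

The `if __name__ == "__main__"` guard matters here: process pools re-import the module in worker processes on some platforms.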
When to use async: a quick decision guide
| Situation | Recommendation |
|---|---|
| Multiple network/API calls | Use async: big wins here |
| Database queries (with async driver) | Use async |
| File I/O | Use `asyncio.to_thread()` |
| CPU-heavy computation | Use multiprocessing instead |
| Single network call, simple script | Sync is fine; don't complicate it |
| Web framework handlers (FastAPI, etc.) | Use async for I/O, sync for CPU work |
Async in a FastAPI context
If you use FastAPI, you've probably seen async route handlers. FastAPI supports both sync and async routes, and choosing correctly matters.
```python
from fastapi import FastAPI
import asyncio

app = FastAPI()

@app.get("/data")
async def get_data():
    # Async: good for awaiting DB queries or external API calls
    await asyncio.sleep(0.1)  # Replace with real async DB call
    return {"status": "ok"}

@app.get("/compute")
def compute_result():
    # Sync: FastAPI runs this in a thread pool automatically
    result = sum(range(1_000_000))
    return {"result": result}
```
FastAPI automatically runs synchronous route handlers in a thread pool, so they don't block the event loop. You only need `async def` when you're actually awaiting something inside the handler.
Wrapping up
Async/await in Python is a focused tool: it shines when you're waiting on I/O and want to keep your program busy in the meantime. Once you internalize that model, the syntax stops feeling arbitrary.
Here are concrete next steps you can take right now:
- Replace `time.sleep()` calls in any existing async code with `asyncio.sleep()` and verify the event loop no longer blocks.
- Pick one place in a project where you make multiple sequential HTTP requests and rewrite it using `asyncio.gather()`. Measure the difference.
- If you use `requests` in async code, swap it for `httpx` with its async client; the API is nearly identical, so the migration is low effort.
- Read the `asyncio` module documentation on `TaskGroup` (Python 3.11+), which provides a cleaner structured concurrency model than raw `gather`.
- If you're building an API, take a look at FastAPI; its async-first design makes everything covered here click in a real application context.