How to Make 2500 HTTP Requests in 2 Seconds with Async & Await
# DISCORD (NEW): https://discord.gg/C4J2uckpbR
This is a comparison of using async/await and asyncio with aiohttp in Python versus using threads and concurrent.futures, to understand how we can make several thousand HTTP requests in just a few seconds. Learning how to do this and understanding how it works will help you when it comes to running your own servers and web services, and stress testing any API environments you offer.
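A minimal sketch of the async pattern the video compares. The `asyncio.sleep` call is a stand-in for the actual aiohttp request (in the real version you would open an `aiohttp.ClientSession` and `await session.get(url)` inside `fake_request`); the request count and delay here are illustrative placeholders.

```python
import asyncio
import time

async def fake_request(i):
    # stand-in for an awaited aiohttp GET; awaiting yields control
    # back to the event loop so the other requests can proceed
    await asyncio.sleep(0.1)
    return i

async def main(n):
    # asyncio.gather schedules all coroutines concurrently on one event loop
    return await asyncio.gather(*(fake_request(i) for i in range(n)))

start = time.perf_counter()
results = asyncio.run(main(100))
elapsed = time.perf_counter() - start
# 100 concurrent 0.1s waits overlap, so this finishes far faster
# than the ~10s a sequential loop would take
```

The key point is that none of the coroutines block each other: while one is waiting on I/O, the event loop runs the others.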
# https://github.com/jhnwr/non-blocking-requests/tree/master
# text articles from: https://realpython.com/
Support Me:
# Patreon: https://www.patreon.com/johnwatsonrooney (NEW)
# Scraper API: https://www.scrapingbee.com/?fpr=jhnwr
# Proxies: https://proxyscrape.com/?ref=jhnwr
# Amazon UK: https://amzn.to/2OYuMwo
# Hosting: Digital Ocean: https://m.do.co/c/c7c90f161ff6
# Gear Used: https://jhnwr.com/gear/ (NEW)
————————————-
Disclaimer: These are affiliate links and as an Amazon Associate I earn from qualifying purchases
————————————-
by John Watson Rooney
linux http server
You do not need to explicitly create a task on line 18
John, your channel is absolutely fantastic! congratulations!
Every DevOps developer's nightmare.
How can we apply the same thing in PHP? Currently I am using curl_multi, but it drives up my server utilization. Is there any alternative way to do the same in PHP?
Is the normal requests library a blocking library?
Can I do the same thing with Flask?
I was looking for a good explanation of the difference between async and multithreading… so threading is for doing things in parallel, and async is for waiting for future tasks to complete without stalling the current program?
Well, Python 3.13 will bring news related to sub-interpreters, which (hopefully) improves performance even with the GIL, since real multithreading will (finally) be possible by default. From what I've been reading lately, I/O-bound problems are better solved with the asyncio library or threads. For everything else, use the multiprocessing library. Also, I recently checked out the joblib library. It's really simple and fits well as an alternative to threads in Python, though there is the cost of adding a dependency. Even so, I recommend it!
While using executors, if Tomcat's processing capacity is set to only 200, what happens after 200 requests? Will we have to wait until those 200 requests are processed by the server, or will it pick up new requests while a call passes from the server to the DB?
Why does the threaded version take so much longer than the async one, when it should be sending all the requests simultaneously as well? The synchronous one, on the other hand, waits for each response before it sends the next request. You could have a version that doesn't use any parallel programming facilities in Python but still works in parallel, by having the async/parallel stuff handled by a Python package or by an underlying C library. E.g. if you used the (much lower level) socket module and just repeatedly called send(), your requests would all go into the send buffer and the OS would send them while your Python program is waiting or doing something else.
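For comparison, here is a sketch of the threaded approach from the video using `concurrent.futures`. The `time.sleep` is a stand-in for a blocking `requests.get()` call; like socket I/O, it releases the GIL, which is why threads still overlap for I/O-bound work. The worker count and task count are illustrative.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    # stand-in for a blocking requests.get(); sleeping releases the GIL
    # the same way waiting on a socket does
    time.sleep(0.1)
    return i

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=50) as pool:
    # map fans the tasks out across the pool and returns results in order
    results = list(pool.map(fake_request, range(100)))
threaded = time.perf_counter() - start
# 100 tasks across 50 workers run in roughly two 0.1s batches,
# much faster than a 10s sequential loop
```

One reason the async version can still win at high request counts is that each OS thread carries scheduling and memory overhead, while coroutines are cheap objects multiplexed on a single thread.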
For me, ThreadPoolExecutor is the best way to go. I've been using it for the last few years and it always gives the best results. It's easy to handle in standalone Python scripts, or even in Python websites with Celery.
Can you please help with how asyncio would work in AWS Lambda?
that's 2499 requests
I can't believe how fast you got to the point. thank you for reading the room
This method puts a lot of pressure on the machine. For my use case, I want to send more than 170 thousand requests, and when I use ThreadPoolExecutor I can divide the requests into smaller groups using the max_workers argument. Can that be done using async & await?
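Yes, this can be done. One common pattern (a sketch, not from the video itself) is an `asyncio.Semaphore`, which caps how many coroutines are in flight at once, playing the same role as `max_workers` in ThreadPoolExecutor. The sleep stands in for the actual aiohttp request, and the counts are placeholders.

```python
import asyncio

async def fetch(sem, i):
    # the semaphore admits at most `limit` coroutines into this
    # block at a time; the rest wait their turn
    async with sem:
        await asyncio.sleep(0.01)  # stand-in for an aiohttp request
        return i

async def main(n, limit):
    sem = asyncio.Semaphore(limit)
    # all n coroutines are created, but only `limit` run concurrently
    return await asyncio.gather(*(fetch(sem, i) for i in range(n)))

results = asyncio.run(main(500, 50))
```

With real aiohttp you can also pass `aiohttp.TCPConnector(limit=...)` to the session to cap concurrent connections at the transport level.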
Thank you John
work in parallel with python threads? nope, GIL.
Very well explained
Thanks!
Thanks for the video, makes me appreciate go routines and their simplicity even more 🙂
More polite title: How to do 2500 handshakes in 2 seconds and disappear.
Next video: How to make 2500 people shake hands with each other in 2 seconds, for fun!
require 'popcorn' // btw
How to send a million requests?
"how to build a ddos attacker…"
In this example, when we send 2500 async requests, does each request use the same session, or does each API call have its own session?
ThreadPoolExecutor is slower than Thread but saves memory. You don't need to use asyncio just for waiting: you can use joins, barriers, wait groups, and mutex locks with Thread to achieve the same. It comes down to preference, although asyncio is more streamlined, and Thread is better suited for manual optimization that requires utmost speed and care.
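The join-and-lock approach mentioned above can be sketched with the stdlib `threading` module. The sleep stands in for blocking network I/O, and the thread count is a placeholder.

```python
import threading
import time

results = []
lock = threading.Lock()  # mutex guards the shared results list

def worker(i):
    time.sleep(0.05)  # stand-in for a blocking HTTP request
    with lock:
        results.append(i)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # join blocks until each thread finishes; no asyncio needed
```

Because the threads finish in whatever order the scheduler runs them, the results arrive unordered, which is one reason the executor's ordered `map` is often more convenient.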