Sinisterly
Asynchronous Web Scraping with aiohttp - Printable Version

+- Sinisterly (https://sinister.ly)
+-- Forum: Coding (https://sinister.ly/Forum-Coding)
+--- Forum: Coding (https://sinister.ly/Forum-Coding--71)
+--- Thread: Asynchronous Web Scraping with aiohttp (/Thread-Asynchronous-Web-Scraping-with-aiohttp)



Asynchronous Web Scraping with aiohttp - vluzzy - 01-02-2024

import aiohttp
import asyncio

async def fetch(url):
    # Open a session, GET the URL, and return the response body as text
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

async def main():
    urls = ['https://example1.com', 'https://example2.com', 'https://example3.com']
    # Schedule all fetches concurrently and wait for every result
    tasks = [fetch(url) for url in urls]
    results = await asyncio.gather(*tasks)
    print("Results:", results)

if __name__ == '__main__':
    asyncio.run(main())
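

Note that the snippet above opens a fresh ClientSession for every URL. The aiohttp docs recommend reusing a single session (one connection pool) across requests, which matters once you scrape more than a handful of pages. Below is a minimal sketch of the same scraper with a shared session; the example URLs are placeholders, swap in your own targets.

import aiohttp
import asyncio

async def fetch(session, url):
    # Reuse the caller's session instead of creating one per request
    async with session.get(url) as response:
        return await response.text()

async def main():
    urls = ['https://example1.com', 'https://example2.com', 'https://example3.com']
    # One session shared by all requests, so connections can be pooled
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, url) for url in urls]
        results = await asyncio.gather(*tasks)
    print("Results:", results)

if __name__ == '__main__':
    asyncio.run(main())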