In this lab you will build a simulated concurrent URL fetcher using asyncio. Since we can't make real HTTP requests in this environment, you'll use asyncio.sleep() to simulate network latency.
Your Task
Implement the following in the code editor:
1. fetch(url, delay) coroutine
Takes a URL string and a delay (simulated latency in seconds)
Awaits asyncio.sleep(delay) to simulate network I/O
Returns a string in the format: "OK: {url} ({delay}s)"
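One possible shape for this coroutine (the URL below is just an illustrative placeholder):

```python
import asyncio

async def fetch(url: str, delay: float) -> str:
    # asyncio.sleep suspends this coroutine without blocking the event loop,
    # which is what lets other fetches run during the simulated network wait.
    await asyncio.sleep(delay)
    return f"OK: {url} ({delay}s)"
```

Note that the f-string renders the delay exactly as passed, so a delay of 0.5 produces "0.5s".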
2. fetch_all(urls) coroutine
Takes a list of (url, delay) tuples
Uses asyncio.gather() to fetch all URLs concurrently
Returns the list of results from gather, in the same order as the input tuples
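A sketch of how this might look; the key idea is unpacking the tuple list into individual fetch() calls for gather:

```python
import asyncio

async def fetch(url: str, delay: float) -> str:
    await asyncio.sleep(delay)  # simulated network I/O
    return f"OK: {url} ({delay}s)"

async def fetch_all(urls):
    # gather schedules all coroutines concurrently and returns their
    # results in the same order as the input list, not completion order.
    return await asyncio.gather(*(fetch(url, delay) for url, delay in urls))
```

Because gather preserves input order, the result list lines up with the original tuples even when slower fetches finish last.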
3. main() coroutine
Defines a list of at least 4 URLs with different delays (between 0.1 and 1.0 seconds)
Calls fetch_all() and stores results
Prints each result on its own line
Prints a summary line: "Fetched {n} URLs concurrently"
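Putting the pieces together, main() could look like this (the URLs and delays here are placeholder values; any four or more within the stated range work):

```python
import asyncio

async def fetch(url, delay):
    await asyncio.sleep(delay)
    return f"OK: {url} ({delay}s)"

async def fetch_all(urls):
    return await asyncio.gather(*(fetch(u, d) for u, d in urls))

async def main():
    # At least 4 URLs with delays between 0.1 and 1.0 seconds.
    urls = [
        ("https://example.com/a", 0.1),
        ("https://example.com/b", 0.4),
        ("https://example.com/c", 0.7),
        ("https://example.com/d", 1.0),
    ]
    results = await fetch_all(urls)
    for result in results:
        print(result)
    print(f"Fetched {len(results)} URLs concurrently")
```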
4. Entry point
Call asyncio.run(main()) at the bottom
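The entry point is a single call; wrapping it in a __name__ guard is a common convention (not required by the spec) that keeps the file importable without running the loop:

```python
import asyncio

async def main():
    # stub standing in for the step-3 body
    pass

if __name__ == "__main__":
    # asyncio.run creates the event loop, runs main() to completion,
    # and closes the loop.
    asyncio.run(main())
```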
Requirements
All functions must use async def
Concurrency must be achieved with asyncio.gather()
Do NOT use threading or time.sleep()
The total runtime should be close to the maximum individual delay, not the sum of all delays
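You can check the last requirement empirically: with gather, wall-clock time tracks the slowest single fetch. A minimal timing sketch (timed_demo and its URL/delay values are hypothetical, chosen only to make the max/sum gap visible):

```python
import asyncio
import time

async def fetch(url, delay):
    await asyncio.sleep(delay)
    return f"OK: {url} ({delay}s)"

async def timed_demo():
    urls = [("a", 0.1), ("b", 0.2), ("c", 0.3)]
    start = time.perf_counter()
    await asyncio.gather(*(fetch(u, d) for u, d in urls))
    # Concurrent: elapsed is roughly 0.3s (the max), not 0.6s (the sum).
    return time.perf_counter() - start

elapsed = asyncio.run(timed_demo())
```

If you accidentally await the fetches one by one in a loop instead of using gather, elapsed jumps to roughly the sum of the delays.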