I've been a viewer of ThePrimeagen for quite some time now, and occasionally he mentions his favorite interview question: the Async Request Queue.
I may not be a lawnmower or a genius like Tom, but I, for one, am a senior software engineer at some startup, and I am very confident I can solve this problem.
First, let's identify the criteria for this question:
- It's a queue, a first in, first out data structure
- You enqueue a promise factory
- You must be able to return the promise when you enqueue
- There can be at most 3 tasks running at the same time
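Put together, the API we're aiming for should let callers do something like this (a rough usage sketch of the class we'll build below; the URL is just a placeholder):

```ts
// sketch.ts — intended usage, implemented later in this post
import { AsyncRequestQueue } from "./async-request-queue";

const queue = new AsyncRequestQueue(3); // at most 3 tasks running at once

// enqueue() takes a promise *factory* and hands back a promise that
// settles once the task has actually run and finished.
const user = await queue.enqueue(() =>
  fetch("https://example.com/api/user").then((res) => res.json()),
);
console.log(user);
```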
Now let's go over them one by one. Note that we're writing this in everyone's favorite language: TypeScript
🤮
- The ideal data structure for FIFO is a linked list, but there's no linked list in TS's standard library, so we have to write our own.
- We're going to make a class `AsyncRequestQueue` that has a public method `enqueue` and two private methods, `#next` and `#run`.
- In JS, a promise starts doing its work the moment it's created and there's no way to hold it back, which is why we wrap the work in a factory `() => Promise<T>` before passing it to `enqueue` (see the snippet after this list).
- The last requirement is solved with simple bookkeeping: we'll take `maxRunningTask` as a constructor parameter and add a `pending: number` property to `AsyncRequestQueue`.
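To make the promise-factory point concrete, here's a tiny illustration (my own example, not from the interview question) of why we can't enqueue a bare promise:

```ts
// A promise starts its work the moment it is created...
const eager = fetch("https://example.com/api/user"); // request is already in flight

// ...so to let the queue decide *when* the work starts, we hand it a
// factory instead, and only call it once a slot frees up.
const lazy = () => fetch("https://example.com/api/user"); // nothing happens yet
// later, inside the queue: lazy() actually kicks off the request
```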
Our scuffed TS LinkedList
We only need two methods (`pushFront` and `popBack`) for this. I borrowed the naming scheme from Rust's LinkedList because JS's `Array.shift` sounds stupid.
```ts
// linked-list.ts
type LinkedListNode<T> = {
  value: T;
  next?: LinkedListNode<T>;
};

export class LinkedList<T> {
  head?: LinkedListNode<T>;
  tail?: LinkedListNode<T>;
  length: number;

  constructor() {
    this.length = 0;
  }

  // Enqueue: append a new node after the current tail.
  pushFront(value: T): T {
    const node: LinkedListNode<T> = { value };
    if (!this.head || !this.tail) {
      this.head = node;
    } else {
      this.tail.next = node;
    }
    this.tail = node;
    this.length++;
    return node.value;
  }

  // Dequeue: remove and return the oldest value (the head), if any.
  popBack(): T | undefined {
    if (!this.head) {
      return undefined;
    }
    const node = this.head;
    this.head = this.head.next;
    this.length--;
    if (!this.length) {
      this.tail = undefined;
    }
    return node.value;
  }
}
```
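A quick sanity check (mine, not part of the original write-up) that values come back out in insertion order:

```ts
import { LinkedList } from "./linked-list";

const list = new LinkedList<string>();
list.pushFront("a");
list.pushFront("b");
list.pushFront("c");

console.log(list.popBack()); // "a" — first in, first out
console.log(list.popBack()); // "b"
console.log(list.length);    // 1
```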
Async Request Queue
Now I'm not gonna explain this line by line. Chances are, you are a programmer, and you know how to code. If not, then why the hell are you even here?
```ts
// async-request-queue.ts
import { LinkedList } from "./linked-list";

export class AsyncRequestQueue {
  queue: LinkedList<() => Promise<void>>;
  maxRunningTask: number;
  pending: number;

  constructor(maxRunningTask: number) {
    this.queue = new LinkedList();
    this.maxRunningTask = maxRunningTask;
    this.pending = 0;
  }

  enqueue<T>(task: () => Promise<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      this.queue.pushFront(async () => {
        this.pending++;
        try {
          // Await the task so `pending` only drops after it has settled;
          // otherwise the concurrency limit wouldn't actually hold.
          const result = await task();
          resolve(result);
        } catch (e) {
          reject(e);
        } finally {
          this.#next();
        }
      });
      this.#run();
    });
  }

  #next() {
    this.pending--;
    this.#run();
  }

  #run() {
    // Start the next queued task only while under the concurrency limit.
    if (this.pending < this.maxRunningTask && this.queue.length > 0) {
      const task = this.queue.popBack();
      task?.();
    }
  }
}
```
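One behaviour worth calling out: since `enqueue` rejects the returned promise when the task throws, callers can handle failures the usual way. A quick hypothetical check:

```ts
import { AsyncRequestQueue } from "./async-request-queue";

const q = new AsyncRequestQueue(3);

q.enqueue(async () => {
  throw new Error("boom");
}).catch((err) => {
  // the failure surfaces on the promise returned by enqueue()
  console.error("task failed:", (err as Error).message);
});
```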
Running the code
```ts
// index.ts
import { AsyncRequestQueue } from "./async-request-queue";

async function delayByMs(ms: number, id: string): Promise<number> {
  console.log("queued", id, ms); // logged when the task actually starts running
  return new Promise((resolve) => {
    setTimeout(() => {
      console.log("finished", id, ms);
      resolve(ms);
    }, ms);
  });
}

async function main() {
  const q = new AsyncRequestQueue(3);
  q.enqueue(() => delayByMs(1000, "a"));
  q.enqueue(() => delayByMs(3000, "b"));
  q.enqueue(() => delayByMs(1100, "c"));
  q.enqueue(() => delayByMs(10000, "d"));
  const weCanAwait = q.enqueue(() => delayByMs(3000, "e"));
  q.enqueue(() => delayByMs(3000, "f"));
  console.log("awaited e", await weCanAwait);
}

main().then();
```
That's it. Because only three tasks may run at once, d, e, and f don't start until earlier tasks free up a slot, so it should output the following:
queued a 1000
queued b 3000
queued c 1100
finished a 1000
queued d 10000
finished c 1100
queued e 3000
finished b 3000
queued f 3000
finished e 3000
awaited e 3000
finished f 3000
finished d 10000
Now that's how you do it, but... in an interview scenario you'll probably want to suggest improvements, or the interviewer will ask you for some. Here are a few I can think of:
- Abort signal: in this scenario you'd have to handle a race condition against tasks that are already running (a rough sketch follows below).
- EventEmitter: you can extend `AsyncRequestQueue` with Node.js' `EventEmitter` so that clients can subscribe to events that you'll implement.
- Priority queue.
- Dequeue (we did not implement removing queued tasks from `AsyncRequestQueue`).
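As an illustration of the abort-signal point, here's a minimal sketch of what cancellation could look like, written as a standalone helper around the queue (the helper, its name, and its behaviour are my own additions, not part of the solution above). It only covers tasks still waiting in the queue; a task that has already started would have to observe the signal itself, which is exactly the race mentioned in the first bullet.

```ts
// abortable-enqueue.ts — hypothetical sketch
import { AsyncRequestQueue } from "./async-request-queue";

export function enqueueAbortable<T>(
  queue: AsyncRequestQueue,
  task: () => Promise<T>,
  signal: AbortSignal,
): Promise<T> {
  // A promise that rejects as soon as the signal fires (and never resolves).
  const aborted = new Promise<never>((_, reject) => {
    if (signal.aborted) {
      reject(new Error("aborted"));
      return;
    }
    signal.addEventListener("abort", () => reject(new Error("aborted")));
  });

  const result = queue.enqueue(async () => {
    // If the abort happened while we were still waiting in the queue, skip the work.
    if (signal.aborted) throw new Error("aborted");
    return task();
  });

  // Whichever settles first wins: the abort or the task itself.
  return Promise.race([aborted, result]);
}
```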
Repo
Self Plug
Hi gamers! I'm Jasper, a senior software engineer with more than 5 years of dev experience. I primarily code in TypeScript 🤮, I know a bit of Rust, and I type on a Lily58, a column-staggered split keyboard. I use Arch and code in Neovim btw.