# 03 - JavaScript Runtime
## 1. Event Loop (Macro vs Microtasks)
What: JS is single-threaded. The event loop is the mechanism that handles async operations by queuing work and executing it when the call stack is empty.
```
┌───────────────────────────────────────────────┐
│ Call Stack                                    │
│ (executes one function at a time)             │
└──────────────┬────────────────────────────────┘
               │ when empty, check:
               ▼
┌───────────────────────────────────────────────┐
│ Microtask Queue                               │ ← ALL microtasks drain first
│ Promise.then, queueMicrotask,                 │
│ MutationObserver                              │
└──────────────┬────────────────────────────────┘
               │ when empty, then:
               ▼
┌───────────────────────────────────────────────┐
│ Macrotask Queue                               │ ← ONE macrotask, then back to micro
│ setTimeout, setInterval, setImmediate,        │
│ I/O callbacks, UI rendering, MessageChannel   │
└───────────────────────────────────────────────┘
```

Critical rule: After each macrotask, ALL microtasks run before the next macrotask.
```js
console.log('1');                      // sync → runs first
setTimeout(() => console.log('2'), 0); // macrotask → queued
Promise.resolve().then(() => {
  console.log('3');                    // microtask → queued
  Promise.resolve().then(() => console.log('4')); // microtask queued from a microtask
});
console.log('5');                      // sync → runs second

// Output: 1, 5, 3, 4, 2
// Sync code first, then ALL microtasks (including nested ones), then the macrotask
```

Where this matters for React:

- `setState` batching uses microtasks (React 18)
- `useEffect` cleanup runs synchronously before the next effect, but the effect itself is scheduled as a macrotask-like callback
- `flushSync` forces a synchronous state update (bypasses batching)
`requestAnimationFrame`: runs before the browser paints, but AFTER microtasks and before the next macrotask.

```
[macrotask] → [all microtasks] → [rAF callbacks] → [paint] → [next macrotask]
```

## 2. Task Starvation
What: When high-priority tasks keep being added, preventing lower-priority tasks from ever running.
Microtask starvation:
```js
function flood() {
  Promise.resolve().then(flood); // endlessly queues microtasks
}
flood();
// setTimeout, UI rendering, user input → NOTHING else runs
// The browser freezes because microtasks drain before anything else
```

Microtasks can starve macrotasks (and rendering) because the event loop drains the entire microtask queue before moving on.
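A bounded version of the same effect is safe to run: queue a short microtask chain next to a zero-delay timer and record the order. A minimal sketch (`order` and `queueChain` are illustrative names):

```typescript
const order: string[] = [];

setTimeout(() => order.push('macrotask'), 0); // queued first, still runs last

// Chain three microtasks; each one queues the next
function queueChain(n: number): void {
  if (n === 0) return;
  queueMicrotask(() => {
    order.push(`micro-${n}`);
    queueChain(n - 1);
  });
}
queueChain(3);

setTimeout(() => {
  // Every microtask (including ones queued by other microtasks)
  // ran before the first macrotask ever got a turn
  console.log(order.join(' → ')); // → micro-3 → micro-2 → micro-1 → macrotask
}, 10);
```

Scale `queueChain(3)` up to `queueChain(Infinity)` and you have the `flood()` freeze above.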
In React concurrent mode: React's scheduler prevents starvation with time slicing. It yields to the browser every ~5ms, ensuring user input and rendering get a chance to run. High-priority updates (user input) can also preempt lower-priority renders.
In practice:
- Don't recursively queue microtasks
- Use `setTimeout` or `requestIdleCallback` for background work
- Use `scheduler.postTask()` (Scheduler API) for explicit priority control
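The yield-every-few-milliseconds idea can be sketched in plain code. This is not React's actual scheduler, just a minimal illustration of time slicing (`processInSlices` and the 5 ms budget are assumed names/values):

```typescript
// Process a large array in time slices, yielding to the event loop
// whenever the budget is spent so input and rendering can run
async function processInSlices<T>(
  items: T[],
  work: (item: T) => void,
  budgetMs = 5,
): Promise<void> {
  let deadline = Date.now() + budgetMs;
  for (const item of items) {
    work(item);
    if (Date.now() >= deadline) {
      // Yield via a macrotask; pending input and paint happen here
      await new Promise<void>((resolve) => setTimeout(resolve, 0));
      deadline = Date.now() + budgetMs;
    }
  }
}
```

Each `await` gives queued macrotasks (and the browser's render steps) a chance to run between slices.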
## 3. Priority Inversion in Async Code
What: When a low-priority task blocks a high-priority task because the high-priority task depends on the low-priority result.
Example:
```
High priority: Show search results (needs data from API)
Low priority:  API request to fetch data (network I/O)
```

The high-priority UI update can't happen until the low-priority network request completes. The high-priority task is "inverted" to the priority of the low-priority task it depends on.

In frontend context:
```tsx
// User types in search box (high priority: show input)
// Search results need an API call (lower priority)
// If we await the API call in the render path, input becomes laggy

// Solution: separate priorities
function Search() {
  const [query, setQuery] = useState('');
  const deferredQuery = useDeferredValue(query); // deferred = lower priority
  return (
    <>
      <input value={query} onChange={e => setQuery(e.target.value)} /> {/* HIGH priority */}
      <Results query={deferredQuery} /> {/* LOW priority, won't block input */}
    </>
  );
}
```

Solution patterns:

- `startTransition` / `useDeferredValue` for UI priority
- Scheduler API (`scheduler.postTask()`) for explicit priority control
- Web Workers for CPU-bound work (completely off the main thread)
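`scheduler.postTask()` doesn't ship in every browser, so real code usually feature-detects it. A hedged sketch of such a wrapper (the `postTask` helper and its `setTimeout` fallback are assumptions, not a standard API):

```typescript
type TaskPriority = 'user-blocking' | 'user-visible' | 'background';

// Run a callback through scheduler.postTask when available,
// falling back to a zero-delay macrotask otherwise
function postTask<T>(
  callback: () => T,
  priority: TaskPriority = 'user-visible',
): Promise<T> {
  const scheduler = (globalThis as any).scheduler;
  if (scheduler?.postTask) {
    return scheduler.postTask(callback, { priority });
  }
  return new Promise((resolve) => setTimeout(() => resolve(callback()), 0));
}
```

The fallback loses the priority semantics, of course; it only preserves the "run later, off the current task" behavior.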
## 4. AbortController
What: A mechanism to cancel async operations (fetch requests, event listeners, timers, streams).
```js
// Cancel a fetch
const controller = new AbortController();

fetch('/api/data', { signal: controller.signal })
  .then(res => res.json())
  .then(data => setData(data))
  .catch(err => {
    if (err.name === 'AbortError') {
      console.log('Request cancelled');
    } else {
      throw err;
    }
  });

// Later: cancel it
controller.abort();
```

Common React pattern (cancel on unmount or dependency change):
```js
useEffect(() => {
  const controller = new AbortController();

  async function fetchData() {
    try {
      const res = await fetch(`/api/users/${id}`, { signal: controller.signal });
      const data = await res.json();
      setUser(data);
    } catch (err) {
      if (err instanceof Error && err.name !== 'AbortError') {
        setError(err.message);
      }
    }
  }

  fetchData();
  return () => controller.abort(); // cleanup cancels the in-flight request
}, [id]);
```

Abort with a reason (newer API):

```js
controller.abort(new Error('User navigated away'));
// signal.reason holds the value passed to abort(), and the fetch
// promise rejects with that value instead of a generic AbortError
```

`AbortSignal.timeout` (newer):
```js
fetch('/api/data', { signal: AbortSignal.timeout(5000) }); // auto-abort after 5s
```

`AbortSignal.any` (combine signals):

```js
const timeout = AbortSignal.timeout(5000);
const manual = new AbortController();
const combined = AbortSignal.any([timeout, manual.signal]);

fetch('/api/data', { signal: combined }); // aborts on timeout OR manual abort
```

## 5. Backpressure in Streams API
What: When a data producer creates data faster than the consumer can process it, backpressure is the mechanism to slow down or pause the producer.
```
Producer (network) → 100MB/s → Transform → Consumer (UI render) → 10MB/s
                                               ↑
                                  Can't keep up! Memory fills up.
                                  Backpressure needed.
```

The Streams API handles this:
```ts
const response = await fetch('/api/large-data');
const reader = response.body!.getReader();

const stream = new ReadableStream({
  async pull(controller) {
    const { done, value } = await reader.read();
    if (done) {
      controller.close();
      return;
    }
    controller.enqueue(value);
    // pull() is only called when the consumer is ready for more data
    // = automatic backpressure
  },
});
```

TransformStream with backpressure:
```ts
const transform = new TransformStream({
  transform(chunk, controller) {
    const processed = processChunk(chunk);
    controller.enqueue(processed);
    // If downstream can't consume, this naturally pauses
  },
});

// Pipe with automatic backpressure
response.body
  .pipeThrough(transform)
  .pipeTo(writableStream);
```

Why it matters: Processing large files (CSV export on a dashboard), streaming server responses (SSE for real-time transactions), large JSON parsing.
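The pull contract is easy to observe directly: the underlying source's `pull()` runs only as the consumer reads, so a slow reader automatically throttles production. A minimal sketch (runnable in Node 18+ or modern browsers; `consumeAll` is an illustrative name):

```typescript
let produced = 0;

// Source that produces exactly one chunk per pull() call
const stream = new ReadableStream<number>(
  {
    pull(controller) {
      produced += 1;
      controller.enqueue(produced);
      if (produced === 3) controller.close();
    },
  },
  { highWaterMark: 1 }, // buffer at most one chunk ahead of the consumer
);

async function consumeAll(): Promise<number[]> {
  const reader = stream.getReader();
  const chunks: number[] = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
}
```

With `highWaterMark: 1`, after the initial fill each `pull()` happens only in response to a `read()`; if the consumer stops reading, production stops too.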
## 6. Streaming Fetch Response Handling
What: Process a fetch response as it arrives, chunk by chunk, instead of waiting for the entire response.
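One detail worth calling out first: network chunks can split a multi-byte UTF-8 character, which is why chunk-by-chunk decoding passes `{ stream: true }` so `TextDecoder` buffers incomplete byte sequences across calls. A quick illustration:

```typescript
const decoder = new TextDecoder();
const bytes = new TextEncoder().encode('héllo'); // 'é' is two bytes in UTF-8

// Simulate a chunk boundary landing in the middle of 'é'
const chunk1 = bytes.slice(0, 2);
const chunk2 = bytes.slice(2);

const text =
  decoder.decode(chunk1, { stream: true }) + // buffers the partial 'é'
  decoder.decode(chunk2, { stream: true });

console.log(text); // → héllo
```

Without `{ stream: true }`, the partial `é` in the first chunk would decode to a replacement character.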
```ts
async function streamResponse(url: string, onChunk: (text: string) => void) {
  const response = await fetch(url);
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}

// Usage: stream a large transaction list
streamResponse('/api/transactions/export', (chunk) => {
  appendToTable(parseCSVChunk(chunk));
});
```

Server-Sent Events (SSE) for real-time:
```js
// The server sends events line by line
const eventSource = new EventSource('/api/transactions/live');

eventSource.onmessage = (event) => {
  const transaction = JSON.parse(event.data);
  addTransaction(transaction);
};

eventSource.onerror = () => {
  eventSource.close(); // cleanup on error
};
```

NDJSON (Newline-Delimited JSON) streaming:
```ts
async function* streamNDJSON<T>(url: string): AsyncGenerator<T> {
  const response = await fetch(url);
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop()!; // keep the incomplete last line in the buffer
    for (const line of lines) {
      if (line.trim()) yield JSON.parse(line) as T;
    }
  }

  // Flush: parse the final line if the stream didn't end with a newline
  if (buffer.trim()) yield JSON.parse(buffer) as T;
}

// Usage
for await (const tx of streamNDJSON<Transaction>('/api/transactions/stream')) {
  updateDashboard(tx);
}
```
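The buffering logic is easy to unit-test if the parser accepts any `ReadableStream` instead of fetching a URL itself. A sketch under that assumption (`parseNDJSON` is an illustrative name; runnable in Node 18+):

```typescript
// Parse newline-delimited JSON from any byte stream
async function* parseNDJSON<T>(
  stream: ReadableStream<Uint8Array>,
): AsyncGenerator<T> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop()!; // keep the incomplete last line buffered
    for (const line of lines) {
      if (line.trim()) yield JSON.parse(line) as T;
    }
  }
  // Flush the final line when the stream lacks a trailing newline
  if (buffer.trim()) yield JSON.parse(buffer) as T;
}
```

Because it takes a stream, a test can feed it an in-memory body, e.g. `new Blob(['{"id":1}\n{"id":2}']).stream()`, with no network involved.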