Interview Prep · JavaScript & Node.js

JavaScript & Node.js
Interview — Every Question, Told as a Story

A complete walkthrough of the questions actually asked in JS and Node.js rounds — from hoisting to the event loop, from closures to streams. Each concept is taught as a small story so the answer sticks the first time.

Foundation · Mental Model

1 · How JavaScript Actually Runs

Before any specific question makes sense, the interviewer wants to know if you have the right mental picture of what is even happening when JS runs your code. Get this picture right and half the trick questions answer themselves.

The Story Imagine JavaScript as a single chef in a tiny kitchen. There is exactly one chef (one thread), one cooking surface (the call stack), and a long line of order tickets sitting in a tray (the callback queue). Whenever the chef finishes the dish on the surface, she grabs the next ticket from the tray and starts cooking it. While a dish is on the surface, the chef cannot touch anything else — no chopping, no plating, no answering the door. That is what "single-threaded" means. The trick is that the chef has helpers outside the kitchen (the browser or Node.js APIs — timers, network, file system) who do the slow work for her, then drop a ticket back in her tray when they're done. That is "non-blocking I/O".
[Diagram: the JS engine's single call stack hands async calls to the runtime APIs (setTimeout, network, FS, timers); finished work lands in the macrotask queue (setTimeout, I/O, UI events) or the microtask queue (Promise.then, queueMicrotask); the microtask queue is drained first, then one macrotask runs per loop tick.]
So what? JavaScript is single-threaded, but the runtime around it is not. The event loop is the rule that says "drain the microtask queue completely, then take one macrotask, then drain microtasks again, then repeat." Almost every async question is testing whether you know that order.
Variables · Hoisting

2 · var, let, const & Hoisting

Hoisting is the feature that surprises every beginner: variables seem to "exist" before they were declared. The interview question is rarely "what is hoisting?" — it is a code snippet where the answer depends on whether you used var, let, or const.

The Story Picture JavaScript reading your file in two passes. On the first pass, it walks through your code with a notebook, writing down every variable name it sees. var names go in the notebook with the value undefined. let and const names also go in the notebook — but with a red sticker that says "do not touch yet". On the second pass, JS actually executes the lines top to bottom, peeling off the red stickers when it reaches the declaration. That red-sticker zone — between the top of the scope and the actual let/const line — is called the Temporal Dead Zone (TDZ).
classic-trick.js
// var is hoisted with value undefined
console.log(a); // undefined  (no error!)
var a = 10;

// let / const are hoisted but in TDZ
console.log(b); // ❌ ReferenceError: Cannot access 'b' before initialization
let b = 20;

var

Function-scoped. Hoisted & initialised to undefined. Re-declaration allowed. Avoid in modern code.

let

Block-scoped. Hoisted but in TDZ until the line runs. Re-declaration in the same scope throws. Re-assignment allowed.

const

Block-scoped. TDZ. Cannot be re-assigned. The binding is constant — but if it points to an object, the object's contents can still change.

What is the Temporal Dead Zone?
The window between when a let/const variable is hoisted (it exists in the scope) and when its declaration line is actually executed. Touching it in that window throws a ReferenceError. The TDZ exists so that modern variables behave predictably — you can't accidentally read or write them before they are properly defined.
Is const truly immutable?
No. Only the binding is immutable. const arr = [1,2]; arr.push(3); works fine — the array's identity (the reference) didn't change. To freeze contents use Object.freeze(arr) (shallow) or a deep-freeze utility for nested objects.
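To make the binding-vs-contents distinction concrete, a minimal sketch using const and Object.freeze (freeze is shallow by design):

```javascript
const config = { retries: 3, nested: { timeout: 500 } };

// const blocks re-assignment of the binding, not mutation of the object
config.retries = 5;              // allowed: the object's contents changed

Object.freeze(config);
try {
  config.retries = 10;           // ignored in sloppy mode, TypeError in strict mode
} catch (e) { /* strict-mode path */ }

config.nested.timeout = 100;     // still works: freeze is shallow

console.log(config.retries);         // 5
console.log(config.nested.timeout);  // 100
```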
Scope · Memory

3 · Scope & Closures

Closures are the single most-asked JS concept. They are also the foundation of currying, memoization, module patterns, and most of the React Hook tricks you'll see later.

The Story Sarah writes a thank-you note and tucks it into a small backpack. Then she gives the backpack to her friend Raj and walks away. Years later Raj opens the backpack and reads the note — Sarah is long gone, but her words are still there, exactly as she left them. A closure is that backpack. When you define a function inside another function, the inner function carries a backpack of the outer function's variables with it. Even after the outer function has returned and "walked away", the inner function still has access to those variables.
closure-counter.js
function makeCounter() {
  let count = 0;            // lives in the backpack
  return function() {
    count++;
    return count;
  };
}

const c = makeCounter();
c(); // 1
c(); // 2
c(); // 3 — count survives between calls
What is a closure, in one sentence?
A function bundled together with the variables of the lexical scope in which it was defined — so it can still read (and write) them even after the outer scope has finished executing.
Classic loop trick: what does this print?
for (var i = 0; i < 3; i++) {
  setTimeout(() => console.log(i), 100);
}
// prints 3, 3, 3 — all three callbacks share the SAME `i` (var is function-scoped)

Replace var with let and you get 0, 1, 2 — because let creates a fresh binding for each loop iteration, so each callback gets its own backpack with its own i.

Real-world uses of closures?
  • Data privacy: emulate private fields by keeping state in the closure and only exposing methods.
  • Once / memoize: cache results in a closure variable so repeated calls are free.
  • Currying / partial application: capture early arguments, return a function waiting for the rest.
  • Event handlers: remember which item was clicked without storing it on the DOM.
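As one concrete case from the list above, the "once" pattern is just a closure holding a flag and a cached result. A minimal sketch (the name `once` is illustrative):

```javascript
// once(fn): run fn the first time, then hand back the cached result forever
function once(fn) {
  let called = false, result;
  return function (...args) {
    if (!called) {               // first call: compute and remember
      called = true;
      result = fn.apply(this, args);
    }
    return result;               // every later call: the backpack answers
  };
}

let connections = 0;
const connect = once(() => ++connections);
connect(); // 1
connect(); // 1: cached, connections is still 1
```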
Memory caveat. Closures hold references to their backpack — that variable cannot be garbage-collected as long as the closure is reachable. Forgetting old closures (especially attached to DOM nodes) is the most common cause of memory leaks in long-running JS apps.
Binding

4 · The this Keyword

"What is this?" is JavaScript's most-asked, most-mistaken question. The trick is to remember that this isn't decided when the function is written — it's decided when the function is called.

The Story Think of this as the speaker pointing at the audience. The pronoun "you" doesn't have a fixed meaning — it depends on who the speaker is pointing at right now. The same way, this doesn't have a fixed meaning — it depends on how the function is called. There are exactly four "ways to call" — and each one decides what this points to.
Call style        | Example                                 | this is…
1. Method call    | obj.fn()                                | obj
2. Plain call     | fn()                                    | undefined in strict mode, else globalThis
3. new call       | new Fn()                                | the freshly created object
4. Explicit bind  | fn.call(x) · fn.apply(x) · fn.bind(x)   | x

Arrow functions break the rule (on purpose)

Arrow functions don't have their own this. They borrow it from the surrounding scope at the moment they were defined. That is exactly why we love them inside callbacks — no more const self = this; dance.

this-trap.js
const user = {
  name: 'Sarah',
  greet() { console.log('Hi ' + this.name); }
};

user.greet();             // "Hi Sarah" — method call
const g = user.greet;
g();                      // "Hi undefined" — plain call, this is global
g.call(user);             // "Hi Sarah" — explicit bind
Difference between call, apply, and bind?
  • call(thisArg, a, b, c) — invokes the function immediately, args passed individually.
  • apply(thisArg, [a,b,c]) — same, but args as an array.
  • bind(thisArg, ...) — does NOT invoke; returns a new function with this permanently bound. Useful for event handlers that you'll attach later.
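A small sketch showing all three side by side (names are illustrative):

```javascript
function introduce(greeting, punctuation) {
  return greeting + ', I am ' + this.name + punctuation;
}
const sarah = { name: 'Sarah' };

introduce.call(sarah, 'Hi', '!');        // 'Hi, I am Sarah!'  (invoked now, args listed)
introduce.apply(sarah, ['Hi', '!']);     // same result, args as an array

const intro = introduce.bind(sarah, 'Hello');  // NOT invoked; this and 'Hello' pre-filled
intro('?');                              // 'Hello, I am Sarah?'
```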
Why does setTimeout(this.fn, 1000) sometimes lose this?
Because passing this.fn strips the method off the object — it becomes a plain function reference. When the timer fires, JS calls it as a plain call (rule 2), so this becomes global / undefined. Fix it by binding (this.fn.bind(this)) or wrapping in an arrow (() => this.fn()).
OOP

5 · Prototypes & Inheritance

JavaScript doesn't have classes the way Java does — even the class keyword is sugar over prototypes. Once you understand the prototype chain, every "why does arr.map exist?" question answers itself.

The Story Imagine a chain of librarians. You ask the first librarian for a book on "map". She doesn't have it. She points up a shelf and says "ask the librarian one floor up." That librarian also doesn't have it, but she points one more floor up. Eventually the top librarian has the book and hands it down. That is the prototype chain. Every object has a hidden link (__proto__) to a "parent" object. When you ask for a property, JS walks up the chain until it finds it — or hits null at the top.
proto-chain.js
const arr = [1, 2, 3];

// arr ─→ Array.prototype ─→ Object.prototype ─→ null
arr.map(x => x * 2);   // found on Array.prototype
arr.hasOwnProperty(0); // found on Object.prototype

class is just sugar

class-vs-proto.js
class Animal {
  constructor(name) { this.name = name; }
  speak() { console.log(this.name + ' makes a noise'); }
}

// is roughly equivalent to:
function Animal(name) { this.name = name; }
Animal.prototype.speak = function() {
  console.log(this.name + ' makes a noise');
};
Difference between __proto__ and prototype?

prototype lives on functions (specifically constructor functions). It's the object that becomes the prototype of instances created with new.

__proto__ (or Object.getPrototypeOf(obj)) lives on every object. It's the actual link the prototype chain walks. So new Animal().__proto__ === Animal.prototype.

How would you implement classical inheritance without class?
function Dog(name) {
  Animal.call(this, name);          // 1. inherit fields
}
Dog.prototype = Object.create(Animal.prototype); // 2. inherit methods
Dog.prototype.constructor = Dog;             // 3. fix constructor pointer
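Putting the three steps together into a self-contained, runnable sketch (Animal and Dog defined inline so the snippet stands alone):

```javascript
function Animal(name) { this.name = name; }
Animal.prototype.speak = function () {
  return this.name + ' makes a noise';
};

function Dog(name) {
  Animal.call(this, name);                         // 1. inherit fields
}
Dog.prototype = Object.create(Animal.prototype);   // 2. inherit methods
Dog.prototype.constructor = Dog;                   // 3. fix constructor pointer

Dog.prototype.speak = function () {                // override the inherited method
  return this.name + ' barks';
};

const rex = new Dog('Rex');
rex.speak();             // 'Rex barks'
rex instanceof Animal;   // true: the chain walks up to Animal.prototype
```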
Async · Concurrency

6 · The Event Loop & Microtasks

Every async question is the event loop wearing a different hat. If you only memorise one diagram for your interview, memorise this one.

The Story Picture a busy bartender. The bar has two ticket trays. The microtask tray sits right under her hand — every time she finishes pouring a drink, she empties this tray completely before doing anything else. The macrotask tray sits across the room — after the microtask tray is empty, she walks over, grabs one ticket, comes back, and starts pouring. Then she empties the microtask tray again. That's the loop. Promise callbacks land in the close tray (microtasks). setTimeout callbacks land in the far tray (macrotasks). That single rule explains the answer to most async puzzles.
order-puzzle.js
console.log('A');
setTimeout(() => console.log('B'), 0);
Promise.resolve().then(() => console.log('C'));
console.log('D');

// Output: A, D, C, B
// 1. A and D run synchronously (call stack)
// 2. Stack empties → drain microtasks → C
// 3. Then take one macrotask → B
What lands in the microtask queue?
Promise .then / .catch / .finally callbacks, queueMicrotask(), and MutationObserver callbacks. They run before the next macrotask, after the current one finishes.
What lands in the macrotask queue?
setTimeout, setInterval, I/O callbacks, UI events (click, scroll), setImmediate in Node.
Can a microtask flood starve macrotasks?
Yes. If a microtask schedules another microtask, which schedules another… the loop never reaches the macrotask phase and the page can freeze. This is why infinite-promise-chains are dangerous.
Async · Promises

7 · Promises & async / await

Promises gave us a cleaner way out of "callback hell". async/await then gave us a cleaner way out of .then chains. Underneath, it's all still the event loop.

The Story You order a pizza. The cashier hands you a buzzer (the Promise). The buzzer is in one of three states: pending (still cooking), fulfilled (ready, here's your pizza), or rejected (sorry, we're out of cheese). You don't stand at the counter — you walk away with the buzzer and tell it: "when you're fulfilled, do this; if rejected, do that". That's .then and .catch. async/await is the same buzzer, but you're allowed to write your code as if you were standing at the counter — JS handles the walking-away for you behind the scenes.

The four Promise combinators

Promise.all

Waits for ALL to fulfil. If any one rejects → the whole thing rejects immediately. Use when every result is required (e.g. fetch user + their posts + their settings).

Promise.allSettled

Waits for ALL to finish, regardless of success or failure. Returns an array of {status, value/reason}. Use when you want a partial result (e.g. analytics dashboard with optional widgets).

Promise.race

Settles with whichever promise settles first — fulfilled OR rejected. Useful for timeouts: race a fetch against a 5-second timer.

Promise.any

Resolves with the first FULFILLED one. Only rejects if ALL reject (with an AggregateError). Use for fallback mirrors — "give me whichever CDN responds first."
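The timeout pattern mentioned under Promise.race is a frequent follow-up. A minimal sketch (withTimeout is a hypothetical helper name, not a built-in):

```javascript
// Reject if the wrapped promise takes longer than `ms` milliseconds
function withTimeout(promise, ms) {
  const timer = new Promise((_, reject) =>
    setTimeout(() => reject(new Error('timed out')), ms)
  );
  return Promise.race([promise, timer]);   // whichever settles first wins
}

const slow = new Promise(res => setTimeout(() => res('data'), 50));

withTimeout(slow, 200).then(v => console.log(v));          // 'data' (fast enough)
withTimeout(slow, 10).catch(e => console.log(e.message));  // 'timed out'
```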

async / await vs .then chains — what's the real difference?

Behaviour-wise: almost nothing. Under the hood, await x behaves essentially like x.then(result => ...rest of the function). The engine rewrites your linear-looking function into a state machine of .then-style continuations.

Readability-wise: huge. async/await keeps the code linear, lets you use try/catch for errors, and works naturally with loops (for...of + await).

Common mistake: awaiting in a forEach loop
// ❌ does NOT wait — forEach ignores the returned promise
users.forEach(async u => { await save(u); });

// ✅ runs sequentially
for (const u of users) await save(u);

// ✅ runs in parallel
await Promise.all(users.map(save));
What happens if you don't .catch a rejected promise?
In Node.js it triggers an unhandledRejection event. Since Node 15 the default is to crash the process (good — surfaces bugs early). In browsers it shows up in DevTools console. Always either .catch or wrap your await in try/catch.
Quirks

8 · Equality, Coercion & Truthiness

JavaScript's loose equality is the source of the language's worst-known memes. Interviewers love it because it forces you to demonstrate that you actually understand type coercion.

The Story There are two judges in the JS courthouse. Judge === is strict — same type AND same value, or you lose. Judge == is lenient — she'll convert your types behind the scenes to "give you the benefit of the doubt." Most production code uses Judge Strict because Judge Lenient's rulings are unpredictable: 0 == '' is true, null == undefined is true, but null == 0 is false. Use === by default.
coercion-traps.js
[] == ![]              // true:  ![] is false; [] becomes '' then 0, false becomes 0
0 == ''                // true:  '' becomes 0
0 == '0'               // true:  '0' becomes 0
'' == '0'              // false: both are strings, compared as strings, no coercion
null == undefined      // true:  by spec
null == 0              // false: null loosely equals only undefined
NaN == NaN             // false: NaN is never equal to anything, even itself

Truthy / Falsy

Only eight values are falsy: false, 0, -0, 0n, "" (empty string), null, undefined, NaN. Everything else (including [] and {}!) is truthy.

Difference between ?? and ||?

|| falls back on any falsy value. ?? falls back ONLY on null or undefined.

const port = config.port || 3000; // 0 → 3000  ❌ (user wanted 0)
const port = config.port ?? 3000; // 0 → 0     ✅
How do you correctly check for NaN?
Use Number.isNaN(x) (strict, only true for the actual NaN value). Avoid the older global isNaN() which coerces — isNaN('hello') returns true because it tries to convert the string first.
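The difference in one tiny snippet:

```javascript
Number.isNaN(NaN);       // true
Number.isNaN(0 / 0);     // true:  0/0 produces NaN
Number.isNaN('hello');   // false: no coercion, 'hello' is not the NaN value
isNaN('hello');          // true:  legacy global isNaN coerces 'hello' to NaN first
```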
References · Mutation

9 · Shallow vs Deep Copy

The bug story behind this question is identical at every company: "I changed one item in userB and somehow userA changed too."

The Story Imagine a Google Doc shared between two people. If Raj and Sarah both have the SAME share-link, edits Raj makes appear on Sarah's screen — they share one document. That's a reference. Now imagine Sarah hits "Make a copy" — she gets her own copy, edits don't affect Raj. That's a shallow copy. But wait — the doc has an embedded spreadsheet. Sarah's copy still links to the same spreadsheet. Editing the embedded sheet still leaks across. To truly separate them she'd need to also copy the sheet. That recursive copy of every nested thing is a deep copy.
copy-tradeoffs.js
const a = { name: 'Raj', addr: { city: 'Pune' } };

// SHALLOW — only top-level keys are duplicated
const b = { ...a };
b.addr.city = 'Delhi';
console.log(a.addr.city); // 'Delhi' — leaked!

// DEEP — modern, built-in, handles cycles, Maps, Sets, Dates
const c = structuredClone(a);
c.addr.city = 'Delhi';
console.log(a.addr.city); // 'Pune' — safe
Why is JSON.parse(JSON.stringify(obj)) a bad deep clone?
It quietly drops undefined, functions, symbols. It turns Date into a string, Map/Set into {}. It throws on circular references. Use structuredClone() instead (built into modern Node and browsers).
Performance

10 · Debounce vs Throttle

Both limit how often a function runs. They're cousins but solve different problems. Interviewers love asking you to implement both from scratch.

The Story Imagine an elevator. Debounce is the elevator that waits — every time a new person presses the button, the doors restart their wait. The lift only departs once nobody has pressed for X seconds. Perfect for "wait until the user stops typing, THEN search". Throttle is the elevator on a fixed schedule — it leaves every 30 seconds no matter how many times you press. Perfect for "limit scroll handler to fire at most every 100ms".
debounce-throttle.js
function debounce(fn, delay) {
  let timer;
  return function(...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), delay);
  };
}

function throttle(fn, limit) {
  let waiting = false;
  return function(...args) {
    if (waiting) return;
    fn.apply(this, args);
    waiting = true;
    setTimeout(() => waiting = false, limit);
  };
}
When to use which?
  • Debounce → search-as-you-type, window resize end, autosave-on-stop-typing.
  • Throttle → scroll listener, mouse-move tracker, button-spam protection, drag handler.
Functional

11 · Currying & Memoization

Currying — one argument at a time

The Story A vending machine that asks for coins one at a time instead of all at once. Each time you drop a coin (an argument), it gives you back a new vending machine that's waiting for the next coin. Only when you've dropped all the coins does the snack actually drop. add(2)(3)(4) — each call returns a new function until all arguments are gathered.
curry.js
function curry(fn) {
  return function curried(...args) {
    if (args.length >= fn.length) return fn.apply(this, args);
    return (...next) => curried(...args, ...next);
  };
}

const add = curry((a, b, c) => a + b + c);
add(1)(2)(3);     // 6
add(1, 2)(3);     // 6

Memoization — remember past answers

The Story Sarah's a librarian. The first time someone asks "what's the sum of digits of 9876?" she actually computes it. Second time, she reaches into a drawer (the cache) and hands the answer back. Memoization is just a closure with a cache.
memoize.js
function memoize(fn) {
  const cache = new Map();
  return function(...args) {
    const key = JSON.stringify(args);
    if (!cache.has(key)) cache.set(key, fn.apply(this, args));
    return cache.get(key);
  };
}
Where is memoization risky?
When the function is impure (depends on time, network, user) — same arguments may not always give the same answer. Also when args are large objects — JSON-keying gets expensive. And the cache grows forever unless you cap it (LRU).
Modern Syntax

12 · ES6+ Goodies — the One-Liners You'll Be Asked About

Spread & Rest

const arr2 = [...arr1, 4];      // spread
function sum(...nums) { ... } // rest

Same syntax, opposite meaning. Spread expands, rest collects.

Destructuring

const { name, age = 18 } = user;
const [first, ...rest] = arr;

Pull values out of objects/arrays in one line. Default values fill in for undefined.

Optional chaining

user?.address?.city
api?.fetch?.()

Returns undefined instead of throwing when a link in the chain is null/undefined.

Nullish coalescing

value ?? 'fallback'

Falls back ONLY on null/undefined, not on 0 or "".

Map & Set

Map keeps insertion order, accepts any key type, has .size. Set stores unique values — handy one-liner: [...new Set(arr)] dedupes an array.

Generators

function* range(n) {
  for (let i = 0; i < n; i++) yield i;
}

Functions that pause at yield and resume on .next(). Foundation of async iterators.

Node.js · Foundation

13 · Why Node.js Exists

Node.js is JavaScript's V8 engine ripped out of Chrome and stitched onto a C library called libuv. The marriage gave the language two things it never had: file-system access and a way to do async I/O at scale.

The Story In 2009 Ryan Dahl was watching an upload progress bar in a browser. Underneath, Apache was spinning up a whole new thread for each upload — at 10,000 uploads, the server died from thread overhead. Ryan asked: "What if the server was one thread that just delegated the slow stuff and kept moving?" He took Chrome's V8 (fast JS engine), bolted on libuv (a C library that talks to the OS for non-blocking I/O), and Node.js was born. The whole point of Node is: one thread can serve thousands of concurrent connections, because no single connection ever blocks the thread.
[Diagram: ① your JS code runs on ② the V8 engine, which feeds ③ the libuv-managed event loop; the loop sends non-blocking I/O to ⑤ the OS kernel (epoll · kqueue · IOCP) and blocking ops (fs · crypto · dns) to ④ the thread pool (4 threads by default); completed work flows back into the event loop.]

The five parts that make Node tick

Your JS Code

Your app.js, your routes, your business logic. This is the part you write — everything below is the runtime carrying you.

Why it exists: well, it's the app. Nothing happens without it.

V8 Engine

Google's open-source JavaScript engine — same one in Chrome. It parses your JS, JIT-compiles it to machine code, runs it.

Why it exists: JS is a language, not a runtime. You need an engine to actually execute it. V8 was chosen because it's fast and embeddable.

Event Loop

The traffic cop that decides which callback runs next. Implemented in libuv (C). It cycles through phases — timers, I/O callbacks, idle, poll, check, close — picking up work in order.

Why it exists: with one thread, you need a strict scheduling rule for callbacks, or chaos. The event loop IS that rule.

Thread Pool

libuv keeps a pool of background threads (4 by default, configurable via UV_THREADPOOL_SIZE) for operations the OS can't do non-blocking — file I/O, DNS lookups, crypto, zlib.

Why it exists: not every operation has a non-blocking OS API. Without the pool, fs.readFile would block the event loop and freeze the server.

OS Kernel

The actual non-blocking I/O facilities of the operating system: epoll on Linux, kqueue on macOS/BSD, IOCP on Windows. libuv hides the differences.

Why it exists: network I/O genuinely is async at the OS level — Node didn't invent it, libuv just exposes it cleanly.

So what? "Node is single-threaded" is half the truth. Your JavaScript runs on one thread. But behind the curtain, libuv keeps 4+ background threads working on file/crypto operations, and the OS handles network I/O entirely on its own. That's how a single Node process serves 10,000 concurrent connections without breaking a sweat.
Async

14 · The Node Event Loop — Phases

The browser event loop has two queues. The Node event loop has SIX phases. Interviewers ask about the order because most race-condition bugs in Node trace back to "I thought my callback ran before that other callback."

[Diagram: the loop cycles through six phases in a fixed order: ① Timers (setTimeout · setInterval) → ② Pending (callbacks deferred from the previous tick) → ③ Idle/Prepare (internal use) → ④ Poll (fetch new I/O, execute callbacks) → ⑤ Check (setImmediate) → ⑥ Close (socket close events) → back to Timers.]
The Story Picture a postman doing a daily route through six neighbourhoods in fixed order. Each neighbourhood is a phase. He starts at Timers and delivers anything whose timer has elapsed. Then he moves to the Poll neighbourhood — this is the busiest one, where most of his packages (network I/O, file reads) live. Then Check, where setImmediate callbacks live. Then Close for any socket-close callbacks. After each neighbourhood (and even between deliveries within one), he stops, opens his special microtask envelope, and processes everything in there — that's process.nextTick and resolved Promises. Then he carries on. Loop forever.

Timers

Runs setTimeout and setInterval callbacks whose threshold has elapsed. "5ms timeout" doesn't mean exactly 5ms — it means at least 5ms.

Pending

Internal — TCP error callbacks deferred from the previous loop tick. Rarely your code.

Poll

The big one. Picks up new I/O events (incoming requests, completed reads) and runs their callbacks. If the poll queue is empty, the loop may block here waiting for I/O.

Check

Runs setImmediate callbacks. The ONLY phase that runs them. Use setImmediate when you want "after the current poll cycle".

Close

Runs close-event callbacks like socket.on('close').

Microtasks (every gap)

process.nextTick queue first, then Promise queue. Drained between every operation, not just between phases. nextTick beats Promises.

Async · Ordering

15 · process.nextTick · setImmediate · setTimeout(0)

This is the trick question every Node interviewer keeps in their pocket. The three look like "do it later" but they're scheduled in different phases.

API                  | When it runs                                                        | Beats
process.nextTick(cb) | drained right after the current operation, before any I/O or timer | everything below
Promise .then(cb)    | microtask queue, after nextTick                                     | timers & immediate
setTimeout(cb, 0)    | Timers phase, at least ~1ms later                                   |
setImmediate(cb)     | Check phase, after Poll                                             |
order-puzzle-node.js
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
process.nextTick(() => console.log('nextTick'));
Promise.resolve().then(() => console.log('promise'));

// nextTick → promise → (timeout or immediate, order varies in main module)
// Inside an I/O callback, setImmediate ALWAYS beats setTimeout(0).
When should I prefer setImmediate over setTimeout(0)?
Inside an I/O callback when you want to "yield and continue after the current poll cycle". setImmediate is the documented contract for that. setTimeout(0) can drift to ≥1ms and depends on system clock.
Why is process.nextTick dangerous?
It runs BEFORE the loop continues. A recursive nextTick can starve I/O — your server stops accepting connections because the loop never reaches the Poll phase. Use it carefully; setImmediate is usually safer.
I/O

16 · Streams & Buffers

Streams are how Node processes data that doesn't fit in memory — like a 5GB log file or a video upload. Instead of loading the whole thing, you process it in chunks as it flows past.

The Story Imagine a fire brigade passing buckets of water down a line. The water (your data) never sits in any one person's hands for long — each person grabs a bucket, processes it, hands it on. That's a stream. The alternative (the non-stream way) is to fill an entire swimming pool first and then carry it — fine for a glass of water, impossible for a fire.

The four stream types

Readable

You can read FROM it. Examples: fs.createReadStream(), http.IncomingMessage (the request).

Writable

You can write TO it. Examples: fs.createWriteStream(), http.ServerResponse.

Duplex

Both readable and writable, independent channels. Example: TCP sockets.

Transform

Duplex where output is computed from input. Examples: zlib.createGzip(), crypto.createCipheriv().

stream-pipeline.js
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

// gzip a 5GB file using ~64KB of RAM
await pipeline(
  fs.createReadStream('huge.log'),
  zlib.createGzip(),
  fs.createWriteStream('huge.log.gz')
);
What is backpressure?
When the writable side is slower than the readable side, data piles up in memory. Streams handle this automatically: readable.pipe(writable) pauses the source when the destination's internal buffer fills up. Use pipeline() instead of raw .pipe() — it propagates errors and cleans up streams properly.
What is a Buffer?
A fixed-size chunk of raw binary memory outside V8's heap. Used to handle bytes — files, network packets, images. Created with Buffer.alloc(n) (zero-filled, safe) or Buffer.from(data). Avoid the deprecated new Buffer() — it allocates uninitialized memory and was a known security hole.
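A short sketch of the Buffer basics described above:

```javascript
const safe = Buffer.alloc(4);            // 4 zero-filled bytes (the safe allocator)
console.log(safe);                       // <Buffer 00 00 00 00>

const hello = Buffer.from('héllo', 'utf8');
console.log(hello.length);               // 6: 'é' takes two bytes in UTF-8
console.log(hello.toString('utf8'));     // 'héllo'

// Raw byte access: buffers are index-addressable like arrays
console.log(hello[0]);                   // 104 (the byte for 'h')
```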
Events

17 · EventEmitter — The Pub/Sub at Node's Core

Half of Node is built on EventEmitter — streams, HTTP servers, child processes, sockets all extend it. Knowing the API also signals you understand the observer pattern.

emitter.js
const { EventEmitter } = require('events');
const bus = new EventEmitter();

bus.on('order:placed', (id) => console.log('email ' + id));
bus.on('order:placed', (id) => console.log('metric ' + id));
bus.emit('order:placed', 42);

// 'once' — listener auto-removed after first fire
// 'off' / 'removeListener' — detach
// default max listeners = 10 (warning above)
Why does Node print "MaxListenersExceededWarning"?
Each emitter has a soft limit (default 10) of listeners per event. Going over is usually a leak — typically you forgot to off() a listener inside a handler that re-runs. Bump the limit with setMaxListeners(20) only if you genuinely need more — otherwise fix the leak.
Scaling

18 · Cluster vs Worker Threads

One Node process uses one CPU core. But your server has 8 cores. How do you use them all?

The Story Imagine you run a busy restaurant with one chef and the orders are stacking up. You have two options. Option A: open identical clones of the same restaurant next door — same menu, separate kitchens. A host out front routes diners to whichever clone is free. That is cluster — multiple Node processes, each with its own memory, sharing the same listening port. Option B: keep one restaurant but hire specialised assistants (sous-chefs) for the slow tasks. The head chef keeps taking orders; the assistants chop, mix, plate in parallel and hand back finished work. That is worker threads — multiple threads inside one process, sharing memory.
                   | Cluster                               | Worker Threads
Unit               | OS process                            | thread inside one process
Memory             | separate (no shared state)            | shared via SharedArrayBuffer
Best for           | scaling HTTP servers across cores     | CPU-heavy tasks (image processing, parsing, hashing)
Crash blast radius | one process; others survive           | one thread; but a bad process.exit() kills all
Tooling            | built-in cluster module · PM2 · K8s   | built-in worker_threads
When should I prefer worker threads over cluster?
When the bottleneck is CPU on a single request — e.g., an endpoint that compresses a 10MB payload and blocks the event loop for 200ms. A cluster process would still freeze for 200ms on that one request; a worker thread offloads the compression and the main thread keeps serving others.
Process

19 · Child Processes — spawn vs exec vs fork

spawn

Launch any binary, stream its stdout/stderr. Best for long-running or large-output processes — never buffers, won't OOM.

spawn('ffmpeg', ['-i', ...])

exec

Run a shell command, get the FULL stdout/stderr in a callback once it's done. Buffered — careful with large output (default 1MB cap).

exec('ls -la', cb)

fork

A special case of spawn that runs another Node script with an IPC channel for message passing. The cluster module is built on it.

fork('./worker.js')
Security warning. Never pass user input to exec — it goes through a shell, opening you to command injection. Use spawn with an args array, where arguments are passed safely without shell parsing.
Modules

20 · CommonJS vs ES Modules

Node started life with CommonJS (require / module.exports) because ES modules didn't exist when Node was born. Today both work, and the difference matters in interviews.

| | CommonJS | ES Modules |
|---|---|---|
| Syntax | const x = require('x') | import x from 'x' |
| Loading | Synchronous | Asynchronous |
| Resolution | At runtime, dynamic | Static, at parse time |
| Tree-shaking | No | Yes (bundlers can drop unused exports) |
| top-level await | No | Yes |
| Trigger | Default for .js if no "type" | "type":"module" in package.json or .mjs |
Can I require() an ESM module?
Not directly with the classic require in older Node. Use dynamic import('./esm-mod.js') which returns a Promise and works from CJS. Newer Node (≥22) added experimental require(esm) for synchronous interop.
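A sketch of the interop from inside a CJS file — node:os here is just a stand-in for whatever module you need to load:

```javascript
// In a CommonJS file: dynamic import() returns a promise, so it works
// even where a classic require() of an ESM module would throw.
async function loadModule() {
  const os = await import('node:os');
  return os.platform();
}

loadModule().then((platform) => console.log('running on:', platform));
```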
What is the module wrapper in CJS?
Before running your file, Node wraps it in a function: (function(exports, require, module, __filename, __dirname) { /* your code */ }). That's why those five identifiers are "magically available" inside every CJS file — they're just function arguments.
Express

21 · Express & the Middleware Pipeline

Express is a thin layer over Node's http module. Its core idea is the middleware pipeline — a function chain that each request walks through.

The Story Picture an airport. A passenger (the request) walks through a series of stations: check-in, security scan, passport control, boarding. Each station can: stamp the passport (mutate the request), reject the passenger (send a response), or wave them on (call next()). Express middleware is exactly that — a function with signature (req, res, next) chained in order. The first one to call res.send() ends the journey; the rest never run. The first one to call next(err) jumps to the error-handling station.
express-pipeline.js
const express = require('express');
const crypto = require('node:crypto');

const app = express();

app.use(express.json());                         // 1. parse body
app.use((req, res, next) => { req.id = crypto.randomUUID(); next(); }); // 2. tag
app.use(authMiddleware);                          // 3. auth or 401
app.get('/users/:id', getUser);                  // 4. handler

// 5. error-handling middleware — 4 args is the magic signature
app.use((err, req, res, next) => {
  console.error(err);
  res.status(500).json({ error: 'oops' });
});
How does the error-handling middleware know it's the error one?
Pure convention: Express checks the function's .length. A 4-arg function (err, req, res, next) is treated as an error handler. It only runs when something upstream calls next(err) or throws inside an async handler that you wrapped properly.
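The arity check is plain JavaScript — Function.prototype.length counts declared parameters, which is all Express inspects:

```javascript
// Function.length = number of declared parameters (before any defaults/rest).
const normal = (req, res, next) => { next(); };
const errorish = (err, req, res, next) => { res.end(); };

console.log(normal.length, errorish.length); // 3 4 — Express keys off this
```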
Errors

22 · Error Handling — the Right Way

Operational errors

Expected runtime problems: bad user input, network timeout, DB unavailable. Catch them, respond gracefully, log them.

Programmer errors

Bugs: undefined property, wrong type, logic mistake. You can't recover. Log, then crash and let the supervisor (PM2 / K8s) restart.

Should I catch uncaughtException and keep running?
No. By the time it fires, the process state is undefined — half-mutated objects, leaked file descriptors. Log the error, try to flush logs gracefully, then call process.exit(1) and let your orchestrator restart. The official Node guidance is "crash on programmer errors".
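A sketch of the recommended last-resort handler — the flush step is whatever your logging setup provides:

```javascript
process.on('uncaughtException', (err) => {
  // 1. Record what happened — state is untrustworthy, so keep this simple.
  console.error('fatal:', err);
  // 2. Flush logs / close servers here if your tooling allows it, then...
  // 3. ...exit non-zero so PM2 / Kubernetes restarts a clean process.
  process.exit(1);
});
```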
How do I forward async errors in Express?
In modern Express (≥5) async route handlers' rejections auto-forward to the error middleware. In Express 4 you must wrap them:
const wrap = (fn) => (req, res, next) => Promise.resolve(fn(req, res, next)).catch(next);
app.get('/x', wrap(async (req, res) => { ... }));
Production

23 · Memory Leaks & Debugging

"My Node service starts at 200MB and slowly creeps to 2GB then dies" — every Node engineer has lived this. The four usual suspects:

① Forgotten listeners

Adding emitter.on(...) inside a function that runs on every request — and never calling off. Each request piles a new closure on. Look for the MaxListenersExceeded warning.

② Closures holding big data

A timer, callback, or cache referencing a big object keeps the entire object alive. Audit globals; weak references (WeakMap, WeakRef) can help.

③ Unbounded caches

An in-memory Map that grows forever. Use lru-cache with a max size or TTL.

④ Module-level state in long-lived processes

Anything assigned at the top of a file lives forever. Resist the temptation to keep "just one global counter".
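For suspect ③, a minimal bounded-cache sketch using Map's insertion order — in production reach for lru-cache instead:

```javascript
// Evicts the least-recently-used entry once `max` is exceeded.
class BoundedCache {
  constructor(max) { this.max = max; this.map = new Map(); }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value); // re-insert marks it most recently used
    return value;
  }
  set(key, value) {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      this.map.delete(this.map.keys().next().value); // oldest = first key
    }
  }
}

const cache = new BoundedCache(2);
cache.set('a', 1); cache.set('b', 2); cache.set('c', 3); // 'a' evicted
console.log(cache.get('a'), cache.get('c')); // undefined 3
```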

Tools

  • node --inspect app.js + Chrome DevTools → live heap snapshots, CPU profiles.
  • --prof + node --prof-process → V8 tick profiler.
  • clinic.js doctor / flame / bubbleprof → opinionated diagnostic with auto-recommendations.
  • process.memoryUsage() → quick log line; watch heapUsed trend over time.
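The last tool as a sketch — a periodic log line whose heapUsed trend is the signal:

```javascript
const mb = (bytes) => (bytes / 1024 / 1024).toFixed(1);

function logMemory() {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  console.log(`rss=${mb(rss)}MB heapUsed=${mb(heapUsed)}MB heapTotal=${mb(heapTotal)}MB`);
}

logMemory();                       // one-off snapshot
// setInterval(logMemory, 30_000); // enable in a long-lived service and watch the trend
```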
Security

24 · Security Cheat Sheet

Inputs

  • Validate & sanitize every body / query / param (zod, joi).
  • Never pass user input to exec, eval, or template strings sent to the shell.
  • Cap body size with express.json({ limit: '100kb' }) — defends against memory DoS.

Headers

  • Use helmet() middleware — sets ~12 secure-by-default headers (CSP, HSTS, X-Frame-Options).
  • Set cors() with an allowlist, never * on credentialed endpoints.

Auth

  • Hash passwords with bcrypt or argon2 — never SHA-256.
  • Sign JWTs with strong secrets / RS256; set short TTL + refresh tokens.
  • Store secrets in env / vault — never in git.

Dependencies

  • Run npm audit in CI; pin versions with a lockfile.
  • Use npm ci (not install) in CI for reproducible builds.
  • Drop unused deps — every package is supply-chain risk.
Quickfire

25 · Rapid-Fire Q&A

The remaining short-form questions interviewers fire when time is running out. Memorise the one-liners.

Why is Node.js single-threaded but scalable?
JS runs on one thread, but I/O is delegated to libuv (thread pool) and the OS kernel (epoll/kqueue). The thread is rarely blocked, so a single process can serve thousands of concurrent connections.
What is V8?
Google's open-source JavaScript engine, written in C++. Same one in Chrome. JIT-compiles JS to optimized machine code. Node.js embeds it.
What is libuv?
A C library that gives Node its event loop, thread pool, and cross-platform async I/O abstractions over epoll (Linux), kqueue (macOS), IOCP (Windows).
Difference between fs.readFile and fs.createReadStream?
readFile loads the entire file into memory before the callback fires — fine for small configs, dangerous for large files. createReadStream emits chunks as they're read — constant-memory, scales to any file size.
What is __dirname?
In CJS, the absolute path of the directory the current file lives in. Not available in ESM — use import.meta.url + fileURLToPath instead.
How does require caching work?
First require('x') loads + executes x and caches its module.exports. Every subsequent require('x') returns the same exports object — without re-running the file. That's why module-level state behaves like a singleton.
What do process exit codes mean?
0 = success, 1 = generic failure, >1 = specific error class. Set with process.exit(code). Orchestrators (PM2, Kubernetes) often restart on non-zero.
What is the package-lock.json for?
Records the EXACT version of every package and sub-dependency installed, so npm ci on another machine produces the identical tree. Without it, two builds days apart can diverge if a sub-dep released a patch.
What's nodemon doing under the hood?
Watches your files with chokidar, kills the running Node process on change, restarts it. Pure dev convenience — never use in production; use PM2 / systemd.
Why is JSON.parse blocking?
It runs synchronously on the event loop. A 100MB JSON string can freeze the server for hundreds of ms. For huge payloads, stream-parse with stream-json or offload to a worker thread.
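You can watch the blocking directly — the payload size here is arbitrary, and timings vary by machine:

```javascript
// Build a largish JSON string, then time a synchronous parse of it.
const big = JSON.stringify({ data: Array.from({ length: 1_000_000 }, (_, i) => i) });

const t0 = process.hrtime.bigint();
JSON.parse(big); // nothing else — no timer, no I/O callback — runs during this
const ms = Number(process.hrtime.bigint() - t0) / 1e6;

console.log(`parse of ${(big.length / 1e6).toFixed(1)}MB blocked for ~${ms.toFixed(1)}ms`);
```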
Difference between res.send, res.json, res.end?
res.end is raw — closes the response, no Content-Type. res.send auto-detects type, sets headers, supports strings/buffers/objects. res.json stringifies + sets application/json. Use res.json for APIs, res.end when streaming a custom body.
What is CORS and why does it bite Node devs?
Cross-Origin Resource Sharing — browser security policy that blocks JS on a.com from calling b.com unless b.com sends explicit Access-Control-Allow-Origin headers. Solve in Express with the cors middleware, configured with an explicit origin allowlist.
What's the difference between app.use and app.get?
app.use mounts middleware that runs for ALL HTTP methods at a path prefix. app.get mounts a handler for GET requests at an exact path. Order matters — middleware must be declared before the routes that depend on it.
Final tip. The interviewer rarely cares whether your answer is textbook-perfect. They care whether you can explain why the design is the way it is, and what breaks if it weren't. Every story above is built around the "why" — keep that pattern in your answers, and you'll sound like an engineer instead of a flashcard.
