A complete walkthrough of the questions actually asked in JS and Node.js rounds — from hoisting to the event loop, from closures to streams. Each concept is taught as a small story so the answer sticks the first time.
Before any specific question makes sense, the interviewer wants to know if you have the right mental picture of what is even happening when JS runs your code. Get this picture right and half the trick questions answer themselves.
Hoisting is the feature that surprises every beginner: variables seem to "exist" before they were declared. The interview question is rarely "what is hoisting?" — it is a code snippet where the answer depends on whether you used var, let, or const.
var names go in the notebook with the value undefined. let and const names also go in the notebook — but with a red sticker that says "do not touch yet". On the second pass, JS actually executes the lines top to bottom, peeling off the red stickers when it reaches the declaration. That red-sticker zone — between the top of the scope and the actual let/const line — is called the Temporal Dead Zone (TDZ).
```js
// var is hoisted with value undefined
console.log(a); // undefined (no error!)
var a = 10;

// let / const are hoisted but in TDZ
console.log(b); // ❌ ReferenceError: Cannot access 'b' before initialization
let b = 20;
```
- **var:** Function-scoped. Hoisted & initialised to undefined. Re-declaration allowed. Avoid in modern code.
- **let:** Block-scoped. Hoisted but in TDZ until the line runs. Re-declaration in the same scope throws. Re-assignment allowed.
- **const:** Block-scoped. TDZ. Cannot be re-assigned. The binding is constant — but if it points to an object, the object's contents can still change.
**What exactly is the TDZ?** The window between when a let/const variable is hoisted (it exists in the scope) and when its declaration line is actually executed. Touching it in that window throws a ReferenceError. The TDZ exists so that modern variables behave predictably — you can't accidentally read or write them before they are properly defined.

**Is const truly immutable?** No. `const arr = [1,2]; arr.push(3);` works fine — the array's identity (the reference) didn't change. To freeze contents use `Object.freeze(arr)` (shallow) or a deep-freeze utility for nested objects.

Closures are the single most-asked JS concept. They are also the foundation of currying, memoization, module patterns, and most of the React Hook tricks you'll see later.
```js
function makeCounter() {
  let count = 0; // lives in the backpack
  return function() {
    count++;
    return count;
  };
}

const c = makeCounter();
c(); // 1
c(); // 2
c(); // 3 — count survives between calls
```
```js
for (var i = 0; i < 3; i++) {
  setTimeout(() => console.log(i), 100);
}
// prints 3, 3, 3 — all three callbacks share the SAME `i` (var is function-scoped)
```
Replace var with let and you get 0, 1, 2 — because let creates a fresh binding for each loop iteration, so each callback gets its own backpack with its own i.
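The same sharing can be seen without timers by stashing closures in an array (a small sketch; the `fns`, `withLet`, and `withVar` names are illustrative):

```js
const fns = [];
for (let i = 0; i < 3; i++) fns.push(() => i);
const withLet = fns.map(f => f());    // [0, 1, 2] — fresh binding per iteration

const fnsVar = [];
for (var j = 0; j < 3; j++) fnsVar.push(() => j);
const withVar = fnsVar.map(f => f()); // [3, 3, 3] — one shared `j`
```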
"What is this?" is JavaScript's most-asked, most-mistaken question. The trick is to remember that this isn't decided when the function is written — it's decided when the function is called.
Think of this as the pronoun "you", and the call site as the speaker pointing at the audience. The pronoun "you" doesn't have a fixed meaning — it depends on who the speaker is pointing at right now. The same way, this doesn't have a fixed meaning — it depends on how the function is called. There are exactly four "ways to call" — and each one decides what this points to.
| Call style | Example | this is… |
|---|---|---|
| 1. Method call | obj.fn() | obj |
| 2. Plain call | fn() | undefined in strict mode, else globalThis |
| 3. new call | new Fn() | the freshly created object |
| 4. Explicit bind | fn.call(x) · fn.apply(x) · fn.bind(x) | x |
Arrow functions don't have their own this. They borrow it from the surrounding scope at the moment they were defined. That is exactly why we love them inside callbacks — no more const self = this; dance.
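A minimal sketch of both behaviours (the `counter` object and its method names are invented for illustration):

```js
'use strict';

const counter = {
  count: 0,
  inc() {
    // Arrow inherits `this` from inc's call (the counter object),
    // so this.count works inside the nested callback.
    const bump = () => { this.count += 1; };
    bump();
    return this.count;
  },
  incBroken() {
    // Plain nested function: invoked as a plain call, so `this` is
    // undefined in strict mode (globalThis otherwise) — never `counter`.
    const bump = function () { return this; };
    return bump();
  }
};
```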
```js
const user = {
  name: 'Sarah',
  greet() { console.log('Hi ' + this.name); }
};

user.greet();        // "Hi Sarah" — method call
const g = user.greet;
g();                 // "Hi undefined" — plain call, this is global
g.call(user);        // "Hi Sarah" — explicit bind
```
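The three explicit-binding helpers side by side, in a runnable sketch (the `greet`/`person` names are illustrative):

```js
function greet(greeting, punct) {
  return greeting + ', ' + this.name + punct;
}
const person = { name: 'Sarah' };

const a = greet.call(person, 'Hi', '!');    // invoke now, args individually
const b = greet.apply(person, ['Hi', '!']); // invoke now, args as an array
const hi = greet.bind(person, 'Hi');        // do NOT invoke; this + first arg pre-filled
const c = hi('?');                          // call the bound function later
```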
**What's the difference between call, apply, and bind?**

- `call(thisArg, a, b, c)` — invokes the function immediately, args passed individually.
- `apply(thisArg, [a,b,c])` — same, but args as an array.
- `bind(thisArg, ...)` — does NOT invoke; returns a new function with this permanently bound. Useful for event handlers that you'll attach later.

**Why does `setTimeout(this.fn, 1000)` sometimes lose this?** Passing `this.fn` strips the method off the object — it becomes a plain function reference. When the timer fires, JS calls it as a plain call (rule 2), so this becomes global / undefined. Fix it by binding (`this.fn.bind(this)`) or wrapping in an arrow (`() => this.fn()`).

JavaScript doesn't have classes the way Java does — even the class keyword is sugar over prototypes. Once you understand the prototype chain, every "why does arr.map exist?" question answers itself.
Every object carries a hidden link (`__proto__`) to a "parent" object. When you ask for a property, JS walks up the chain until it finds it — or hits null at the top.
```js
const arr = [1, 2, 3];
// arr ─→ Array.prototype ─→ Object.prototype ─→ null
arr.map(...);           // found on Array.prototype
arr.hasOwnProperty(0);  // found on Object.prototype
```
```js
class Animal {
  constructor(name) { this.name = name; }
  speak() { console.log(this.name + ' makes a noise'); }
}

// is roughly equivalent to:
function Animal(name) { this.name = name; }
Animal.prototype.speak = function() {
  console.log(this.name + ' makes a noise');
};
```
**What's the difference between `__proto__` and `prototype`?** `prototype` lives on functions (specifically constructor functions). It's the object that becomes the prototype of instances created with new. `__proto__` (or `Object.getPrototypeOf(obj)`) lives on every object. It's the actual link the prototype chain walks. So `new Animal().__proto__ === Animal.prototype`.
**How did inheritance work before `class`?**

```js
function Dog(name) {
  Animal.call(this, name);                        // 1. inherit fields
}
Dog.prototype = Object.create(Animal.prototype);  // 2. inherit methods
Dog.prototype.constructor = Dog;                  // 3. fix constructor pointer
```
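Putting the Animal and Dog snippets together into one self-contained, runnable file:

```js
function Animal(name) { this.name = name; }
Animal.prototype.speak = function () { return this.name + ' makes a noise'; };

function Dog(name) { Animal.call(this, name); }   // inherit fields
Dog.prototype = Object.create(Animal.prototype);  // inherit methods
Dog.prototype.constructor = Dog;                  // fix constructor pointer

const rex = new Dog('Rex');
// rex ─→ Dog.prototype ─→ Animal.prototype ─→ Object.prototype ─→ null
rex.speak(); // 'Rex makes a noise' — found via the chain on Animal.prototype
```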
Every async question is the event loop wearing a different hat. If you only memorise one diagram for your interview, memorise this one.
Promise callbacks land in the near tray (microtasks); setTimeout callbacks land in the far tray (macrotasks). The loop always empties the near tray completely before touching the far one. That single rule explains the answer to most async puzzles.
```js
console.log('A');
setTimeout(() => console.log('B'), 0);
Promise.resolve().then(() => console.log('C'));
console.log('D');

// Output: A, D, C, B
// 1. A and D run synchronously (call stack)
// 2. Stack empties → drain microtasks → C
// 3. Then take one macrotask → B
```
**Microtasks:** `.then` / `.catch` / `.finally` callbacks, `queueMicrotask()`, and MutationObserver callbacks. They run before the next macrotask, after the current one finishes.

**Macrotasks:** `setTimeout`, `setInterval`, I/O callbacks, UI events (click, scroll), `setImmediate` in Node.

Promises gave us a cleaner way out of "callback hell". async/await then gave us a cleaner way out of .then chains. Underneath, it's all still the event loop.
A Promise is the restaurant buzzer: you place your order, walk away, and register what should happen when it goes off via `.then` and `.catch`. async/await is the same buzzer, but you're allowed to write your code as if you were standing at the counter — JS handles the walking-away for you behind the scenes.
- **Promise.all:** Waits for ALL to fulfil. If any one rejects → the whole thing rejects immediately. Use when every result is required (e.g. fetch user + their posts + their settings).
- **Promise.allSettled:** Waits for ALL to finish, regardless of success or failure. Returns an array of `{status, value/reason}`. Use when you want a partial result (e.g. analytics dashboard with optional widgets).
- **Promise.race:** Settles with whichever promise settles first — fulfilled OR rejected. Useful for timeouts: race a fetch against a 5-second timer.
- **Promise.any:** Resolves with the first FULFILLED one. Only rejects if ALL reject (with an AggregateError). Use for fallback mirrors — "give me whichever CDN responds first."
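Promise.race is the standard trick for timeouts. A sketch of a hypothetical `withTimeout` helper (not a built-in; the name and error message are invented):

```js
// Settle with `promise`, or reject with a timeout error if it takes
// longer than `ms` milliseconds.
function withTimeout(promise, ms) {
  let timer;
  const deadline = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('timed out')), ms);
  });
  // Whichever settles first wins; always clear the timer afterwards.
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}
```

Typical use: `withTimeout(fetch(url), 5000)` to stop waiting on a slow endpoint.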
Behaviour-wise: nothing. Under the hood, await x is exactly x.then(result => ...rest of the function). The compiler rewrites your linear-looking function into a state machine of .then calls.
Readability-wise: huge. async/await keeps the code linear, lets you use try/catch for errors, and works naturally with loops (for...of + await).
```js
// ❌ does NOT wait — forEach ignores the returned promise
users.forEach(async u => { await save(u); });

// ✅ runs sequentially
for (const u of users) await save(u);

// ✅ runs in parallel
await Promise.all(users.map(save));
```
**What happens if you never `.catch` a rejected promise?** Node emits the `unhandledRejection` event. Since Node 15 the default is to crash the process (good — surfaces bugs early). In browsers it shows up in the DevTools console. Always either `.catch` or wrap your await in try/catch.

JavaScript's loose equality is the source of the language's worst-known memes. Interviewers love it because it forces you to demonstrate that you actually understand type coercion.
Judge === is strict — same type AND same value, or you lose. Judge == is lenient — she'll convert your types behind the scenes to "give you the benefit of the doubt." Most production code uses Judge Strict because Judge Lenient's rulings are unpredictable: 0 == '' is true, null == undefined is true, but null == 0 is false. Use === by default.
```js
[] == ![]          // true  — ![] is false, then both sides coerce to 0
0 == ''            // true  — '' becomes 0
0 == '0'           // true  — '0' becomes 0
'' == '0'          // false — string vs string, no coercion
null == undefined  // true  — by spec
null == 0          // false — null only equals undefined
NaN == NaN         // false — NaN is never equal to anything
```
Only eight values are falsy: false, 0, -0, 0n, "", null, undefined, NaN. Everything else (including [] and {}!) is truthy.
**What's the difference between `??` and `||`?** `||` falls back on any falsy value. `??` falls back ONLY on null or undefined.
```js
const portOr  = config.port || 3000; // 0 → 3000 ❌ (user wanted 0)
const portNul = config.port ?? 3000; // 0 → 0    ✅
```
**How do you reliably check for NaN?** Use `Number.isNaN(x)` (strict, only true for the actual NaN value). Avoid the older global `isNaN()` which coerces — `isNaN('hello')` returns true because it tries to convert the string first.

The bug story behind this question is identical at every company: "I changed one item in userB and somehow userA changed too."
```js
const a = { name: 'Raj', addr: { city: 'Pune' } };

// SHALLOW — only top-level keys are duplicated
const b = { ...a };
b.addr.city = 'Delhi';
console.log(a.addr.city); // 'Delhi' — leaked!

a.addr.city = 'Pune'; // reset for the second demo

// DEEP — modern, built-in, handles cycles, Maps, Sets, Dates
const c = structuredClone(a);
c.addr.city = 'Delhi';
console.log(a.addr.city); // 'Pune' — safe
```
**Why is `JSON.parse(JSON.stringify(obj))` a bad deep clone?** It silently drops `undefined`, functions, and symbols. It turns Date into a string and Map/Set into `{}`. It throws on circular references. Use `structuredClone()` instead (built into modern Node and browsers).

Debounce and throttle both limit how often a function runs. They're cousins but solve different problems. Interviewers love asking you to implement both from scratch.
```js
function debounce(fn, delay) {
  let timer;
  return function(...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), delay);
  };
}

function throttle(fn, limit) {
  let waiting = false;
  return function(...args) {
    if (waiting) return;
    fn.apply(this, args);
    waiting = true;
    setTimeout(() => waiting = false, limit);
  };
}
```
Currying turns a multi-argument function into a chain of single-argument calls — `add(2)(3)(4)` — where each call returns a new function until all arguments are gathered.
```js
function curry(fn) {
  return function curried(...args) {
    if (args.length >= fn.length) return fn.apply(this, args);
    return (...next) => curried(...args, ...next);
  };
}

const add = curry((a, b, c) => a + b + c);
add(1)(2)(3); // 6
add(1, 2)(3); // 6
```
```js
function memoize(fn) {
  const cache = new Map();
  return function(...args) {
    const key = JSON.stringify(args);
    if (!cache.has(key)) cache.set(key, fn.apply(this, args));
    return cache.get(key);
  };
}
```
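A quick check that the cache really short-circuits repeat calls. memoize is repeated so the snippet runs standalone; `square` and `calls` are illustrative names:

```js
function memoize(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (!cache.has(key)) cache.set(key, fn.apply(this, args));
    return cache.get(key);
  };
}

let calls = 0;
const square = memoize((n) => { calls += 1; return n * n; });
square(9); // 81 — computed, calls === 1
square(9); // 81 — cache hit, calls still 1
```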
```js
const arr2 = [...arr1, 4];          // spread
function sum(...nums) { ... }       // rest
```
Same syntax, opposite meaning. Spread expands, rest collects.
```js
const { name, age = 18 } = user;
const [first, ...rest] = arr;
```
Pull values out of objects/arrays in one line. Default values fill in for undefined.
```js
user?.address?.city
api?.fetch?.()
```
Returns undefined instead of throwing when a link in the chain is null/undefined.
```js
value ?? 'fallback'
```
Falls back ONLY on null/undefined, not on 0 or "".
Map keeps insertion order, accepts any key type, has .size. Set stores unique values — handy one-liner: [...new Set(arr)] dedupes an array.
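Both in action (the `votes` map and sample arrays are invented for illustration):

```js
// Map: any key type, insertion order preserved, .size
const votes = new Map();
const alice = { id: 1 };
votes.set(alice, 10);   // object as a key — impossible with plain objects
votes.set('bob', 7);

// Set: the classic one-line dedupe
const unique = [...new Set([1, 2, 2, 3, 3, 3])]; // [1, 2, 3]
```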
```js
function* range(n) {
  for (let i = 0; i < n; i++) yield i;
}
```
Functions that pause at yield and resume on .next(). Foundation of async iterators.
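Reusing the `range` generator above (redefined so this runs standalone), stepping it manually and via spread:

```js
function* range(n) {
  for (let i = 0; i < n; i++) yield i;
}

const it = range(2);
it.next(); // { value: 0, done: false } — paused at the first yield
it.next(); // { value: 1, done: false }
it.next(); // { value: undefined, done: true }

const spread = [...range(3)]; // generators are iterable: [0, 1, 2]
```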
Node.js is JavaScript's V8 engine ripped out of Chrome and stitched onto a C library called libuv. The marriage gave the language two things it never had: file-system access and a way to do async I/O at scale.
Your app.js, your routes, your business logic. This is the part you write — everything below is the runtime carrying you.
Why it exists: well, it's the app. Nothing happens without it.
Google's open-source JavaScript engine — same one in Chrome. It parses your JS, JIT-compiles it to machine code, runs it.
Why it exists: JS is a language, not a runtime. You need an engine to actually execute it. V8 was chosen because it's fast and embeddable.
The traffic cop that decides which callback runs next. Implemented in libuv (C). It cycles through phases — timers, I/O callbacks, idle, poll, check, close — picking up work in order.
Why it exists: with one thread, you need a strict scheduling rule for callbacks, or chaos. The event loop IS that rule.
libuv keeps a pool of background threads (4 by default, configurable via UV_THREADPOOL_SIZE) for operations the OS can't do non-blocking — file I/O, DNS lookups, crypto, zlib.
Why it exists: not every operation has a non-blocking OS API. Without the pool, fs.readFile would block the event loop and freeze the server.
The actual non-blocking I/O facilities of the operating system: epoll on Linux, kqueue on macOS/BSD, IOCP on Windows. libuv hides the differences.
Why it exists: network I/O genuinely is async at the OS level — Node didn't invent it, libuv just exposes it cleanly.
The browser event loop has two queues. The Node event loop has SIX phases. Interviewers ask about the order because most race-condition bugs in Node trace back to "I thought my callback ran before that other callback."
Picture a postman on a fixed round: Timers first, then Pending callbacks, then the Poll neighbourhood where most I/O deliveries arrive, then Check — where setImmediate callbacks live. Then Close for any socket-close callbacks. After each neighbourhood (and even between deliveries within one), he stops, opens his special microtask envelope, and processes everything in there — that's process.nextTick and resolved Promises. Then he carries on. Loop forever.
- **Timers:** Runs setTimeout and setInterval callbacks whose threshold has elapsed. "5ms timeout" doesn't mean exactly 5ms — it means at least 5ms.
- **Pending callbacks:** Internal — TCP error callbacks deferred from the previous loop tick. Rarely your code.
- **Poll:** The big one. Picks up new I/O events (incoming requests, completed reads) and runs their callbacks. If the poll queue is empty, the loop may block here waiting for I/O.
- **Check:** Runs setImmediate callbacks. The ONLY phase that runs them. Use setImmediate when you want "after the current poll cycle".
- **Close:** Runs close-event callbacks like socket.on('close').
- **Microtasks (in between):** process.nextTick queue first, then Promise queue. Drained between every operation, not just between phases. nextTick beats Promises.
This is the trick question every Node interviewer keeps in their pocket. The three look like "do it later" but they're scheduled in different phases.
| API | When it runs | Beats |
|---|---|---|
| process.nextTick(cb) | Before any other I/O / timer — drained right after the current op | Everything below |
| Promise .then(cb) | Microtask queue, after nextTick | Timers & Immediate |
| setTimeout(cb, 0) | Timers phase, ≥1ms later | — |
| setImmediate(cb) | Check phase, after Poll | — |
```js
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
process.nextTick(() => console.log('nextTick'));
Promise.resolve().then(() => console.log('promise'));

// nextTick → promise → (timeout or immediate, order varies in main module)
// Inside an I/O callback, setImmediate ALWAYS beats setTimeout(0).
```
**When would you prefer `setImmediate` over `setTimeout(0)`?** When you mean "run this right after the poll phase" — `setImmediate` is the documented contract for that. `setTimeout(0)` can drift to ≥1ms and depends on the system clock.

**Why is `process.nextTick` dangerous?** Its queue is drained completely before the loop moves on, so recursive nextTick calls can starve I/O entirely. `setImmediate` is usually safer.

Streams are how Node processes data that doesn't fit in memory — like a 5GB log file or a video upload. Instead of loading the whole thing, you process it in chunks as it flows past.
- **Readable:** You can read FROM it. Examples: `fs.createReadStream()`, `http.IncomingMessage` (the request).
- **Writable:** You can write TO it. Examples: `fs.createWriteStream()`, `http.ServerResponse`.
- **Duplex:** Both readable and writable, independent channels. Example: TCP sockets.
- **Transform:** Duplex where output is computed from input. Examples: `zlib.createGzip()`, `crypto.createCipheriv()`.
```js
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

// gzip a 5GB file using ~64KB of RAM
await pipeline(
  fs.createReadStream('huge.log'),
  zlib.createGzip(),
  fs.createWriteStream('huge.log.gz')
);
```
**What is backpressure?** `readable.pipe(writable)` pauses the source when the destination's internal buffer fills up. Use `pipeline()` instead of raw `.pipe()` — it propagates errors and cleans up streams properly.

**How do you create a Buffer safely?** `Buffer.alloc(n)` (zero-filled, safe) or `Buffer.from(data)`. Avoid the deprecated `new Buffer()` — it allocates uninitialized memory and was a known security hole.

Half of Node is built on EventEmitter — streams, HTTP servers, child processes, sockets all extend it. Knowing the API also signals you understand the observer pattern.
```js
// emitter.js
const { EventEmitter } = require('events');
const bus = new EventEmitter();

bus.on('order:placed', (id) => console.log('email ' + id));
bus.on('order:placed', (id) => console.log('metric ' + id));
bus.emit('order:placed', 42);

// 'once' — listener auto-removed after first fire
// 'off' / 'removeListener' — detach
// default max listeners = 10 (warning above)
```
**What causes a listener leak?** Usually forgetting to `off()` a listener inside a handler that re-runs. Bump the limit with `setMaxListeners(20)` only if you genuinely need more — otherwise fix the leak.

One Node process uses one CPU core. But your server has 8 cores. How do you use them all?
| | Cluster | Worker Threads |
|---|---|---|
| Unit | OS process | Thread inside one process |
| Memory | Separate (no shared state) | Shared via SharedArrayBuffer |
| Best for | Scaling HTTP servers across cores | CPU-heavy tasks (image processing, parsing, hashing) |
| Crash blast radius | One process — others survive | One thread — but a bad process.exit() kills all |
| Tooling | Built-in cluster module · PM2 · K8s | Built-in worker_threads |
- **spawn:** Launch any binary, stream its stdout/stderr. Best for long-running or large-output processes — never buffers, won't OOM. `spawn('ffmpeg', ['-i', ...])`
- **exec:** Run a shell command, get the FULL stdout/stderr in a callback once it's done. Buffered — careful with large output (default 1MB cap). `exec('ls -la', cb)`
- **fork:** A special case of spawn that runs another Node script with an IPC channel for message passing. Used by cluster. `fork('./worker.js')`
**Which one is a security risk with user input?** `exec` — it goes through a shell, opening you to command injection. Use `spawn` with an args array, where arguments are passed safely without shell parsing.
Node started life with CommonJS (require / module.exports) because ES modules didn't exist when Node was born. Today both work, and the difference matters in interviews.
| | CommonJS | ES Modules |
|---|---|---|
| Syntax | const x = require('x') | import x from 'x' |
| Loading | Synchronous | Asynchronous |
| Resolution | At runtime, dynamic | Static, at parse time |
| Tree-shaking | No | Yes (bundlers can drop unused exports) |
| top-level await | No | Yes |
| Trigger | Default for .js if no "type" | "type":"module" in package.json or .mjs |
**Can you `require()` an ESM module?** Not with plain `require` in older Node. Use dynamic `import('./esm-mod.js')`, which returns a Promise and works from CJS. Newer Node (≥22) added experimental `require(esm)` for synchronous interop.

**What is the module wrapper?** Node wraps every CJS file in `(function(exports, require, module, __filename, __dirname) { /* your code */ })`. That's why those five identifiers are "magically available" inside every CJS file — they're just function arguments.

Express is a thin layer over Node's http module. Its core idea is the middleware pipeline — a function chain that each request walks through.
Picture airport security: a request walks past a row of stations, and each station either handles it or waves it through (next()). Express middleware is exactly that — a function with signature (req, res, next) chained in order. The first one to call res.send() ends the journey; the rest never run. The first one to call next(err) jumps to the error-handling station.
```js
const app = express();

app.use(express.json());                                                // 1. parse body
app.use((req, res, next) => { req.id = crypto.randomUUID(); next(); }); // 2. tag
app.use(authMiddleware);                                                // 3. auth or 401
app.get('/users/:id', getUser);                                         // 4. handler

// 5. error-handling middleware — 4 args is the magic signature
app.use((err, req, res, next) => {
  console.error(err);
  res.status(500).json({ error: 'oops' });
});
```
**How does Express recognise an error handler?** By the function's `.length`. A 4-arg function `(err, req, res, next)` is treated as an error handler. It only runs when something upstream calls `next(err)` or throws inside an async handler that you wrapped properly.

**Operational errors:** Expected runtime problems — bad user input, network timeout, DB unavailable. Catch them, respond gracefully, log them.

**Programmer errors:** Bugs — undefined property, wrong type, logic mistake. You can't recover. Log, then crash and let the supervisor (PM2 / K8s) restart.
**Should you catch `uncaughtException` and keep running?** No — the process is in an unknown state. Log the error, `process.exit(1)`, and let your orchestrator restart. The official Node guidance is "crash on programmer errors".

**How do you catch errors in async route handlers?**

```js
const wrap = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

app.get('/x', wrap(async (req, res) => { ... }));
```
"My Node service starts at 200MB and slowly creeps to 2GB then dies" — every Node engineer has lived this. The four usual suspects:
- **Forgotten event listeners:** Adding `emitter.on(...)` inside a function that runs on every request — and never calling off. Each request piles a new closure on. Look for the MaxListenersExceeded warning.
- **Closures holding big objects:** A timer, callback, or cache referencing a big object keeps the entire object alive. Audit globals; weak references (WeakMap, WeakRef) can help.
- **Unbounded caches:** An in-memory Map that grows forever. Use lru-cache with a max size or TTL.
- **Module-level globals:** Anything assigned at the top of a file lives forever. Resist temptation to keep "just one global counter".
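A sketch of the bounded-cache fix. A real app would reach for the lru-cache package; this Map-based `remember` helper (and the tiny `MAX_ENTRIES`) is purely illustrative:

```js
const MAX_ENTRIES = 3;         // tiny cap so the eviction is easy to see
const lru = new Map();         // Maps iterate in insertion order

function remember(key, value) {
  if (lru.has(key)) lru.delete(key);       // re-insert to refresh recency
  lru.set(key, value);
  if (lru.size > MAX_ENTRIES) {
    lru.delete(lru.keys().next().value);   // evict the oldest entry
  }
}
```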
Profiling tools:

- `node --inspect app.js` + Chrome DevTools → live heap snapshots, CPU profiles.
- `node --prof` + `node --prof-process` → V8 tick profiler.
- clinic.js `doctor` / `flame` / `bubbleprof` → opinionated diagnostics with auto-recommendations.
- `process.memoryUsage()` → quick log line; watch `heapUsed` trend over time.

Security checklist:

- Validate every input with a schema library (`zod`, `joi`).
- Never feed user input into `exec`, `eval`, or template strings sent to the shell.
- `express.json({ limit: '100kb' })` — defends against memory DoS.
- `helmet()` middleware — sets ~12 secure-by-default headers (CSP, HSTS, X-Frame-Options).
- `cors()` with an allowlist, never `*` on credentialed endpoints.
- Hash passwords with `bcrypt` or `argon2` — never SHA-256.
- `npm audit` in CI; pin versions with a lockfile.
- `npm ci` (not `install`) in CI for reproducible builds.

The remaining short-form questions interviewers fire when time is running out. Memorise the one-liners.
**Difference between `fs.readFile` and `fs.createReadStream`?** `readFile` loads the entire file into memory before the callback fires — fine for small configs, dangerous for large files. `createReadStream` emits chunks as they're read — constant-memory, scales to any file size.

**How do you get `__dirname` in ESM?** It isn't defined there — use `import.meta.url` + `fileURLToPath` instead.

**How does `require` caching work?** The first `require('x')` loads + executes x and caches its `module.exports`. Every subsequent `require('x')` returns the same exports object — without re-running the file. That's why module-level state behaves like a singleton.

**What do process exit codes mean?** 0 = success, 1 = generic failure, >1 = specific error class. Set with `process.exit(code)`. Orchestrators (PM2, Kubernetes) often restart on non-zero.

**What is `package-lock.json` for?** It pins the exact resolved dependency tree so `npm ci` on another machine produces the identical tree. Without it, two builds days apart can diverge if a sub-dep released a patch.

**What is `nodemon` doing under the hood?** It watches files with `chokidar`, kills the running Node process on change, restarts it. Pure dev convenience — never use in production; use PM2 / systemd.

**Is `JSON.parse` blocking?** Yes — it's synchronous CPU work on the main thread. For huge payloads use `stream-json` or offload to a worker thread.

**`res.send` vs `res.json` vs `res.end`?** `res.end` is raw — closes the response, no Content-Type. `res.send` auto-detects type, sets headers, supports strings/buffers/objects. `res.json` stringifies + sets application/json. Use `res.json` for APIs, `res.end` when streaming a custom body.

**What is CORS?** A browser policy that blocks scripts on a.com from calling b.com unless b.com sends explicit Access-Control-Allow-Origin headers. Solve in Express with the `cors` middleware, configured with an explicit origin allowlist.

**Difference between `app.use` and `app.get`?** `app.use` mounts middleware that runs for ALL HTTP methods at a path prefix. `app.get` mounts a handler for GET requests at an exact path. Order matters — middleware must be declared before the routes that depend on it.

Did this JS & Node.js guide click? If it helped, tap the ❤️ — that's how I know it landed.