
Java Interview Questions

OOP · Collections · Concurrency · JVM · Streams — explained the way you'd teach a friend over chai

01

The Four Pillars of OOP — Through Sarah's Coffee Shop

Almost every Java interview opens here. The trick is not to recite definitions — interviewers have heard "encapsulation is hiding data" five hundred times. They want to know if you can recognize these pillars in code.

Sarah runs a coffee shop. She has a CoffeeMachine behind the counter. Customers don't open it up to grind beans manually — they press a button. That hidden complexity is encapsulation. The shop also serves tea, smoothies, and coffee — all called "drinks" — that's polymorphism. Let's walk through each pillar with this shop in mind.
Explain the four pillars of OOP with real examples.

1. Encapsulation — hide the wires, expose the buttons

Encapsulation means bundling data (fields) and behavior (methods) inside a class, and exposing only what the outside world needs. Private fields, public methods. Think of Sarah's coffee machine — customers push brew(), they don't poke at internalGrinderRPM.

Encapsulation
public class CoffeeMachine {
    private int waterMl;          // hidden state
    private int beansGrams;
    private int grinderRpm = 1200;

    public Coffee brew(String type) {
        if (waterMl < 200) throw new IllegalStateException("Refill water");
        // internals hidden — customer just gets a Coffee back
        return new Coffee(type);
    }

    public void refillWater(int ml) { this.waterMl += ml; }
}
An ATM is the cleanest example of encapsulation. You insert a card and press buttons. You don't reach inside to count cash, log the transaction, or talk to the bank's database. The ATM exposes 4 buttons; behind those buttons sit thousands of lines of code.

2. Inheritance — the family resemblance

Inheritance lets a child class reuse fields and methods from a parent. Sarah's shop sells different drinks, but every drink has a price, a name, and a way to "serve." Instead of repeating those in Coffee, Tea, and Smoothie, we put them in a parent Drink.

Inheritance
abstract class Drink {
    protected String name;
    protected double price;

    protected Drink(String name, double price) {   // children set the shared state here
        this.name = name;
        this.price = price;
    }

    public abstract void prepare();   // each child decides how

    public void serve() {             // shared behavior
        System.out.println("Serving " + name + " for ₹" + price);
    }
}

class Coffee extends Drink {
    Coffee() { super("Coffee", 150.0); }
    public void prepare() { System.out.println("Brewing espresso..."); }
}

class Tea extends Drink {
    Tea() { super("Tea", 120.0); }
    public void prepare() { System.out.println("Steeping leaves..."); }
}
Inheritance is the most overused tool in OOP. Prefer composition unless the relationship is truly an "is-a." A Stack is not an ArrayList — that's why java.util.Stack extending Vector is widely considered a design mistake.

3. Polymorphism — one call, many forms

Polymorphism means a single reference can point to different types and call the right method automatically. Sarah's barista holds a list of Drink — they call prepare() on each, and each drink does its own thing. The barista doesn't need a giant if/else chain.

Runtime polymorphism
List<Drink> orders = List.of(new Coffee(), new Tea(), new Coffee());

for (Drink d : orders) {
    d.prepare();   // resolves to Coffee.prepare() or Tea.prepare() at RUNTIME
    d.serve();
}

There are two flavors:

  • Compile-time (overloading) — same method name, different parameter list. The compiler picks based on arguments. Example: System.out.println(int) vs println(String).
  • Runtime (overriding) — child class redefines a parent method. The JVM picks based on the actual object type. This is what gives us "one call, many forms."
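The two flavors fit in one small sketch. All names below (Parent, Child, pick, describe) are invented for illustration:

```java
// Invented names, for illustration only.
class Parent {
    static String pick(int x)    { return "int"; }     // overload #1
    static String pick(String s) { return "string"; }  // overload #2 (compiler chooses)

    String describe() { return "parent"; }
}

class Child extends Parent {
    @Override
    String describe() { return "child"; }              // override (JVM chooses at runtime)
}

public class PolymorphismDemo {
    public static String runtimeDispatch() {
        Parent p = new Child();   // static type Parent, actual object Child
        return p.describe();      // dispatches on the actual object
    }

    public static void main(String[] args) {
        System.out.println(Parent.pick(42));     // compile-time choice: "int"
        System.out.println(runtimeDispatch());   // runtime choice: "child"
    }
}
```

Note that pick is resolved from the argument types at compile time, while describe is resolved from the object at runtime.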

4. Abstraction — show what, hide how

Abstraction is the cousin of encapsulation. Encapsulation hides data; abstraction hides implementation details. You write to an interface ("what should happen"), not a concrete class ("how it happens").

Abstraction via interface
interface PaymentGateway {
    PaymentResult pay(double amount);
}

class Razorpay implements PaymentGateway { /* HTTP calls to RP */ }
class Stripe   implements PaymentGateway { /* HTTP calls to Stripe */ }

// Caller doesn't care which gateway:
PaymentGateway gw = pickCheapestGateway();
gw.pay(499.0);
The four pillars are not separate ideas — they reinforce each other. Encapsulation gives you safe state, inheritance gives you reuse, polymorphism gives you flexibility, and abstraction lets you swap implementations without breaking callers. Together they let Sarah add "Mango Smoothie" tomorrow without touching the barista's code.
When asked to "explain OOP," skip the textbook definitions. Walk through one consistent example (like the coffee shop) and show all four pillars in action. Interviewers remember the story, not the buzzwords.
02

String, the String Pool, and Why "hi" == "hi" is True

Raj is debugging a login bug. He compares two usernames with == and it works in tests but fails in production. He's just stumbled into the most-asked Java interview topic: how Strings live in memory.
Why is String immutable in Java? Where do String literals live?

The String Pool — a shared bookshelf

Java keeps a special area of memory called the String Pool (or "string intern table") inside the heap. When you write a literal like "hello", the JVM checks: is this exact text already on the shelf? If yes, hand back the existing reference. If no, place it on the shelf and hand back the new reference.

The pool in action
String a = "hello";             // goes into the pool
String b = "hello";             // reuses the pool reference
String c = new String("hello");  // FORCES a new object on the heap

System.out.println(a == b);          // true  — same pool reference
System.out.println(a == c);          // false — c is a fresh object
System.out.println(a.equals(c));     // true  — same characters
System.out.println(a == c.intern());   // true  — intern() returns the pooled "hello"
The String Pool is like a school library's reference section. The librarian (JVM) keeps one copy of every popular book. Ten students saying "give me the dictionary" all get the same physical book. But if a student insists on buying their own copy off Amazon (new String(...)), they get a different physical book — even though the words inside are identical.

Why is String immutable?

Once a String is created, you can't change its characters. s.toUpperCase() returns a new String — the original is untouched. Why did Java's designers pick this?

  • Pool safety. If two variables share "hello" from the pool and one could mutate it, the other would see the change. Chaos.
  • Thread safety. Immutable objects are inherently safe to share across threads — no locks needed.
  • HashMap key safety. A String's hashCode() is computed once and cached. If the contents could change, the map would lose the entry.
  • Security. File paths, class names, URLs — all passed as Strings. If they could be mutated after a security check, an attacker could pass "safe.txt", get past the check, then change it to "/etc/passwd".

String vs StringBuilder vs StringBuffer

Class         | Mutable? | Thread-safe?       | Use when
String        | No       | Yes (immutable)    | Most cases — short text, keys, return values
StringBuilder | Yes      | No                 | Building text in a single thread (loops, parsers)
StringBuffer  | Yes      | Yes (synchronized) | Legacy — almost never the right choice today
Concatenating in a loop with + creates a new String every iteration — O(n²) garbage. Use StringBuilder for loops. The compiler optimizes a single a + b + c expression (via StringBuilder before Java 9, invokedynamic since), but it can't do that across loop iterations.
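The contrast is easy to sketch; the method names slowJoin and fastJoin are invented:

```java
public class ConcatDemo {
    // O(n^2): each += allocates a new String and copies everything built so far
    static String slowJoin(int n) {
        String s = "";
        for (int i = 0; i < n; i++) s += i;
        return s;
    }

    // O(n): StringBuilder appends into one growable buffer
    static String fastJoin(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append(i);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(slowJoin(5));   // 01234
        System.out.println(fastJoin(5));   // 01234 (same result, far less garbage)
    }
}
```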
Strings are pooled, immutable, and that's why == sometimes "works" by accident. Always use .equals() for content comparison; == only tells you "same object reference."
03

The equals/hashCode Contract — A Pact, Not a Suggestion

Priya stores Employee objects in a HashSet. She adds an employee, then immediately checks contains() for the same person β€” and gets false. She forgot to override hashCode(). The set is using the default identity hash, so the "same" employee maps to two different buckets.
What's the contract between equals() and hashCode()? What happens if you violate it?

The contract in plain English

  • If a.equals(b) is true, then a.hashCode() == b.hashCode() MUST be true.
  • If hash codes are equal, equals may or may not be true (collisions are allowed).
  • Both methods must be deterministic — same input, same output, every call.
  • equals must be reflexive (a.equals(a)), symmetric (a.equals(b) == b.equals(a)), and transitive (a=b, b=c → a=c).
Think of hashCode() as your house's pin code and equals() as your house number. The post office (HashMap) uses the pin code to deliver to the right neighborhood, then the house number to find the exact door. If two houses claim to be "the same address" (equals) but live in different pin codes (hashCode), the post office will look in the wrong neighborhood and never find the second one.

The right way to implement them

Correct equals/hashCode
public class Employee {
    private final String id;
    private final String email;

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;            // shortcut
        if (!(o instanceof Employee e)) return false;
        return Objects.equals(id, e.id) &&
               Objects.equals(email, e.email);
    }

    @Override
    public int hashCode() {
        return Objects.hash(id, email);     // must use SAME fields as equals
    }
}

What breaks if you violate the contract?

  • HashSet/HashMap fails to find your object. You add it, you can't find it. Memory leak: the set grows forever with "duplicates" that aren't really duplicates.
  • Two equal objects in different buckets. Iterating the map shows both — looks like a bug from the outside.
  • Caches break silently. Spring's @Cacheable, Guava caches, anything keyed by your object — all return stale or missing data.
If you override equals using a mutable field and then change that field while the object is in a HashMap, the object becomes unreachable — the map looks in the wrong bucket. Lesson: prefer immutable fields (or at least immutable-while-in-the-map fields) for equals/hashCode.
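That trap reproduces in a few lines. A minimal sketch, with a hypothetical Key class whose equals/hashCode depend on a mutable field:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

public class MutableKeyDemo {
    // Hypothetical key class: equals/hashCode built on a MUTABLE field (that's the bug)
    static class Key {
        String name;
        Key(String name) { this.name = name; }
        @Override public boolean equals(Object o) {
            return o instanceof Key k && Objects.equals(name, k.name);
        }
        @Override public int hashCode() { return Objects.hash(name); }
    }

    static boolean entryIsLost() {
        Map<Key, Integer> map = new HashMap<>();
        Key k = new Key("a");
        map.put(k, 1);
        k.name = "b";   // hashCode changes while the key sits in the map
        // Neither the mutated key nor a fresh "a" key can find the entry now:
        return !map.containsKey(k) && !map.containsKey(new Key("a"));
    }

    public static void main(String[] args) {
        System.out.println(entryIsLost());   // true: the entry is unreachable
    }
}
```

The map hashed the key into the "a" bucket at put time; lookups for "b" search the wrong bucket, and lookups for "a" find a node whose key no longer equals "a".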
Java records (preview in 14, standard since Java 16) auto-generate equals, hashCode, and toString from the components. If your class is a value carrier, use a record — one line eliminates an entire category of bugs.
04

== vs .equals() — and the Integer Cache Trap

This question is so common that interviewers expect a thorough answer with the famous "Integer cache" twist. If you can explain that, you're showing you know the JVM, not just the syntax.

What's the difference between == and equals()? Why does Integer.valueOf(127) == Integer.valueOf(127) return true, while Integer.valueOf(128) == Integer.valueOf(128) returns false?

The simple rule

  • == compares references for objects (same object in memory?), and values for primitives.
  • .equals() compares logical equality based on the class's contract.

The Integer cache

To save memory, the JVM pre-creates Integer objects for the range -128 to 127 and reuses them whenever you call Integer.valueOf(x) (which autoboxing also calls). Outside that range, every call creates a fresh object.

The cache in action
Integer a = 127;
Integer b = 127;
System.out.println(a == b);         // true  — both pulled from cache

Integer c = 128;
Integer d = 128;
System.out.println(c == d);         // false — two new objects
System.out.println(c.equals(d));    // true  — same value
Imagine a cafe that pre-prints menu cards 1–127 (always available, share them around). For numbers 128 and up, they print fresh cards on demand. Two customers asking for "menu 50" get the same shared card. Two customers asking for "menu 200" each get their own freshly-printed card — same content, different cards.
This is why interviews love it: developers who only know == from C/JavaScript get burned. Always use .equals() for object equality — even for Integer, Long, String, Date, and any wrapper type.
== answers "are these the same object?" — almost never the question you actually want. .equals() answers "are these logically the same?" — that's the question 99% of the time.
05

The Collections Framework — A Tour Through the Toolbox

A new dev, Aman, opens IntelliJ and types List. Autocomplete shows ArrayList, LinkedList, CopyOnWriteArrayList, Stack, Vector, and ten more. He freezes. Which one? When? Why are there so many?

The Collections Framework is huge but organized. Three main interfaces sit at the top: List, Set, and Map. Everything else is a specialization.

The mental map

Interface     | What it represents         | Common implementations
List          | Ordered, allows duplicates | ArrayList, LinkedList, CopyOnWriteArrayList
Set           | No duplicates              | HashSet, LinkedHashSet, TreeSet
Queue / Deque | FIFO / double-ended        | ArrayDeque, LinkedList, PriorityQueue
Map           | Key → value pairs          | HashMap, LinkedHashMap, TreeMap, ConcurrentHashMap
Think of a kitchen drawer. List is a row of drawers in order — you can have two spoons next to each other. Set is a knife block — each slot holds exactly one unique item. Map is a labeled spice rack — every label maps to one specific jar.

How to choose, in 30 seconds

  • Need order + duplicates + index access? → ArrayList. Default choice.
  • Need uniqueness, don't care about order? → HashSet.
  • Need uniqueness in insertion order? → LinkedHashSet.
  • Need uniqueness in sorted order? → TreeSet.
  • Need key → value lookup? → HashMap (single-thread) or ConcurrentHashMap (multi-thread).
  • Need sorted keys? → TreeMap.
  • Need a stack/queue? → ArrayDeque (faster than Stack/LinkedList).

Big-O cheat sheet

Operation      | ArrayList                    | LinkedList                      | HashMap  | TreeMap
get / contains | O(1) by index, O(n) by value | O(n)                            | O(1) avg | O(log n)
add at end     | O(1) amortized               | O(1)                            | O(1)     | O(log n)
add at middle  | O(n)                         | O(1) with a node ref, else O(n) | —        | —
remove         | O(n)                         | O(1) at ends, O(n) in middle    | O(1)     | O(log n)
Vector and Stack are legacy (synchronized) — avoid in new code. Use ArrayList, and wrap with Collections.synchronizedList() only if you actually need thread safety. Better: use a concurrent collection.
06

HashMap Internals — The Most-Asked Question in Java

Anvi opens an interview and the panel says, "Walk me through how HashMap works internally." She knows there are buckets and hash codes, but the details — load factor, chaining, treeification, resizing — that's where the real points hide.
Explain HashMap's internals. What's the load factor? When does it resize? When does it convert to a tree?

The structure — an array of buckets

Internally, a HashMap is a Node[] (called the "table") where each slot is called a bucket. Default initial size: 16. Each bucket either holds null, a single Node, a linked list of Nodes (when there are collisions), or a red-black tree (when collisions get bad).

The Node — what HashMap really stores
static class Node<K, V> {
    final int hash;
    final K key;
    V value;
    Node<K, V> next;   // linked list pointer for collisions
}

Put, step by step

  1. Compute key.hashCode().
  2. Apply a "spreading" function: hash = h ^ (h >>> 16). This mixes the high bits into the low bits, so even bad hashCodes spread well.
  3. Compute bucket index: index = (n - 1) & hash where n is the table size (always a power of 2, so this is equivalent to hash % n but faster).
  4. If the bucket is empty → place the new Node.
  5. If occupied → walk the chain. If a Node's key .equals() the new key → replace the value. Otherwise → append.
  6. If the chain length reaches 8 AND the table size is ≥ 64 → convert that bucket to a red-black tree (treeification). Lookups in that bucket go from O(n) to O(log n).
  7. If size > capacity * loadFactor → resize. Default load factor is 0.75, so a 16-bucket table resizes when it hits 12 entries. The new table is double the size, and every entry is rehashed into it.
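Steps 2 and 3 fit in two lines. A sketch (class and method names invented) of the spreading and indexing math described above:

```java
public class BucketIndexDemo {
    // The spreading step: mix the high bits into the low bits
    static int spread(int h) { return h ^ (h >>> 16); }

    // Bucket index for a power-of-two table of size n
    static int index(Object key, int n) { return (n - 1) & spread(key.hashCode()); }

    public static void main(String[] args) {
        int n = 16;   // default table size
        System.out.println(index("hello", n));      // always lands in [0, 15]
        // For a power of two, (n - 1) & h equals h % n for non-negative h:
        System.out.println((15 & 37) == 37 % 16);   // true
    }
}
```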
Picture a parking lot with 16 numbered rows. Each car (key) has a hash that picks a row. If a row already has a car, the new one parks behind it (linked list). When a row gets too crowded (>8 cars), the lot manager rebuilds that row as an organized lot with sub-spots (red-black tree). When the whole lot is 75% full, the city builds a bigger lot (32 rows) and re-parks every car. That's resizing.

Why load factor 0.75?

It's a balance. Lower load factor (e.g., 0.5) → fewer collisions but wasted memory. Higher (e.g., 0.9) → less memory but more collisions, slower lookups. 0.75 is the sweet spot picked from empirical testing.

The treeification fix (Java 8)

Pre-Java 8, a bucket with bad hashCodes (or worse, a malicious attacker) could degrade to O(n) — a denial-of-service vector. Java 8 added the tree conversion at threshold 8 → guarantees O(log n) worst case for any single bucket.

HashMap is NOT thread-safe

Concurrent puts during resize can cause infinite loops (pre-Java 8, due to entry rotation) or lost data. Use ConcurrentHashMap for multi-threaded access — before Java 8 it locked coarse segments; since Java 8 it synchronizes per bucket (with CAS for empty buckets), so reads are lock-free and writes only block on the same bucket.

If your key's hashCode is broken (e.g., always returns 0), every entry lands in the same bucket. With 1M entries, that's a 1M-entry linked list or red-black tree. get() degrades from O(1) to O(log n) at best (tree, Comparable keys) and O(n) at worst. Always test hashCode() for distribution on real data.
HashMap is an array of buckets, each bucket is a linked list (or tree past 8 items), with a load factor of 0.75 triggering a resize that doubles capacity and rehashes everything. Memorize that one sentence and you can explain it for 5 minutes confidently.
07

ArrayList vs LinkedList — and Why You Probably Want ArrayList

Textbooks teach: "Use LinkedList for frequent inserts in the middle." Reality: even then, ArrayList usually wins. Let's see why.

Internally

  • ArrayList — backed by a contiguous Object[]. When full, it grows by ~50% and copies into a new array.
  • LinkedList — a doubly-linked list of Nodes. Each Node holds a value plus two pointers (prev, next).

The real-world performance story

Modern CPUs love contiguous memory. ArrayList is a flat array → CPU cache prefetches the next elements for free. LinkedList is scattered Nodes across the heap → every .next is a potential cache miss, often 100x slower than a cache hit.

ArrayList is a stack of pancakes on a single plate — you can grab any one fast, and reaching the next is instant. LinkedList is pancakes scattered across 12 different tables, with handwritten notes pointing to where the next pancake is. Even reading sequentially is slow because you keep walking around.

When does LinkedList actually win?

Almost never in practice. The textbook answer says "frequent insertions in the middle" — but to insert in the middle, you first need to find the position, which is O(n) for LinkedList anyway (walking the chain). The only true win is when you already hold a Node reference and want O(1) insert/remove there — and that's a rare API need.
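The rare legitimate pattern looks like this sketch (method name invented): the ListIterator plays the role of the node reference, so the add() itself is O(1) on a LinkedList:

```java
import java.util.LinkedList;
import java.util.List;
import java.util.ListIterator;

public class MiddleInsertDemo {
    // Once the iterator is positioned, add() is O(1) on a LinkedList
    static List<Integer> insertAfterFirstEven(List<Integer> list, int value) {
        ListIterator<Integer> it = list.listIterator();
        while (it.hasNext()) {
            if (it.next() % 2 == 0) {
                it.add(value);   // inserts right after the element just returned
                break;
            }
        }
        return list;
    }

    public static void main(String[] args) {
        List<Integer> l = new LinkedList<>(List.of(1, 2, 3));
        System.out.println(insertAfterFirstEven(l, 99));   // [1, 2, 99, 3]
    }
}
```

Note the search itself is still O(n); only the insertion at the already-found position is cheap.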

Real winning use case: Deque operations from both ends. But even there, ArrayDeque is usually faster.

Default to ArrayList. Reach for LinkedList only with measurements proving it's faster for your workload — that essentially never happens.
08

Exceptions — Checked, Unchecked, and Why People Argue About Them

Maya writes a method that reads a file and forgets to declare throws IOException. The compiler refuses to compile. She gets annoyed: "Why can't Java just trust me?" She's just met checked exceptions.

The hierarchy

Everything you can throw inherits from Throwable. Below it are two branches:

  • Error — JVM problems you can't recover from (OutOfMemoryError, StackOverflowError). Don't catch these.
  • Exception — application problems. Two sub-categories:
    • Checked (everything that extends Exception but NOT RuntimeException) — compiler forces you to catch or declare. Examples: IOException, SQLException.
    • Unchecked (extends RuntimeException) — compiler doesn't force anything. Examples: NullPointerException, IllegalArgumentException.
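The split shows up right in the method signature. A minimal sketch (method names invented):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CheckedDemo {
    // Checked: won't compile without "throws IOException" (or a try/catch inside)
    static List<String> readLines(Path p) throws IOException {
        return Files.readAllLines(p);
    }

    // Unchecked: nothing to declare; the compiler stays silent
    static int divide(int a, int b) {
        return a / b;   // may throw ArithmeticException at runtime
    }

    public static void main(String[] args) {
        try {
            divide(1, 0);
        } catch (ArithmeticException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```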
Checked exceptions are like a contract clause — the method's signature has to declare it, like a shipping label declaring "fragile, may break." Unchecked exceptions are unexpected accidents — a tire bursts mid-trip, no label warned you.

try-with-resources (Java 7+)

Anything implementing AutoCloseable can go in a try-with-resources block — Java auto-closes it, in reverse order, even if an exception is thrown.

Modern resource handling
try (BufferedReader r = Files.newBufferedReader(path);
     PreparedStatement ps = conn.prepareStatement(sql)) {

    // use r and ps

}  // ps.close() then r.close() — even if exception thrown

Common mistakes

  • Catching Exception or Throwable at the top. Hides bugs. Catch the narrowest type that's actually meaningful.
  • Empty catch blocks. "Just don't crash" — and now your team spends 3 hours debugging silent corruption. At minimum log it.
  • Wrapping and re-throwing without the cause. throw new RuntimeException("failed") loses the original failure's stack trace. Always pass the cause: throw new RuntimeException("failed", e).
  • Returning from finally. Swallows exceptions silently. Don't.
Checked exceptions don't compose with lambdas — Stream's map can't accept a function that throws IOException. This forced workaround patterns. Many modern Java libraries (Spring, Guava) lean unchecked for this reason.
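One common workaround is a small adapter that wraps the checked exception in an unchecked one. A sketch, with a hypothetical ThrowingFn interface (java.io.UncheckedIOException exists for exactly this purpose):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Stream;

public class LambdaWrapDemo {
    // Hypothetical functional interface that, unlike Function, is allowed to throw
    interface ThrowingFn<T, R> { R apply(T t) throws IOException; }

    // Adapter: converts the checked IOException into an unchecked wrapper
    static <T, R> Function<T, R> unchecked(ThrowingFn<T, R> fn) {
        return t -> {
            try { return fn.apply(t); }
            catch (IOException e) { throw new UncheckedIOException(e); }
        };
    }

    static List<String> demo() {
        return Stream.of("a", "b")
                .map(unchecked((String s) -> s.toUpperCase()))   // now accepted by map
                .toList();
    }

    public static void main(String[] args) {
        System.out.println(demo());   // [A, B]
    }
}
```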
09

Generics & Type Erasure — What Happens to <T> at Runtime?

Karthik writes List<String> and List<Integer> and runs list1.getClass() == list2.getClass(). It returns true. He's just discovered that at runtime, both are just List — the type parameter has been erased.

What is type erasure?

Java generics are a compile-time feature only. The compiler uses <T> to type-check your code and insert casts, but the bytecode that ships to the JVM has no record of T. Wherever you wrote T, the bytecode says Object (or the upper bound, like Number for <T extends Number>).

What the compiler does
// What you write:
List<String> names = new ArrayList<>();
names.add("Sarah");
String first = names.get(0);

// What the JVM actually runs (after erasure):
List names = new ArrayList();
names.add("Sarah");
String first = (String) names.get(0);   // compiler-inserted cast
Generics are like sticky notes the compiler puts on your code. "This list only holds Strings!" the note says. The compiler reads the notes, makes sure you obey them, and then peels them off before the bytecode is shipped. The JVM never sees the notes.

The consequences

  • You can't do new T() — at runtime there is no T to instantiate.
  • You can't do obj instanceof List<String> — only instanceof List. The runtime can't see the type parameter.
  • You can't have arrays of generic types — new T[10] won't compile. (Arrays know their element type at runtime; generics don't.)
  • Bridge methods — when a generic class is overridden, the compiler may add invisible methods to keep the JVM's method dispatch happy.

Wildcards — ? extends vs ? super (PECS)

Mnemonic: PECS — Producer Extends, Consumer Super.

  • List<? extends Number> — you can read Numbers out (it's a producer), but you can't add anything (the compiler can't know if it's a List of Integer or Double).
  • List<? super Integer> — you can add Integers (it's a consumer), but reading gives you Object (the compiler only knows it's some supertype of Integer).
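PECS in one method: the classic copy signature reads from an extends-bounded producer and writes into a super-bounded consumer. A sketch with invented names:

```java
import java.util.ArrayList;
import java.util.List;

public class PecsDemo {
    // Read from the producer (? extends), write into the consumer (? super)
    static <T> void copy(List<? extends T> src, List<? super T> dst) {
        for (T t : src) dst.add(t);
    }

    static String demoResult() {
        List<Integer> ints = List.of(1, 2, 3);    // producer of Integers
        List<Number> nums = new ArrayList<>();    // consumer that accepts Integers
        copy(ints, nums);                         // T is inferred as Integer
        return nums.toString();
    }

    public static void main(String[] args) {
        System.out.println(demoResult());   // [1, 2, 3]
    }
}
```

With plain List<T> parameters, copying a List<Integer> into a List<Number> would not compile; the wildcards are what make the source and destination types free to differ.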
Generics give you compile-time safety with zero runtime cost. The "cost" of erasure is some lost reflection power — small price for catching ClassCastException at compile time.
10

Immutability and the Three Faces of final

What does final mean?

  • final variable — value can be assigned once. (For objects, it means the reference can't change — the object's internals can still mutate.)
  • final method — cannot be overridden by subclasses.
  • final class — cannot be extended. String, Integer, LocalDate are all final.
final List<String> names = new ArrayList<>() does NOT make the list immutable. You can still names.add("..."). The reference is final; the list contents are not. For a truly read-only list, use List.copyOf(names) or Collections.unmodifiableList(names).
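A quick sketch (names invented) showing both halves of that warning:

```java
import java.util.ArrayList;
import java.util.List;

public class FinalVsImmutable {
    static boolean copyIsReadOnly() {
        final List<String> names = new ArrayList<>();
        names.add("Sarah");                      // fine: final only freezes the REFERENCE
        List<String> frozen = List.copyOf(names);
        try {
            frozen.add("Raj");                   // the copy really is read-only
            return false;
        } catch (UnsupportedOperationException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(copyIsReadOnly());   // true
    }
}
```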

How to build a truly immutable class

  1. Mark the class final (so no one can subclass and add mutability).
  2. Mark all fields private final.
  3. No setters. Initialize everything in the constructor.
  4. If a field is itself a mutable object (e.g., a Date or List), defensive copy on the way in (in the constructor) and on the way out (in the getter).
Immutable class β€” done right
public final class Order {
    private final String id;
    private final List<String> items;

    public Order(String id, List<String> items) {
        this.id = id;
        this.items = List.copyOf(items);   // defensive copy + immutable
    }

    public List<String> getItems() {
        return items;   // already unmodifiable, safe to return
    }
}

Why immutability matters

  • Thread safety for free. No locks needed; the object can never be in an inconsistent state.
  • Safe to use as a HashMap key. hashCode never changes mid-lookup.
  • Easier to reason about. No "who mutated this?" debugging sessions.
  • Cacheable. Compute once, reuse forever — String caches its hashCode.
Records (standard since Java 16, preview in 14) give you immutability with one line: record Order(String id, List<String> items) {}. They auto-generate constructor, accessors, equals, hashCode, toString. Defensive copy still requires a compact constructor, though.
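A sketch of that combination (names invented): a record plus a compact constructor doing the defensive copy:

```java
import java.util.ArrayList;
import java.util.List;

public class RecordDemo {
    // Compact constructor does the defensive copy of the mutable component
    record Order(String id, List<String> items) {
        Order {
            items = List.copyOf(items);   // the caller's later mutations can't leak in
        }
    }

    public static void main(String[] args) {
        List<String> src = new ArrayList<>(List.of("espresso"));
        Order o = new Order("o1", src);
        src.add("tea");                        // mutate the caller's list afterwards
        System.out.println(o.items().size());  // 1: the record kept its own copy
    }
}
```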
11

Threads — The Basics, Told Through a Restaurant Kitchen

Sarah's coffee shop expands. One barista can't keep up. She hires three more. Now four baristas (threads) work in the same kitchen (process), sharing the same espresso machine (memory). Most of the time it's fine β€” until two reach for the same coffee bean jar at the exact same instant.

Thread vs Process

  • Process — an independent program with its own memory space. Two processes can't see each other's variables.
  • Thread — a unit of work inside a process. All threads in the same process share the heap (objects, static fields), but each has its own stack (local variables).

Three ways to start a thread

All three styles
// 1. Extend Thread (rarely the right choice)
class Worker extends Thread {
    public void run() { System.out.println("running"); }
}
new Worker().start();

// 2. Implement Runnable (preferred — you can still extend something else)
Runnable task = () -> System.out.println("running");
new Thread(task).start();

// 3. Submit to an Executor (the modern way — see section 13)
ExecutorService pool = Executors.newFixedThreadPool(4);
pool.submit(task);
Never call thread.run() directly. That just runs the code on the current thread synchronously. start() is what tells the JVM to actually create a new thread.

Thread lifecycle

  • NEW — created but not started.
  • RUNNABLE — eligible to run (the OS scheduler picks when).
  • BLOCKED — waiting for a monitor lock (e.g., entering a synchronized block held by another thread).
  • WAITING / TIMED_WAITING — waiting for another thread (Object.wait(), Thread.join(), Thread.sleep()).
  • TERMINATED — finished or threw an uncaught exception.
Threads are like cooks in a kitchen. NEW = standing outside the door. RUNNABLE = in the kitchen, doing work or waiting for a turn at the stove. BLOCKED = the freezer is locked and someone else has the key. WAITING = sitting on a chair until a teammate calls them. TERMINATED = clocked out for the day.
12

synchronized and volatile — The Two Keywords Every Java Dev Must Know

What's the difference between synchronized and volatile? When would you use each?

The problem they solve

Modern CPUs have multiple cores, each with its own cache. When thread A on Core 1 writes to a variable, that write may sit in Core 1's cache for a while before reaching main memory. Thread B on Core 2 reading the same variable might see a stale value. Worse, the compiler and CPU can reorder instructions for performance, breaking your assumptions about what runs first. synchronized and volatile are how Java tells the JVM "stop being clever here."

synchronized — mutual exclusion + memory visibility

Wraps a block in a monitor lock. Only one thread can hold the lock at a time; others block. Critically, entering and exiting a synchronized block also flushes the thread's CPU caches to/from main memory.

synchronized — the two ways
class Counter {
    private int count = 0;

    // Method-level — locks on `this`
    public synchronized void increment() { count++; }

    // Block-level — lock on a specific object (more flexible)
    private final Object lock = new Object();
    public void incrementSafe() {
        synchronized (lock) {
            count++;
        }
    }
}

volatile — visibility, NOT mutual exclusion

Marks a variable so every read goes to main memory and every write is flushed immediately. No locking. Threads always see the latest value, but multiple threads can still race on it.

volatile — the canonical use case
class Worker implements Runnable {
    private volatile boolean running = true;

    public void run() {
        while (running) { /* work */ }
    }

    public void stop() { running = false; }   // other thread sees this immediately
}
Imagine a whiteboard in a kitchen. volatile is like saying "always read the whiteboard, never trust your memory." synchronized is "lock the whiteboard room — only one cook in at a time, and when they leave, everyone else's notes are updated."

When to use which

Need                                         | Use
Read-only flag updated from another thread   | volatile
Read-modify-write (count++, list.add)        | synchronized or AtomicXxx
Compound action across multiple fields       | synchronized
Single counter / single reference, lock-free | AtomicInteger / AtomicReference
volatile on count++ does NOT make it thread-safe. count++ is read-modify-write — three operations. Two threads can both read 5, both write 6, and you've lost an increment. Use AtomicInteger.incrementAndGet().
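A sketch (names invented) of the safe version: four threads hammer one AtomicInteger and no increment is lost:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class AtomicDemo {
    // N threads, each incrementing the shared counter perThread times
    static int atomicCount(int threads, int perThread) {
        AtomicInteger count = new AtomicInteger();
        Thread[] ts = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < perThread; j++) {
                    count.incrementAndGet();   // atomic read-modify-write via CAS
                }
            });
            ts[i].start();
        }
        for (Thread t : ts) {
            try { t.join(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
        return count.get();
    }

    public static void main(String[] args) {
        System.out.println(atomicCount(4, 10_000));   // 40000, every time
    }
}
```

Swap the AtomicInteger for a plain (even volatile) int and the total comes up short nondeterministically.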
volatile = visibility only. synchronized = visibility + atomicity (mutual exclusion). When in doubt, synchronized.
13

Executors and Thread Pools — Don't Hire a New Cook for Every Order

Imagine Sarah's shop hires a new barista every time a customer walks in, then fires them after one drink. Insane, right? Yet that's what new Thread(task).start() for every request does — creating a thread costs ~1 MB of memory and milliseconds of OS overhead. ExecutorService is the staffing agency that maintains a pool of standing-by baristas.

The four common pools

Factory method            | Behavior                                      | Use case
newFixedThreadPool(n)     | n threads, unbounded queue                    | Steady load, known concurrency
newCachedThreadPool()     | Unbounded threads, threads die after 60s idle | Many short-lived tasks, bursty
newSingleThreadExecutor() | 1 thread, sequential execution                | Order-dependent tasks (logger, sequencer)
newScheduledThreadPool(n) | Delayed/periodic tasks                        | Cron-style jobs

Submit and wait

Future and CompletableFuture
ExecutorService pool = Executors.newFixedThreadPool(4);

// Submit returns a Future — the task's "claim ticket"
Future<String> future = pool.submit(() -> {
    Thread.sleep(1000);
    return "done";
});

String result = future.get();   // blocks until the task finishes

// Modern: CompletableFuture — chainable, non-blocking
CompletableFuture.supplyAsync(() -> fetchUser(42), pool)
    .thenApply(user -> user.getName())
    .thenAccept(name -> System.out.println(name))
    .exceptionally(ex -> { ex.printStackTrace(); return null; });

pool.shutdown();   // always shutdown — else JVM won't exit
Executors.newCachedThreadPool() can create unlimited threads → if your tasks block (e.g., on slow I/O), you can run out of memory. Prefer new ThreadPoolExecutor(...) with an explicit bounded queue + rejection policy in production.
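A sketch of that advice; the numbers (4/8 threads, queue of 100) are illustrative assumptions, not recommendations:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPoolDemo {
    // Explicit, bounded pool instead of the unbounded factory methods
    static ThreadPoolExecutor newBoundedPool() {
        return new ThreadPoolExecutor(
            4,                                          // core threads
            8,                                          // max threads
            60, TimeUnit.SECONDS,                       // idle timeout for the extra threads
            new ArrayBlockingQueue<>(100),              // BOUNDED queue: overload is visible
            new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure: overflow runs on the submitter
    }

    public static void main(String[] args) {
        ThreadPoolExecutor pool = newBoundedPool();
        pool.submit(() -> System.out.println("task ran"));
        pool.shutdown();
    }
}
```

CallerRunsPolicy is one of several built-in rejection policies; it slows the producer down instead of dropping work or throwing.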

Virtual Threads (Java 21+)

Lightweight threads managed by the JVM, not the OS. Cost: a few KB each. You can spin up millions. Perfect for I/O-bound work where each task spends most of its time waiting on a network call. The "one thread per request" model is back — but cheap.

Virtual threads (Java 21+)
try (ExecutorService exec = Executors.newVirtualThreadPerTaskExecutor()) {
    for (int i = 0; i < 10_000; i++) {
        exec.submit(() -> callSlowApi());
    }
}   // AutoCloseable — waits for all tasks
14

Beyond synchronized — Locks, Atomics, and Concurrent Collections

ReentrantLock — synchronized with superpowers

synchronized is simple but rigid. ReentrantLock gives you tryLock (non-blocking attempt), interruptible lock, fair ordering, and multiple condition variables.

ReentrantLock — flexible mutex
Lock lock = new ReentrantLock();

// Try to acquire for 500ms — give up if it can't
if (lock.tryLock(500, TimeUnit.MILLISECONDS)) {
    try {
        // critical section
    } finally {
        lock.unlock();   // MUST be in finally — else lock leaks forever
    }
}

ReadWriteLock — many readers, one writer

If your data is read 100x more often than written, full mutual exclusion is wasteful. ReentrantReadWriteLock lets unlimited readers in concurrently, but writers get exclusive access.
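A minimal sketch of that pattern (PriceBoard and its fields are made-up names for illustration):

```java
import java.util.*;
import java.util.concurrent.locks.*;

public class PriceBoard {
    private final ReadWriteLock lock = new ReentrantReadWriteLock();
    private final Map<String, Double> prices = new HashMap<>();

    // Any number of threads may hold the read lock at once.
    public Double get(String item) {
        lock.readLock().lock();
        try {
            return prices.get(item);
        } finally {
            lock.readLock().unlock();
        }
    }

    // The write lock is exclusive: it blocks all readers and writers.
    public void put(String item, double price) {
        lock.writeLock().lock();
        try {
            prices.put(item, price);
        } finally {
            lock.writeLock().unlock();
        }
    }
}
```

Same unlock-in-finally discipline as ReentrantLock, just with two locks sharing one fairness policy.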

Atomics — lock-free counters

AtomicInteger, AtomicLong, AtomicReference use CPU-level CAS (compare-and-swap) instructions. No locks, no blocking — just retry-on-conflict at the hardware level.

AtomicInteger
AtomicInteger count = new AtomicInteger();
count.incrementAndGet();    // thread-safe ++ without synchronized
count.compareAndSet(5, 10);  // "if value is 5, set to 10" atomically

Concurrent collections

| Collection | What's special |
| --- | --- |
| ConcurrentHashMap | Per-bucket locks. Reads are lock-free. Writes only contend on the same bucket. |
| CopyOnWriteArrayList | Every write creates a new copy. Reads are lock-free and very fast. Use only when reads dominate writes massively. |
| BlockingQueue (ArrayBlockingQueue, LinkedBlockingQueue) | Producer-consumer pattern. put() blocks if full, take() blocks if empty. |
| ConcurrentLinkedQueue | Lock-free FIFO queue (Michael-Scott algorithm). |
Think of ConcurrentHashMap as a parking garage with separate gates per row. Pre-Java 8 it had ~16 gates (segments). Java 8 onwards, every row has its own little gate (CAS). Two cars heading to different rows never wait.
Default to ConcurrentHashMap over Collections.synchronizedMap() — the latter wraps every operation in a single lock, which kills concurrency.
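The producer-consumer pattern from the table can be sketched in a few lines (OrderLine and the queue capacity of 2 are illustrative assumptions):

```java
import java.util.*;
import java.util.concurrent.*;

public class OrderLine {
    // One producer hands orders to one consumer through a bounded queue.
    // put() blocks when the queue is full; take() blocks when it is empty,
    // so neither side needs manual wait/notify coordination.
    static List<String> process(List<String> orders) {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(2);
        List<String> served = new ArrayList<>();

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < orders.size(); i++) {
                    served.add(queue.take());   // blocks until an order arrives
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        try {
            for (String o : orders) {
                queue.put(o);                   // blocks if the queue is full
            }
            consumer.join();                    // wait until everything is served
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return served;
    }
}
```

The join() at the end also gives the happens-before edge that makes reading the plain ArrayList safe afterwards.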
15

JVM Memory Model — Where Does Your Object Actually Live?

Devansh writes Person p = new Person("Sarah"). He's been told "objects go on the heap, primitives on the stack." But which stack? Where in the heap? And what is this Metaspace thing? Let's open the JVM and look inside.

The five memory areas

  • Heap — shared by all threads. All objects (everything created with new) live here. Subdivided into Young Gen (Eden + two Survivor spaces) and Old Gen.
  • Stack — one per thread. Holds method frames: each frame contains local variables and the return address. Primitives and object references (NOT the objects themselves) live here.
  • Metaspace (Java 8+; replaced PermGen) — class metadata, method bytecode, runtime constant pool. Native memory, grows dynamically.
  • PC Register — one per thread. Holds the address of the current bytecode instruction.
  • Native Method Stack — for JNI / native calls.
Picture a hotel. The heap is the giant shared lobby where all the actual furniture (objects) sits. Each thread is a guest with their own private notepad (stack) — they jot down where in the lobby their stuff is (references). The metaspace is the hotel's manual, listing what types of furniture exist.

Stack vs Heap — a concrete example

Where does what go?
void checkOut() {
    int total = 100;                       // primitive — on this thread's STACK
    String name = "Sarah";                  // reference on STACK, "Sarah" String on HEAP (in pool)
    Order order = new Order(42, name);     // reference on STACK, Order object on HEAP
}   // stack frame discarded — heap objects live until GC

Young vs Old generation

The heap has two main zones:

  • Young Generation — where new objects are born (specifically in Eden). Most objects die young (the "weak generational hypothesis"). Young GC is fast and frequent.
  • Old Generation — objects that survive several Young GC cycles get promoted here. Long-lived objects (caches, singletons). Old GC is slower but rarer.
StackOverflowError = stack ran out (usually unbounded recursion). OutOfMemoryError: Java heap space = heap is full. Different problems, different fixes — increase -Xss for stack, -Xmx for heap.
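A tiny sketch of the stack side (the class name and counter are made up; the exact depth varies with -Xss and frame size):

```java
public class StackDemo {
    static long depth = 0;

    // No base case: every call pushes another frame onto this
    // thread's stack until the stack is exhausted.
    static void recurse() {
        depth++;
        recurse();
    }

    // Returns how many frames fit before StackOverflowError.
    // This is a stack problem; no amount of -Xmx (heap) will help.
    static long overflowDepth() {
        depth = 0;
        try {
            recurse();
        } catch (StackOverflowError e) {
            // expected: the stack ran out, the heap is untouched
        }
        return depth;
    }
}
```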
16

Garbage Collection — Java's Janitor

The GC's job is to find objects no one is using anymore and reclaim their memory. The how and when has evolved dramatically — knowing modern GCs (G1, ZGC, Shenandoah) is a strong signal in interviews.

What does "no one is using" mean?

The GC walks from a set of GC roots (live thread stacks, static fields, JNI references) and marks every object it can reach. Anything not reached is unreachable → garbage → freed.

Imagine the GC standing at the entrance of a maze. It follows every path, painting each room green. When done, any room not painted green is empty and gets demolished. That's "mark-and-sweep."

Generational hypothesis — the key insight

Empirically, most objects die young. A request handler creates 100 short-lived objects, returns, and they're all garbage. Why scan the whole heap when 99% of garbage is in the young area? Modern GCs split the heap into Young + Old and run different algorithms on each.

GC algorithms — the modern lineup

| GC | Pause time | Best for |
| --- | --- | --- |
| Serial | Stops the world. Single-threaded. | Tiny apps, embedded |
| Parallel (Throughput) | Stops the world. Multi-threaded. | Batch jobs — max throughput, pauses ok |
| G1 (default since Java 9) | Tries to hit a target pause (e.g., 200ms). Region-based. | Most server apps with multi-GB heap |
| ZGC / Shenandoah | <10ms pauses, even on 100GB+ heaps | Latency-critical, large heap apps |
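Selecting a collector is a launch-time flag. A hedged sketch (the heap sizes and app.jar name are placeholders; defaults differ per JDK build):

```shell
# G1 (the default on modern JDKs) with an explicit pause-time goal
java -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -Xmx4g -jar app.jar

# ZGC for latency-critical services with large heaps
java -XX:+UseZGC -Xmx32g -jar app.jar

# Log GC activity to verify what's actually happening (Java 9+)
java -Xlog:gc -jar app.jar
```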

Stop-the-world (STW)

For some GC phases, all application threads must pause. This is "stop-the-world." It's why a 16 GB heap full of long-lived objects can cause noticeable lag spikes. Modern GCs (G1, ZGC) minimize STW pauses by doing most work concurrently with the application.

When you can't be GC'd

Common causes of memory leaks in Java (yes, leaks exist despite GC):

  • Static collections that grow forever — a static HashMap that you never evict from.
  • Unclosed listeners / callbacks — registered but never deregistered. The framework keeps a strong reference to your object.
  • ThreadLocals not removed — a thread in a pool retains its ThreadLocal entry across requests.
  • Caches without size limits — use WeakHashMap or a real cache library (Caffeine).
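The first cause can be shown in a few lines (LeakyCache is a made-up name; real leaks are usually better hidden):

```java
import java.util.*;

public class LeakyCache {
    // Classic leak: a static map that is only ever added to.
    // Every entry stays reachable from a GC root (the static field),
    // so the GC can never reclaim it, no matter how "done" you are with it.
    private static final Map<String, byte[]> CACHE = new HashMap<>();

    static void handleRequest(String requestId) {
        CACHE.put(requestId, new byte[1024]);   // cached, never evicted
    }

    static int entries() {
        return CACHE.size();
    }
}
```

The fix is eviction: remove entries when you're done, cap the size, or use a cache with a size/TTL policy.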
If asked "explain GC," structure it as: (1) what's garbage, (2) generational hypothesis, (3) name the algorithm you've used (G1 by default), (4) STW trade-off. Bonus: mention ZGC for sub-10ms pauses on huge heaps.
17

ClassLoaders — Who Brings Your Classes In?

When you run java -cp myapp.jar com.example.Main, who actually loads Main.class into memory? It's not magic — it's a chain of ClassLoaders, each with its own job and its own search path.

The classic three-tier hierarchy

  • Bootstrap ClassLoader — written in C++, part of the JVM itself. Loads the core JDK classes (java.lang.*, java.util.*). Pre-Java 9 these came from rt.jar; post-Java 9 from JRT modules.
  • Platform (Extension) ClassLoader — loads JDK extension modules. A child of bootstrap.
  • Application (System) ClassLoader — loads classes from your -cp classpath. A child of platform. This is the one that loads your code.

The delegation model

When asked to load class X, a ClassLoader first asks its parent ("can you load X?"). Only if the parent can't does it try locally. This walks all the way up to bootstrap before any child tries.

Picture a chain of librarians. You ask the junior librarian (Application) for "java.lang.String." She first asks her boss (Platform). Boss asks her boss (Bootstrap). Bootstrap finds it in the core JDK shelf and hands it down the chain. This prevents you from accidentally substituting a malicious java.lang.String.

Why does this matter?

  • Class identity = (class name, ClassLoader). Two different ClassLoaders can load the same class name and the JVM treats them as different types. Cast between them → ClassCastException.
  • Hot reloading — frameworks like Spring DevTools, Tomcat, and IDEs use multiple ClassLoaders so they can swap class versions without restarting the JVM.
  • Plugin systems — each plugin gets its own ClassLoader, isolated from others.
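You can inspect the chain yourself. A small sketch (LoaderDemo is a made-up name):

```java
public class LoaderDemo {
    // Your own classes come from the application loader; walking
    // getParent() climbs toward platform and then bootstrap.
    static String chain() {
        StringBuilder sb = new StringBuilder();
        ClassLoader cl = LoaderDemo.class.getClassLoader();
        while (cl != null) {
            sb.append(cl.getName()).append(" -> ");
            cl = cl.getParent();
        }
        return sb.append("bootstrap (represented as null)").toString();
    }

    // Core JDK classes are loaded by the bootstrap loader,
    // which the API exposes as null.
    static ClassLoader stringLoader() {
        return String.class.getClassLoader();
    }
}
```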
Frameworks sometimes break delegation (Tomcat does — it loads webapp classes first from the WAR, then delegates). This lets webapps ship their own version of a library, but causes "ClassCastException: com.foo.Bar cannot be cast to com.foo.Bar" when types cross ClassLoader boundaries.
18

Streams & Functional Java — Pipelines, Lazy Evaluation, the Whole Story

Mira has a list of orders. The old way: a 30-line for-loop with nested ifs to find the top 5 customers by spend in the last week. The new way (Java 8+): a 5-line stream. Let's understand why the new way is better, and what's actually happening underneath.

What is a stream?

A Stream is a sequence of elements supporting declarative operations like filter, map, reduce. It's NOT a data structure — it doesn't store anything. It's a pipeline that lazily processes elements from a source.

A typical pipeline
List<Order> orders = /* ... */;

// Last week's orders, grouped by customer, summed, top 5 by spend
List<Map.Entry<String, Double>> topCustomers = orders.stream()
    .filter(o -> o.getDate().isAfter(LocalDate.now().minusDays(7)))
    .collect(Collectors.groupingBy(Order::getCustomer,
                                   Collectors.summingDouble(Order::getAmount)))
    .entrySet().stream()
    .sorted(Map.Entry.<String, Double>comparingByValue().reversed())
    .limit(5)
    .toList();

Three pieces of every stream

  1. Source — collection, array, I/O channel, generator. Where elements come from.
  2. Intermediate operations — filter, map, flatMap, sorted, distinct. Lazy — they describe work, don't do it.
  3. Terminal operation — collect, forEach, reduce, count. Triggers the actual computation.
A stream pipeline is like an assembly line. The source is the conveyor belt feeding raw items. Each intermediate operation is a station that transforms or rejects items. The terminal operation is the box at the end that catches the output. Until the box is in place, the conveyor doesn't move — that's laziness.

Lazy evaluation — the key superpower

Intermediate ops don't run until a terminal op pulls. This means short-circuiting: findFirst() only processes elements until it finds one. limit(10) stops after ten. Streams over infinite sources (Stream.iterate, Stream.generate) work because of laziness.
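A sketch that makes laziness observable (peek is used here purely to count how many elements actually flow through the pipeline; LazyDemo is a made-up name):

```java
import java.util.Optional;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Stream;

public class LazyDemo {
    // findFirst() pulls elements one at a time and stops at the
    // first match, so later elements are never processed at all.
    static int processedCount() {
        AtomicInteger seen = new AtomicInteger();
        Optional<Integer> firstEven = Stream.of(1, 2, 3, 4, 5)
                .peek(n -> seen.incrementAndGet())   // counts pulled elements
                .filter(n -> n % 2 == 0)
                .findFirst();                        // short-circuits at 2
        return seen.get();   // only 1 and 2 were ever pulled through
    }
}
```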

Parallel streams

Add .parallel() and the JVM splits the work across the common ForkJoinPool. Sounds magical — and is dangerous if abused.

Parallel streams use the SHARED common ForkJoinPool. If your task is I/O-bound or you call them from multiple places, threads contend. Also, mutating shared state inside a parallel stream (e.g., list.add) is a race condition. Rule: parallel only for CPU-heavy, stateless, large-N work.

Functional interfaces — the building blocks

| Interface | Signature | When to use |
| --- | --- | --- |
| Function<T, R> | R apply(T) | Transform: map |
| Predicate<T> | boolean test(T) | Filter |
| Consumer<T> | void accept(T) | Side effect: forEach |
| Supplier<T> | T get() | Lazy value, factory |
| BiFunction<T, U, R> | R apply(T, U) | Two-arg transform: reduce accumulator |
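One line each, to make the signatures concrete (FnDemo is a made-up name; Consumer's side effect is just a print):

```java
import java.util.function.*;

public class FnDemo {
    static String demo() {
        Function<Integer, Integer> doubler = n -> n * 2;          // transform
        Predicate<String> nonEmpty = s -> !s.isEmpty();           // test
        Supplier<String> greeting = () -> "hi";                   // lazy value
        Consumer<String> printer = s -> System.out.println(s);    // side effect
        BiFunction<Integer, Integer, Integer> add = (a, b) -> a + b;

        printer.accept(greeting.get());
        return doubler.apply(21) + "," + nonEmpty.test("x") + "," + add.apply(2, 3);
    }
}
```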
Don't force everything into streams. A simple for-loop is often clearer for 5 lines of imperative code. Streams shine for declarative transformations — filter / map / reduce / group — where the loop version would have nested conditionals and accumulator variables.
19

Optional — Use It Right or Don't Use It

Optional was added in Java 8 to express "this might be absent." The community immediately misused it everywhere. Here's how to use it the way Brian Goetz (Java's chief language architect) recommends.

What it's for

Optional exists to make "no value" explicit in return types. A method returning Optional<User> tells the caller, "I might not find one — handle that case."

The right way
public Optional<User> findById(String id) {
    return Optional.ofNullable(userMap.get(id));
}

// Caller is forced to handle absence:
String name = findById("u1")
    .map(User::getName)
    .orElse("Unknown");

What it's NOT for

  • Fields — don't make Optional<Address> address a field. Use null directly, or split into two classes. Optional doesn't serialize well and adds memory overhead.
  • Method parameters — overloads or just allowing null are simpler.
  • Collection elements — List<Optional<User>> is silly. An empty list is the absence.
  • Direct .get() without isPresent — defeats the entire purpose. If you're going to call .get() blindly, you've replaced NullPointerException with NoSuchElementException for no benefit.
Optional is like a small box that may contain a gift or be empty. The recipient has to open it carefully. Wrapping every variable in your house in such a box (fields, parameters, list elements) just makes life annoying for everyone.
Optional is NOT a substitute for null everywhere. It's a tool for one specific signaling problem: "this query might return nothing." Use it surgically.
20

The Tricky Gotchas — Questions That Separate Mid from Senior

These are the questions where interviewers smile when you nail them — they reveal whether you've actually shipped Java in production or just memorized a textbook.

1. finally runs even after return

What does this print?
int tricky() {
    try {
        return 1;
    } finally {
        System.out.println("finally");   // PRINTS "finally"
        // return 2;  // BAD — would override return 1
    }
}
// Output: "finally", returns 1.

2. Autoboxing in collections

list.remove(2) on a List<Integer> — does it remove the element at index 2 or the element with value 2?

Index 2! Because List has both remove(int index) and remove(Object o), and the primitive int matches the index version. To remove by value: list.remove(Integer.valueOf(2)).
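Seen side by side (RemoveDemo is a made-up name):

```java
import java.util.*;

public class RemoveDemo {
    static List<Integer> byIndex() {
        List<Integer> list = new ArrayList<>(List.of(10, 20, 30));
        list.remove(2);                      // int matches remove(int index): drops 30
        return list;
    }

    static List<Integer> byValue() {
        List<Integer> list = new ArrayList<>(List.of(10, 20, 30));
        list.remove(Integer.valueOf(20));    // Integer matches remove(Object): drops 20
        return list;
    }
}
```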

3. Static method "overriding"

Static methods cannot be overridden — only hidden. A child class declaring static foo() doesn't override the parent's static foo(); it shadows it. Calls resolve at compile time based on the reference type, not the runtime object.

4. String.intern() moves a string into the pool

Useful for deduplication when reading millions of repeated strings from a file. But don't intern untrusted user input — since Java 7 the pool lives on the heap and unreferenced entries can be GC'd, yet every intern() call is still a costly lookup in a global table, and an attacker flooding the pool with unique strings is a denial-of-service vector.

5. The diamond problem with default methods

Java 8 allowed interfaces to have default methods. What if a class implements two interfaces with the same default method? You MUST override and pick: InterfaceA.super.method();
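A minimal sketch of that resolution (interface and class names are made up):

```java
public class DiamondDemo {
    interface Barista { default String greet() { return "barista"; } }
    interface Cashier { default String greet() { return "cashier"; } }

    // Both parents supply greet(), so the compiler forces an override.
    // InterfaceName.super.method() picks which default to inherit.
    static class Worker implements Barista, Cashier {
        @Override
        public String greet() {
            return Barista.super.greet();
        }
    }
}
```

Without the override, the class simply does not compile, so the ambiguity can never reach runtime.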

6. Constructor of an inner class secretly captures the outer

A non-static inner class holds an implicit reference to its enclosing instance. If the inner is long-lived (e.g., stored in a static map or sent to an executor), the outer can't be GC'd → memory leak. Solution: use a static nested class when no enclosing reference is needed.

7. HashMap iteration order is not insertion order

Insertion order is not preserved in HashMap. If you need it, use LinkedHashMap. If you need sorted order, use TreeMap.

8. SimpleDateFormat is NOT thread-safe

The classic 2009-era bug. Sharing a SimpleDateFormat across threads silently corrupts dates. Use DateTimeFormatter (Java 8+) — it's immutable and thread-safe.
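The replacement in practice (DateDemo and the pattern are illustrative):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateDemo {
    // DateTimeFormatter is immutable, so one shared instance is safe
    // across any number of threads (unlike SimpleDateFormat).
    static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("yyyy-MM-dd");

    static String format(LocalDate d) {
        return FMT.format(d);
    }

    static LocalDate parse(String s) {
        return LocalDate.parse(s, FMT);
    }
}
```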

9. equals on arrays compares references, not contents

arr1.equals(arr2) is arr1 == arr2. Use Arrays.equals(arr1, arr2) for element-wise comparison, Arrays.deepEquals for nested arrays.

10. Integer i = null; int x = i; throws NullPointerException

Auto-unboxing a null wrapper throws NPE — a classic source of "where did this NPE come from?" debugging. Always null-check wrappers before auto-unboxing.
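Verified in a few lines (UnboxDemo is a made-up name):

```java
public class UnboxDemo {
    // Returns true if auto-unboxing the null wrapper throws NPE.
    static boolean unboxingThrows() {
        Integer i = null;
        try {
            int x = i;        // auto-unboxing calls i.intValue(), which NPEs
            return x < 0;     // never reached
        } catch (NullPointerException e) {
            return true;
        }
    }
}
```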

These aren't trivia — every one of them has bitten production code at companies you've heard of. If you can rattle off five of these in an interview, you've moved from "I learned Java" to "I shipped Java."
When you don't know an answer, say so. Then reason out loud about what it could be — interviewers value clear thinking far more than perfect recall.

Did this guide make Java click? If it helped, tap the ❤️ — that's how I know it landed.