(DSA + HashMap/ConcurrentHashMap Internals + Polymorphism + Best Answers + Code + Learning Roadmap)
This blog is written as a complete interview-prep note based on the questions you described (DSA “move zeros in-place”, HashMap / ConcurrentHashMap internals, equals/hashCode, and polymorphism + overload resolution). It’s designed so freshers and experienced devs can both use it as a “ready-to-speak” guide.
1) Typical Interview Flow (What they were testing)
1. Self introduction (“Tell me about yourself”)
What they check:
Can you communicate clearly?
Can you summarize your work/learning?
Do you understand your own projects?
Expected answer (simple):
Name + current role/experience
Tech stack
1–2 projects + impact
Why this role/company
Good answer:
Adds measurable impact (“reduced API latency by 30%”, “handled 10k RPM”, etc.)
Mentions 1 challenge + how you solved it
Best answer (interviewer loves this):
60–90 seconds
Clear story: “I’m a Java backend developer… I built X… I enjoy Y… I’m now looking for Z…”
Ends with a bridge: “That’s why this role matches me: Java + concurrency + problem solving.”
2) DSA Problem: Move Zeros In‑Place (Order may / may not be required)
You gave this input:
{1,0,2,0,5,0,0,3,0,-2,-1}
And one possible output shown was:
{1,2,5,3,-2,-1,0,0,0,0,0}
That output preserves the relative order of non‑zero elements (stable).
But you later clarified an important twist:
“It should be in-place… and sequence should not be maintained.”
That means the interviewer may have been okay with an unstable solution (faster in practice, fewer writes).
The best move in interviews
Always ask one clarifying question:
“Should I preserve the order of non-zero elements, or any order is fine?”
That instantly shows maturity.
2.1 Expected Answer (Stable, preserves order of non‑zeros)
Approach: write all non-zeros forward, then fill remaining with zeros.
✅ Pros: stable, simple
✅ Time: O(n)
✅ Space: O(1)
⚠️ Writes: can be ~n (still fine)
Java Code (Stable)
import java.util.Arrays;
public class MoveZerosStable {
public static void moveZerosStable(int[] arr) {
int write = 0;
// Move non-zeros forward (keeps order)
for (int x : arr) {
if (x != 0) arr[write++] = x;
}
// Fill rest with zeros
while (write < arr.length) {
arr[write++] = 0;
}
}
public static void main(String[] args) {
int[] arr = {1,0,2,0,5,0,0,3,0,-2,-1};
moveZerosStable(arr);
System.out.println(Arrays.toString(arr));
// [1, 2, 5, 3, -2, -1, 0, 0, 0, 0, 0]
}
}
Expected explanation in interview:
“I compact non-zeros using a write pointer.”
“Then I fill remaining positions with zeros.”
“O(n) time, O(1) space.”
2.2 Best Answer (Unstable, order NOT required, fewer writes)
If order does not matter, you can do a two‑pointer swap:
i starts from the front, j from the end
If arr[i] == 0, swap it with arr[j] and decrement j; otherwise increment i
Also keep moving j left to skip trailing zeros
✅ Pros: in-place, single pass feel, fewer writes
✅ Time: O(n)
✅ Space: O(1)
⚠️ Order changes (which is allowed in this variant)
Java Code (Unstable)
import java.util.Arrays;
public class MoveZerosUnstable {
public static void moveZerosUnstable(int[] arr) {
int i = 0, j = arr.length - 1;
while (i < j) {
// Move j left past zeros
while (i < j && arr[j] == 0) j--;
if (arr[i] == 0) {
// swap arr[i] and arr[j]
int temp = arr[i];
arr[i] = arr[j];
arr[j] = temp;
j--;
} else {
i++;
}
}
}
public static void main(String[] args) {
int[] arr = {1,0,2,0,5,0,0,3,0,-2,-1};
moveZerosUnstable(arr);
System.out.println(Arrays.toString(arr));
// One valid output example (order may differ):
// [1, -1, 2, -2, 5, 3, 0, 0, 0, 0, 0]
}
}
Best explanation:
“If order isn’t required, I’ll swap zeros with the last non-zero from the end.”
“Still O(n), but fewer writes and simpler than shifting.”
2.3 Common Follow‑Up Variant (If they bring “negative numbers to other side”)
Sometimes interviewers extend it into a 3‑way partition:
negatives on one side
zeros in the middle
positives on the other side
This is basically the Dutch National Flag pattern.
Java Code (Negatives | Zeros | Positives) — Unstable
import java.util.Arrays;
public class ThreeWayPartition {
public static void partitionNegZeroPos(int[] arr) {
int low = 0, mid = 0, high = arr.length - 1;
while (mid <= high) {
if (arr[mid] < 0) {
swap(arr, low++, mid++);
} else if (arr[mid] == 0) {
mid++;
} else { // > 0
swap(arr, mid, high--);
}
}
}
private static void swap(int[] arr, int i, int j) {
int t = arr[i];
arr[i] = arr[j];
arr[j] = t;
}
public static void main(String[] args) {
int[] arr = {1,0,2,0,5,0,0,3,0,-2,-1};
partitionNegZeroPos(arr);
System.out.println(Arrays.toString(arr));
// Actual output for this input: [-1, -2, 0, 0, 0, 0, 0, 3, 5, 2, 1]
}
}
3) HashMap Internals (What interviewers expect you to know)
3.1 What HashMap is (best speaking answer)
“HashMap is a hash-table based Map implementation. It stores entries in an internal bucket array, uses hashCode() plus equality checks to locate keys, allows null keys/values, is not synchronized, and does not guarantee ordering.” (Oracle Docs)
Oracle explicitly notes:
Permits null keys/values
Not synchronized
Has initial capacity + load factor (default 0.75) and resizes when threshold is exceeded (Oracle Docs)
3.2 equals(), hashCode(), hash value — how to explain clearly
The “contract” (must say)
If a.equals(b) is true → a.hashCode() == b.hashCode() must be true.
The same hashCode does not mean equals is true (collisions exist).
How HashMap uses them (see the sketch after this list):
Compute hashCode()
Map it to a bucket index (implementation detail)
If multiple keys land in the same bucket, compare the stored hash first (quick check), then equals() (final check)
If equals matches → update the value; else add a new node
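A minimal sketch of that bucket lookup, assuming the Java 8+ hash-spreading and index math (hash ^ (hash >>> 16), then (capacity - 1) & hash); treat it as illustrative, not a spec:
public class BucketIndexSketch {
    // Mirrors the bit-mixing + masking HashMap uses to pick a bucket.
    static int bucketIndexFor(Object key, int capacity) {
        int h = key.hashCode();
        int spread = h ^ (h >>> 16);    // fold high bits into the low bits
        return (capacity - 1) & spread; // valid because capacity is a power of two
    }

    public static void main(String[] args) {
        // Equal keys always produce the same index; different keys may collide.
        System.out.println(bucketIndexFor("orderId", 16));
        System.out.println(bucketIndexFor("orderId", 16)); // same index both times
    }
}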
Best warning example (interviewer loves it):
“If you override equals() but not hashCode(), HashMap lookups can fail or behave incorrectly.”
3.3 Collisions: linked list → tree (Java 8+)
When collisions become frequent, HashMap can convert a bucket from a linked structure to a balanced tree to improve performance. That design was introduced to handle frequent collisions more efficiently. (OpenJDK)
Interview line:
“Worst-case lookup becomes closer to O(log n) in heavily-collided buckets due to tree bins.” (OpenJDK)
3.4 Resizing (rehash) — what to mention
HashMap resizes when size > capacity * loadFactor
Default load factor is 0.75
Resizing is expensive, so if you know the approximate size, pass an initial capacity (see the example below) (Oracle Docs)
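A small sketch of that pre-sizing advice, assuming the usual expected / 0.75 rule of thumb (the names and numbers here are just for illustration):
import java.util.HashMap;
import java.util.Map;

public class PresizedMapExample {
    public static void main(String[] args) {
        int expectedEntries = 10_000;
        // threshold = capacity * loadFactor, so choose a capacity large enough
        // that expectedEntries never crosses the threshold (no rehashing).
        int initialCapacity = (int) (expectedEntries / 0.75f) + 1;
        Map<String, Integer> map = new HashMap<>(initialCapacity);
        for (int i = 0; i < expectedEntries; i++) {
            map.put("key-" + i, i); // inserts without triggering a resize
        }
        System.out.println(map.size()); // 10000
    }
}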
3.5 Fail-fast iterators
HashMap iterators are fail-fast (they can throw ConcurrentModificationException if structurally modified during iteration). This is documented behavior. (Oracle Docs)
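A quick demo of that fail-fast behavior: a structural modification (here a put of a new key) during iteration makes the next iterator step throw ConcurrentModificationException. The map contents are made up for illustration.
import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Map;

public class FailFastDemo {
    public static void main(String[] args) {
        Map<String, Integer> map = new HashMap<>();
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);
        try {
            for (String key : map.keySet()) {
                map.put(key + "-copy", 0); // structural modification mid-iteration
            }
        } catch (ConcurrentModificationException e) {
            System.out.println("Fail-fast iterator detected the modification");
        }
    }
}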
4) ConcurrentHashMap (Need + internal behavior + best explanation)
4.1 Why ConcurrentHashMap is needed
HashMap is not thread-safe. In concurrent updates you can get:
Lost updates
Data races
Corrupted internal structure
So ConcurrentHashMap provides a thread-safe Map with high throughput.
4.2 The most important line to say (from Java docs)
“Retrieval operations do not entail locking.” (Oracle Docs)
That means:
get() is designed to be fast under concurrency
Reads can overlap with writes safely
Also:
It does not support locking the entire table to block all access (no global “stop the world” table lock). (Oracle Docs)
4.3 Null handling difference
ConcurrentHashMap does not allow null keys or values (unlike HashMap). (Oracle Docs)
Best explanation:
In CHM, null is reserved to mean “no mapping present” (for example, as the return value of get()), so allowing null keys/values would make that result ambiguous and break atomic methods and concurrent semantics.
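A quick illustration of the difference: HashMap tolerates a null key, while ConcurrentHashMap rejects it at put time with a NullPointerException.
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class NullHandlingDemo {
    public static void main(String[] args) {
        Map<String, String> hashMap = new HashMap<>();
        hashMap.put(null, "ok");                 // allowed: one null key, null values too
        System.out.println(hashMap.get(null));   // ok

        Map<String, String> chm = new ConcurrentHashMap<>();
        try {
            chm.put(null, "boom");               // rejected: CHM forbids null keys/values
        } catch (NullPointerException e) {
            System.out.println("ConcurrentHashMap rejected the null key");
        }
    }
}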
4.4 Segment locking vs modern Java
You told them “segments + segment locking” — that was correct historically (older CHM designs), but interviewers today usually expect the Java 8+ design (more fine-grained, not segment-based).
Best safe way to answer in interview:
“Earlier versions used segmented locking. Modern implementations focus on lock-free reads and fine-grained coordination for updates, avoiding a single global lock.” (Oracle Docs)
(That phrasing is accurate and doesn’t trap you into version-specific internals.)
4.5 How CHM gives good reads + good writes
Reads (get):
No lock for retrieval (Oracle Docs)
Uses memory visibility guarantees (happens-before behavior) documented for updates and subsequent reads (Oracle Docs)
Writes (put/update):
Thread-safe updates
Avoids locking entire map
Supports atomic compound ops: computeIfAbsent, putIfAbsent, etc.
Java docs even highlight scalable frequency-map usage with LongAdder and computeIfAbsent: (Oracle Docs)
Example:
freqs.computeIfAbsent(key, k -> new LongAdder()).increment();
(That exact idea is from the docs.) (Oracle Docs)
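A runnable sketch of that frequency-map pattern; the word list, thread count, and class name are made up for illustration:
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.LongAdder;

public class WordFrequency {
    public static void main(String[] args) throws InterruptedException {
        ConcurrentHashMap<String, LongAdder> freqs = new ConcurrentHashMap<>();
        List<String> words = List.of("java", "map", "java", "lock", "java", "map");

        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (String word : words) {
            // computeIfAbsent is atomic: at most one LongAdder is created per key,
            // and increment() is safe to call from many threads.
            pool.submit(() -> freqs.computeIfAbsent(word, k -> new LongAdder()).increment());
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);

        freqs.forEach((w, count) -> System.out.println(w + " -> " + count));
        // java -> 3, map -> 2, lock -> 1 (printed order not guaranteed)
    }
}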
5) Polymorphism Deep Dive (what they were really asking)
Polymorphism = “same call, different behavior”.
5.1 Types of polymorphism you should say
Compile-time polymorphism → Method overloading
Runtime polymorphism → Method overriding
Parametric polymorphism → Generics (often a bonus mention)
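A tiny demo backing up those bullets: the describe overloads are bound at compile time by the argument type, while speak() is dispatched at runtime based on the actual object. Class and method names here are just for illustration.
public class PolymorphismDemo {
    // Runtime polymorphism: the overridden method is chosen by the actual object.
    static class Animal { String speak() { return "..."; } }
    static class Dog extends Animal { @Override String speak() { return "woof"; } }

    // Compile-time polymorphism: the overload is chosen from the declared argument type.
    static String describe(int x)    { return "int overload"; }
    static String describe(double x) { return "double overload"; }

    public static void main(String[] args) {
        Animal a = new Dog();               // declared type Animal, actual type Dog
        System.out.println(a.speak());      // woof (runtime dispatch)
        System.out.println(describe(10));   // int overload (resolved by the compiler)
        System.out.println(describe(10.0)); // double overload
    }
}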
6) The int/float/double overload question (compiler decision)
They asked something like:
“If you have methods with int, float, double… which gets called?”
This is compile-time method overload resolution.
6.1 Key rule
Overloading is resolved by the compiler using the most specific applicable method rule. (Oracle Docs)
6.2 Practical rules (in the order the compiler prefers them)
Exact match
Widening primitive conversion (int → long → float → double)
Boxing (int → Integer)
Varargs (last resort)
6.3 Example you can speak in interview
void f(int x) {}
void f(float x) {}
void f(double x) {}
f(10); // calls f(int) - exact
f(10.0f); // calls f(float) - exact
f(10.0); // calls f(double) - exact
Best explanation line:
“The compiler picks the most specific method available; it prefers exact match, then widening, then boxing, then varargs.” (Oracle Docs)
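A small follow-up sketch that makes the preference order visible, assuming no exact int overload exists (method names are made up): the compiler widens to long before it will box to Integer, and treats varargs as the last resort.
public class OverloadPreference {
    static void g(long x)    { System.out.println("widening: g(long)"); }
    static void g(Integer x) { System.out.println("boxing: g(Integer)"); }
    static void g(int... xs) { System.out.println("varargs: g(int...)"); }

    public static void main(String[] args) {
        g(10); // no g(int), so widening wins: prints "widening: g(long)"
        // Remove g(long) and the same call would box to g(Integer);
        // remove that too and it would fall back to g(int...).
    }
}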
7) “Expected vs Good vs Best” Answers (Interview-ready scripts)
Q1: “Explain HashMap internal working”
Expected:
Buckets + hashing + collisions
Good:
Mentions load factor + resizing, not thread-safe, allows null
Best:
Adds collision-to-tree optimization and fail-fast iterators, and explains equals/hashCode contract
(and quotes: not synchronized + null allowed + ordering not guaranteed) (Oracle Docs)
Q2: “equals vs hashCode”
Expected:
If equals true then hashCode same
Good:
Explains collision possibility
Best:
Gives bug scenario: override equals only → HashMap lookups fail
Q3: “Why ConcurrentHashMap? How it works?”
Expected:
Thread-safe map, better than synchronized HashMap
Good:
Mentions better concurrency than Hashtable, no global lock
Best:
Says: “Reads don’t lock” (Oracle Docs)
Says: “No null keys/values” (Oracle Docs)
Mentions atomic operations like
computeIfAbsent(and the LongAdder pattern) (Oracle Docs)
Q4: “Explain polymorphism”
Expected:
Overloading + overriding
Good:
Compile-time vs runtime
Best:
Adds: “Overloading is compile-time (compiler picks most specific). Overriding is runtime dispatch based on actual object.” (Oracle Docs)
8) Mini “Source Code Pack” You Can Keep in Your Notes
A) Move zeros (stable)
(Already given above)
B) Move zeros (unstable, best when order not required)
(Already given above)
C) equals/hashCode demo (HashMap bug)
import java.util.*;
class User {
int id;
User(int id) { this.id = id; }
@Override public boolean equals(Object o) {
if (!(o instanceof User)) return false;
return this.id == ((User)o).id;
}
// Uncomment this to fix it
// @Override public int hashCode() { return Integer.hashCode(id); }
}
public class EqualsHashCodeBug {
public static void main(String[] args) {
Map<User, String> map = new HashMap<>();
map.put(new User(7), "Alice");
System.out.println(map.get(new User(7))); // prints null: without hashCode(), the second User(7) hashes to a different bucket
}
}
9) What to study next (Roadmap + Courses)
Track A (Freshers)
Java basics + OOP + Collections
DSA patterns (arrays, two pointers, hashing, sliding window)
Core concurrency basics
SQL + REST basics
Mini backend project
Track B (Experienced)
Deep collections internals (HashMap/CHM)
Java Memory Model basics (visibility, happens-before)
Threading + executors + locks
System design basics (API design, caching, DB indexing)
Performance thinking (latency vs throughput)
Recommended Courses (Practical + Recognized)
Java Foundation
Java Programming and Software Engineering Fundamentals (Duke / Coursera) (Coursera)
DSA in Java
Algorithms, Part I (Princeton / Coursera) (Coursera)
(Strong for fundamentals like sorting/searching/data structures with performance analysis.)
Coding Interview Patterns
Grokking the Coding Interview Patterns (Educative) (Educative)
Java Multithreading / Concurrency
Java Multithreading, Concurrency & Performance Optimization (Udemy) (Udemy)
(Very aligned with “volatile / concurrency / parallelism / performance” discussions.)
10) Final “Best Interview Tip” for This Exact Set of Questions
When they ask something broad like HashMap or polymorphism:
✅ Start with a clean high-level explanation (10–15 seconds)
✅ Then go 1 level deeper (internals / rules / trade-offs)
✅ Then give one example (tiny code or scenario)
✅ End with why it matters (performance, correctness, concurrency)
That’s how you sound “senior” even as a fresher.