Build custom data structures that combine hash maps, linked lists, and stacks to achieve O(1) operations for complex behaviors.
Data structure design problems ask you to build a custom data structure from scratch that supports a specific set of operations, all in optimal time. The trick is to compose two or more primitive structures (hash map, linked list, stack, array) so their strengths cover each other's weaknesses.
No single data structure gives you O(1) lookup, O(1) insertion, O(1) deletion, and ordering. But if you combine two structures—one for fast lookup (hash map) and one for ordering (doubly-linked list, stack)—you get both.
Why composites? A hash map gives O(1) get/put by key but has no ordering. A linked list gives O(1) insert/remove at known positions but has no random access. Combining them gives O(1) for every operation: look up the node in the map, then move it in the list.
Look for these recognition signals in the problem statement. If you spot one, you are likely facing a data structure design problem.
- "Design a data structure" or "Implement class"
- "O(1) time for all operations"
- "Least recently used" or "eviction policy"
- "Get minimum/maximum in O(1)"
The key clue: The problem requires multiple operations (lookup + ordering, push + getMin, insert + getRandom) each in O(1). Ask yourself: "Which two structures, when combined, cover every operation?" That composite is the answer.
The LRU Cache supports get(key) and put(key, value) in O(1). A hash map stores key → node pointers. A doubly-linked list orders nodes from most recently used (head) to least recently used (tail). On every access, we move the node to the head. On eviction, we remove the tail.
Operations: put(1,A), put(2,B), get(1), put(3,C)
Sentinel nodes matter. Using dummy HEAD and TAIL sentinel nodes eliminates all null-checking edge cases. Every real node is always between two other nodes, so remove and insert operations never need special-case logic for empty lists or boundary nodes.
The Min Stack supports push, pop, top, and getMin all in O(1). The key insight: maintain a second stack (or store pairs) that tracks the running minimum at each level. When we push, we also push the new min. When we pop, we also pop the min.
Two stacks side by side: the main stack and the min stack.
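The "store pairs" alternative mentioned above can be sketched as a single stack whose entries carry their own running minimum (the class name `PairMinStack` is ours, chosen to distinguish it from the two-stack template below):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Variant of the Min Stack that stores (value, runningMin) pairs in one
// stack instead of maintaining a second parallel stack. The running
// minimum travels with each entry, so every operation stays O(1).
class PairMinStack {
    private final Deque<int[]> stack = new ArrayDeque<>(); // [value, minSoFar]

    public void push(int val) {
        int min = stack.isEmpty() ? val : Math.min(val, stack.peek()[1]);
        stack.push(new int[] { val, min });
    }

    public void pop() { stack.pop(); }

    public int top() { return stack.peek()[0]; }

    public int getMin() { return stack.peek()[1]; } // O(1)
}
```

The tradeoff versus two parallel stacks is purely stylistic: one structure to keep in sync instead of two, at the cost of boxing each entry into a pair.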
Choosing the variant: If the problem needs fast lookup + ordering (LRU, LFU, time-based caches), use hash map + linked list. If the problem needs a standard container with an extra aggregate query (getMin, getMax, getRandom), use augmentation—bolting a parallel structure alongside the main one.
Here are the annotated Java templates for both core variants. Each method is O(1).
```java
import java.util.HashMap;
import java.util.Map;

class LRUCache {
    // Doubly-linked list node; stores the key so eviction can
    // also remove the entry from the map in O(1).
    private static class Node {
        int key, val;
        Node prev, next;
        Node(int key, int val) { this.key = key; this.val = val; }
    }

    private final Map<Integer, Node> map = new HashMap<>();
    private final Node head, tail; // sentinel nodes
    private final int capacity;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head.next = tail; // head ↔ tail
        tail.prev = head;
    }

    public int get(int key) {
        if (!map.containsKey(key)) return -1;
        Node node = map.get(key);
        remove(node);     // detach from current position
        insertHead(node); // move to front (most recent)
        return node.val;
    }

    public void put(int key, int val) {
        if (map.containsKey(key)) remove(map.get(key));
        Node node = new Node(key, val);
        map.put(key, node);
        insertHead(node);
        if (map.size() > capacity) { // over capacity: evict LRU
            Node lru = tail.prev;    // node just before tail sentinel
            remove(lru);
            map.remove(lru.key);
        }
    }

    private void remove(Node n) { // O(1) unlink
        n.prev.next = n.next;
        n.next.prev = n.prev;
    }

    private void insertHead(Node n) { // O(1) insert after head sentinel
        n.next = head.next;
        n.prev = head;
        head.next.prev = n;
        head.next = n;
    }
}
```
```java
import java.util.ArrayDeque;
import java.util.Deque;

class MinStack {
    private final Deque<Integer> stack = new ArrayDeque<>();
    private final Deque<Integer> minStack = new ArrayDeque<>();

    public void push(int val) {
        stack.push(val);
        int curMin = minStack.isEmpty() ? val : Math.min(val, minStack.peek());
        minStack.push(curMin); // track min at this level
    }

    public void pop() {
        stack.pop();
        minStack.pop(); // always pop in sync
    }

    public int top() { return stack.peek(); }

    public int getMin() { return minStack.peek(); } // O(1)
}
```
Sentinel nodes eliminate edge cases. In the LRU Cache, the HEAD and TAIL sentinels mean the list is never truly empty. You never check if (head == null). The remove and insertHead helpers each have exactly 4 pointer assignments—no branches, no special cases.
Let us trace through a full LRU Cache scenario step by step, showing the hash map and linked list side by side at every stage.
Operations: put(1,10), put(2,20), put(3,30), get(2), put(4,40)
| OPERATION | ACTION | LIST (HEAD→TAIL) | MAP SIZE |
|---|---|---|---|
| put(1,10) | Insert at head | [1:10] | 1/3 |
| put(2,20) | Insert at head | [2:20] ↔ [1:10] | 2/3 |
| put(3,30) | Insert at head | [3:30] ↔ [2:20] ↔ [1:10] | 3/3 (full) |
| get(2) | Move to head, returns 20 | [2:20] ↔ [3:30] ↔ [1:10] | 3/3 |
| put(4,40) | Evict key 1, insert 4 at head | [4:40] ↔ [2:20] ↔ [3:30] | 3/3 |
Notice the pattern: Every get and put touches exactly one node in the hash map (O(1) lookup) and performs at most two linked-list operations (remove + insertHead, both O(1) with direct node references). No iteration is ever needed.
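As a sanity check, the same trace can be reproduced with `java.util.LinkedHashMap`, which is built on exactly this hash map + doubly-linked list composite. Passing `accessOrder=true` makes `get` move entries to the most-recent end, and overriding `removeEldestEntry` evicts the LRU entry on overflow (this is the standard-library shortcut, not a substitute for knowing the hand-rolled version in an interview):

```java
import java.util.LinkedHashMap;
import java.util.Map;

class LruDemo {
    // Returns an LRU cache backed by LinkedHashMap. Iteration order is
    // least-recently-used first, so the eldest entry is evicted when
    // the size exceeds capacity.
    static <K, V> Map<K, V> lruCache(int capacity) {
        return new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity;
            }
        };
    }
}
```

Running the trace: after `put(1,10)`, `put(2,20)`, `put(3,30)`, `get(2)`, `put(4,40)` with capacity 3, key 1 is evicted and keys 3, 2, 4 remain, matching the table above.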
These are the classic LeetCode problems that use the data structure design pattern, listed in rough order of difficulty.
Practice tip: Start with #155 Min Stack (direct two-stack pattern) and #705/#706 (understand hash internals). Then tackle #146 LRU Cache (the flagship composite design problem). Once comfortable, try #380 (array + map for O(1) random) and #460 LFU Cache (the ultimate multi-structure composite).
LRU Cache: Hash map lookup is O(1). Linked list remove/insert are O(1) because we have a direct pointer to the node (from the map). No traversal is ever needed. Eviction removes the tail's predecessor in O(1).
Min Stack: Each push/pop does exactly two stack operations (one on the main stack, one on the min stack). getMin() just peeks the min stack. All O(1).
Space: Both structures store at most n elements (where n is the capacity for caches, or the number of elements for stacks). The hash map and linked list share the same nodes (not duplicate copies), so space is O(n).
The space-time tradeoff: We use extra space (a second stack or a hash map) specifically to avoid iteration. Every design pattern problem is fundamentally about trading space for time, buying O(1) operations by storing redundant pointers or metadata.
Combine two structures for O(1) everything. No single data structure gives you fast lookup, fast ordering, and fast insert/delete. Composites like hash map + linked list achieve what no solo structure can.
Use sentinel nodes to eliminate edge cases. Dummy head and tail nodes in a doubly-linked list mean the list is never truly empty. Remove and insert operations become simple, branchless 4-pointer reassignments.
Augmentation means "bolt on a parallel tracker." For Min Stack, the min stack mirrors the main stack. For Max Stack, track the max. The pattern generalizes: any aggregate query can be answered in O(1) by maintaining it alongside the primary structure.
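One way to see the generalization is to parameterize the parallel tracker by the aggregate itself. The sketch below (our illustrative `AggregateStack`, not a standard class) works for any associative operation, so `Math::max` gives a Max Stack and `Math::min` gives back the Min Stack:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.IntBinaryOperator;

// Parallel-tracker stack generalized over the aggregate: the second
// stack stores the aggregate of everything currently on the stack,
// maintained incrementally so queries are O(1).
class AggregateStack {
    private final Deque<Integer> stack = new ArrayDeque<>();
    private final Deque<Integer> aggStack = new ArrayDeque<>();
    private final IntBinaryOperator combine;

    AggregateStack(IntBinaryOperator combine) { this.combine = combine; }

    public void push(int val) {
        int agg = aggStack.isEmpty() ? val
                : combine.applyAsInt(aggStack.peek(), val);
        stack.push(val);
        aggStack.push(agg); // aggregate of all elements at this level
    }

    public void pop() { stack.pop(); aggStack.pop(); }
    public int top() { return stack.peek(); }
    public int getAggregate() { return aggStack.peek(); } // O(1)
}
```

Usage: `new AggregateStack(Math::max)` answers getMax in O(1) with no other changes.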
Store direct node references in the map. The hash map must point directly to the linked list node (not just the key or value). This is what enables O(1) remove: you skip the traversal and go straight to the node to unlink it.
Identify the operations, then choose the composite. List every required operation and its time constraint. Then ask: "Which primitive structures, when layered together, satisfy all constraints?" LRU = map + list. RandomizedSet = map + array. MinStack = stack + stack. The design follows directly from the requirements.
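The map + array composite mentioned above (LeetCode #380) can be sketched as follows; the swap-with-last trick is what keeps removal O(1) despite using an array:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

// RandomizedSet: hash map stores value -> index into the array; the
// array supports O(1) uniform random access. remove() swaps the target
// with the last element so no shifting is ever needed.
class RandomizedSet {
    private final Map<Integer, Integer> indexOf = new HashMap<>();
    private final List<Integer> values = new ArrayList<>();
    private final Random rand = new Random();

    public boolean insert(int val) {
        if (indexOf.containsKey(val)) return false;
        indexOf.put(val, values.size());
        values.add(val);
        return true;
    }

    public boolean remove(int val) {
        Integer i = indexOf.get(val);
        if (i == null) return false;
        int last = values.get(values.size() - 1);
        values.set(i, last);              // overwrite the hole with the last element
        indexOf.put(last, i);             // fix the moved element's index
        values.remove(values.size() - 1); // O(1): removal from the end
        indexOf.remove(val);
        return true;
    }

    public int getRandom() { // uniform over current elements, O(1)
        return values.get(rand.nextInt(values.size()));
    }
}
```

Note the division of labor: the map alone cannot do getRandom, and the array alone cannot do O(1) lookup or remove; each structure covers exactly the operation the other lacks.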