
Design Patterns

Build custom data structures that combine hash maps, linked lists, and stacks to achieve O(1) operations for complex behaviors.

Roadmap

  1. What Is This Pattern?
  2. When to Use It
  3. Variant 1 — Hash Map + Linked List (LRU Cache)
  4. Variant 2 — Stack Augmentation (Min Stack)
  5. Template Code
  6. Visual Walkthrough
  7. Common Problems
  8. Interactive Demo
  9. Complexity Analysis
  10. Key Takeaways
Step 01

What Is This Pattern?

Design pattern problems ask you to build a custom data structure from scratch that supports a specific set of operations, all in optimal time. The trick is to compose two or more primitive structures (hash map, linked list, stack, array) so their strengths cover each other's weaknesses.

The Core Idea

No single data structure gives you O(1) lookup, O(1) insertion, O(1) deletion, and ordering. But if you combine two structures—one for fast lookup (hash map) and one for ordering (doubly-linked list, stack)—you get both.

Hash Map + Doubly-Linked List = LRU Cache

  HashMap              Doubly-Linked List (MRU → LRU)
  key 1 → node A       HEAD ↔ A(v=10) ↔ B(v=20) ↔ C(v=30) ↔ TAIL
  key 2 → node B       most recent ──────────► least recent (evict first)
  key 3 → node C
💡

Why composites? A hash map gives O(1) get/put by key but has no ordering. A linked list gives O(1) insert/remove at known positions but has no random access. Combining them gives O(1) for every operation: look up the node in the map, then move it in the list.

Step 02

When to Use It

Look for these recognition signals in the problem statement. If you spot one, you are likely facing a data structure design problem.

"Design a data structure" or "Implement a class"
The problem explicitly asks you to build a class with specific methods like get, put, push, pop, insert, remove.
"O(1) time for all operations"
This is the hallmark constraint. A single data structure cannot do it; you need a composite.
"Least recently used" or "eviction policy"
Cache problems need ordering by access time plus fast lookup. Hash map + doubly-linked list.
"Get minimum/maximum in O(1)"
Stack augmentation pattern: maintain a parallel stack tracking the current min or max at each level.

The key clue: The problem requires multiple operations (lookup + ordering, push + getMin, insert + getRandom) each in O(1). Ask yourself: "Which two structures, when combined, cover every operation?" That composite is the answer.

Step 03

Variant 1 — Hash Map + Linked List (LRU Cache)

Cache Design

The LRU Cache supports get(key) and put(key, value) in O(1). A hash map stores key → node pointers. A doubly-linked list orders nodes from most recently used (head) to least recently used (tail). On every access, we move the node to the head. On eviction, we remove the tail.

Walkthrough: capacity = 2

Operations: put(1,A), put(2,B), get(1), put(3,C)

put(1, A)
Cache is empty. Create node A, add to head of list, add key 1 to map.
  List: H ↔ A ↔ T          Map: {1 → A}

put(2, B)
Create node B, add to head. A moves toward tail. Size = 2 (at capacity).
  List: H ↔ B ↔ A ↔ T      Map: {1 → A, 2 → B}

get(1) → returns A
Key 1 found in map. Move node A to head. A is now most recent, B becomes LRU.
  List: H ↔ A ↔ B ↔ T      Map: {1 → A, 2 → B}

put(3, C) — eviction!
Cache full. Evict tail node B, remove key 2 from map. Create node C, add to head.
  List: H ↔ C ↔ A ↔ T      Map: {1 → A, 3 → C}
🎯

Sentinel nodes matter. Using dummy HEAD and TAIL sentinel nodes eliminates all null-checking edge cases. Every real node is always between two other nodes, so remove and insert operations never need special-case logic for empty lists or boundary nodes.

Step 04

Variant 2 — Stack Augmentation (Min Stack)

Stack + Metadata

The Min Stack supports push, pop, top, and getMin all in O(1). The key insight: maintain a second stack (or store pairs) that tracks the running minimum at each level. When we push, we also push the new min. When we pop, we also pop the min.

Example: push(5), push(3), push(7), pop(), getMin()

Two stacks side by side: the main stack and the min stack.

push(5): min = 5
  Main: [5]          Min: [5]

push(3): min = min(3, 5) = 3
  Main: [5, 3]       Min: [5, 3]

push(7): min = min(7, 3) = 3
  Main: [5, 3, 7]    Min: [5, 3, 3]

pop(): removed 7. getMin() = 3
  Main: [5, 3]       Min: [5, 3]
🎯

Choosing the variant: If the problem needs fast lookup + ordering (LRU, LFU, time-based caches), use hash map + linked list. If the problem needs a standard container with an extra aggregate query (getMin, getMax, getRandom), use augmentation—bolting a parallel structure alongside the main one.

Step 05

Template Code

Here are the annotated Java templates for both core variants. Each method is O(1).

LRU Cache (Hash Map + Doubly-Linked List)

import java.util.HashMap;
import java.util.Map;

class Node {                           // doubly-linked list node
    int key, val;
    Node prev, next;
    Node(int key, int val) { this.key = key; this.val = val; }
}

class LRUCache {
    private Map<Integer, Node> map = new HashMap<>();
    private Node head, tail;           // sentinel nodes
    private int capacity;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head.next = tail;              // head ↔ tail
        tail.prev = head;
    }

    public int get(int key) {
        if (!map.containsKey(key)) return -1;
        Node node = map.get(key);
        remove(node);                  // detach from current position
        insertHead(node);              // move to front (most recent)
        return node.val;
    }

    public void put(int key, int val) {
        if (map.containsKey(key)) remove(map.get(key));
        Node node = new Node(key, val);
        map.put(key, node);
        insertHead(node);
        if (map.size() > capacity) {   // over capacity: evict LRU
            Node lru = tail.prev;        // node just before tail sentinel
            remove(lru);
            map.remove(lru.key);
        }
    }

    private void remove(Node n) {     // O(1) unlink
        n.prev.next = n.next;
        n.next.prev = n.prev;
    }

    private void insertHead(Node n) { // O(1) insert after head sentinel
        n.next = head.next;
        n.prev = head;
        head.next.prev = n;
        head.next = n;
    }
}
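As a sanity check on the hand-rolled design, Java's standard library can express the same LRU policy in a few lines: a LinkedHashMap constructed with accessOrder = true keeps entries in recency order, and overriding removeEldestEntry triggers eviction. (LinkedHashLRU is an illustrative name, not a library class — in an interview you are expected to build the map + list composite yourself.)

```java
import java.util.LinkedHashMap;
import java.util.Map;

class LinkedHashLRU<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashLRU(int capacity) {
        super(16, 0.75f, true);   // accessOrder = true: get() moves entries to the back
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently accessed entry
    }
}
```

Internally, LinkedHashMap is exactly the composite from Step 03: a hash table whose entries are also threaded through a doubly-linked list.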

Min Stack (Two-Stack Approach)

import java.util.ArrayDeque;
import java.util.Deque;

class MinStack {
    private Deque<Integer> stack = new ArrayDeque<>();
    private Deque<Integer> minStack = new ArrayDeque<>();

    public void push(int val) {
        stack.push(val);
        int curMin = minStack.isEmpty()
            ? val
            : Math.min(val, minStack.peek());
        minStack.push(curMin);          // track min at this level
    }

    public void pop() {
        stack.pop();
        minStack.pop();                 // always pop in sync
    }

    public int top()    { return stack.peek(); }
    public int getMin() { return minStack.peek(); } // O(1)
}
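Step 04 noted that you can store pairs instead of keeping a second stack. A sketch of that single-stack variant (MinStackPairs is an illustrative name), where each entry carries the value and the minimum as of that level:

```java
import java.util.ArrayDeque;
import java.util.Deque;

class MinStackPairs {
    // Each entry is { value, minSoFar }.
    private final Deque<int[]> stack = new ArrayDeque<>();

    public void push(int val) {
        int min = stack.isEmpty() ? val : Math.min(val, stack.peek()[1]);
        stack.push(new int[] { val, min }); // value and running min travel together
    }

    public void pop()     { stack.pop(); }
    public int top()      { return stack.peek()[0]; }
    public int getMin()   { return stack.peek()[1]; } // O(1)
}
```

The tradeoff is purely cosmetic: one stack of pairs versus two parallel stacks, with the same O(1) bounds either way.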
📝

Sentinel nodes eliminate edge cases. In the LRU Cache, the HEAD and TAIL sentinels mean the list is never truly empty. You never check if (head == null). The remove helper is exactly 2 pointer assignments and insertHead exactly 4—no branches, no special cases.

Step 06

Visual Walkthrough

Let us trace through a full LRU Cache scenario step by step, showing the hash map and linked list side by side at every stage.

LRU Cache: capacity = 3

Operations: put(1,10), put(2,20), put(3,30), get(2), put(4,40)

OPERATION   ACTION                          LIST (HEAD → TAIL)            MAP SIZE
put(1,10)   Insert at head                  [1:10]                        1/3
put(2,20)   Insert at head                  [2:20] ↔ [1:10]               2/3
put(3,30)   Insert at head                  [3:30] ↔ [2:20] ↔ [1:10]      3/3 (full)
get(2)      Move to head; returns 20        [2:20] ↔ [3:30] ↔ [1:10]      3/3
put(4,40)   Evict key 1; insert 4 at head   [4:40] ↔ [2:20] ↔ [3:30]      3/3
Key 1 was evicted because after get(2) moved key 2 to the front, key 1 became the least recently used node (closest to TAIL).
💡

Notice the pattern: Every get and put touches exactly one node in the hash map (O(1) lookup) and performs at most two linked-list operations (remove + insertHead, both O(1) with direct node references). No iteration is ever needed.

Step 07

Common Problems

These are the classic LeetCode problems that use the data structure design pattern, listed in rough order of difficulty.

#155 Min Stack Medium
#705 Design HashSet Easy
#706 Design HashMap Easy
#146 LRU Cache Medium
#380 Insert Delete GetRandom O(1) Medium
#460 LFU Cache Hard
💡

Practice tip: Start with #155 Min Stack (direct two-stack pattern) and #705/#706 (understand hash internals). Then tackle #146 LRU Cache (the flagship composite design problem). Once comfortable, try #380 (array + map for O(1) random) and #460 LFU Cache (the ultimate multi-structure composite).
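For #380, the composite is a hash map plus a dynamic array: the map stores each value's index in the array, and removal swaps the target with the last element so the array deletion stays O(1). A sketch of that design (the class name and method signatures follow the LeetCode problem statement):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

class RandomizedSet {
    private final List<Integer> values = new ArrayList<>();
    private final Map<Integer, Integer> indexOf = new HashMap<>(); // value → index in list
    private final Random rand = new Random();

    public boolean insert(int val) {
        if (indexOf.containsKey(val)) return false;
        indexOf.put(val, values.size());
        values.add(val);                      // O(1) append
        return true;
    }

    public boolean remove(int val) {
        Integer idx = indexOf.remove(val);
        if (idx == null) return false;
        int last = values.remove(values.size() - 1); // O(1) removal from the end
        if (idx < values.size()) {            // removed element wasn't the last one
            values.set(idx, last);            // move the last element into the gap
            indexOf.put(last, idx);
        }
        return true;
    }

    public int getRandom() {                  // arrays give O(1) random access
        return values.get(rand.nextInt(values.size()));
    }
}
```

Note why each piece is needed: the array alone cannot delete in O(1) without knowing the index (that's the map's job), and the map alone cannot pick a uniformly random element (that's the array's job).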

Step 08

Interactive Demo

Set a capacity and perform get/put operations on an LRU Cache. Watch the hash map and doubly-linked list update in real time. See eviction happen when the cache exceeds capacity.

⚙ LRU Cache Visualizer
Step 09

Complexity Analysis

Time (per op): O(1)
Space: O(capacity)

Why O(1) Per Operation?

LRU Cache: Hash map lookup is O(1). Linked list remove/insert are O(1) because we have a direct pointer to the node (from the map). No traversal is ever needed. Eviction removes the tail's predecessor in O(1).

Min Stack: Each push/pop does exactly two stack operations (one on the main stack, one on the min stack). getMin() just peeks the min stack. All O(1).

Space: Both structures store at most n elements (where n is the capacity for caches, or the number of elements for stacks). The hash map and linked list share the same nodes (not duplicate copies), so space is O(n).

The space-time tradeoff: We use extra space (a second stack or a hash map) specifically to avoid iteration. Every design pattern problem is fundamentally about trading space for time, buying O(1) operations by storing redundant pointers or metadata.
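To see what the extra space buys, compare against a stack with no augmentation (NaiveMinStack is a hypothetical counterexample, not part of the pattern): without the parallel min stack, getMin must scan every element, degrading from O(1) to O(n).

```java
import java.util.ArrayDeque;
import java.util.Deque;

class NaiveMinStack {
    private final Deque<Integer> stack = new ArrayDeque<>();

    public void push(int val) { stack.push(val); }
    public void pop()         { stack.pop(); }

    public int getMin() {                 // O(n): iterates the whole stack
        int min = Integer.MAX_VALUE;
        for (int v : stack) min = Math.min(min, v);
        return min;
    }
}
```

Same answers, no extra space—but every getMin now costs a full traversal, which is exactly the iteration the augmented version pays memory to avoid.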

Step 10

Key Takeaways

01

Combine two structures for O(1) everything. No single data structure gives you fast lookup, fast ordering, and fast insert/delete. Composites like hash map + linked list achieve what no solo structure can.

02

Use sentinel nodes to eliminate edge cases. Dummy head and tail nodes in a doubly-linked list mean the list is never truly empty. Remove and insert operations become simple, branchless 4-pointer reassignments.

03

Augmentation means "bolt on a parallel tracker." For Min Stack, the min stack mirrors the main stack. For Max Stack, track the max. The pattern generalizes: any aggregate query can be answered in O(1) by maintaining it alongside the primary structure.

04

Store direct node references in the map. The hash map must point directly to the linked list node (not just the key or value). This is what enables O(1) remove: you skip the traversal and go straight to the node to unlink it.

05

Identify the operations, then choose the composite. List every required operation and its time constraint. Then ask: "Which primitive structures, when layered together, satisfy all constraints?" LRU = map + list. RandomizedSet = map + array. MinStack = stack + stack. The design follows directly from the requirements.