Linked List Advanced Techniques

TL;DR

Fast-slow pointers detect cycles, find middle elements, and check palindromes efficiently. Partitioning rearranges nodes around pivot values. Sublist reversal reverses a portion of a larger list in place. An LRU cache combines a hash map and a doubly-linked list for O(1) access and eviction. These advanced techniques build on basic operations to handle sophisticated problems elegantly.

Core Concepts

Fast-Slow Pointers Advanced Patterns

Find middle element: Slow pointer moves 1 step per iteration, fast moves 2. When fast reaches the end, slow is at the middle. Used to split a list in half, e.g., for merge sort.

Palindrome check: Find middle, reverse second half, compare. O(n) time, O(1) space (pointer rearrangement).

Cycle detection extended: After detecting cycle with fast-slow meeting, reset slow to head. Move both 1 step; meeting point is cycle start.

Nth from end: Place pointers n apart. Move both until fast reaches end. Slow points to nth from end.
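
A minimal sketch of the one-pass, two-pointer version (assuming the ListNode class defined in the Practical Example below; nth_from_end is an illustrative name):

def nth_from_end(head, n):
    """Return the nth node from the end (1-indexed), or None if n exceeds the length."""
    fast = head
    for _ in range(n):                   # put fast n nodes ahead of slow
        if not fast:
            return None
        fast = fast.next
    slow = head
    while fast:                          # advance both until fast runs off the end
        slow = slow.next
        fast = fast.next
    return slow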

List Partitioning and Rearrangement

Partition by value: Split the nodes into three lists (< value, == value, > value), then splice them together. Maintains relative order. O(n) time; O(1) extra space if existing nodes are relinked rather than copied.

Odd-even partition: Separate nodes at odd positions from even positions, maintaining order within each group.
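
A minimal sketch of odd-even partitioning (assuming the ListNode class from the Practical Example below; odd_even_partition is an illustrative name):

def odd_even_partition(head):
    """Group nodes at odd positions (1st, 3rd, ...) before nodes at even positions."""
    if not head or not head.next:
        return head
    odd = head
    even = even_head = head.next
    while even and even.next:
        odd.next = even.next             # link the next odd-position node
        odd = odd.next
        even.next = odd.next             # link the next even-position node
        even = even.next
    odd.next = even_head                 # append the even-position list
    return head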

Reverse sublist: Reverse only portion between positions m and n. Detach, reverse, reattach.

Palindrome Checking

Algorithm: Find middle, reverse second half, compare with first half. O(n) time, O(1) space.

Restore after check: Reverse the second half again and reattach it if the caller expects the original list structure.

LRU Cache Implementation

Data structures: Hash map for O(1) lookup by key, doubly-linked list for O(1) insertion/deletion and maintaining order.

Operations:

  • Get: Hash map lookup O(1), move to front of list O(1)
  • Put: Hash map update O(1), move to front O(1), evict LRU O(1)

Key insight: The combination makes every operation O(1); a hash map alone cannot maintain recency order efficiently, and a list alone cannot look up keys in O(1).
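
As a quick illustration of why ordering matters, here is a minimal LRU sketch using Python's collections.OrderedDict, which internally pairs a hash table with a doubly-linked list; the OrderedDictLRU name and its API are illustrative, not from the original text:

from collections import OrderedDict

class OrderedDictLRU:
    """Minimal LRU sketch: OrderedDict supplies hash lookup plus recency order."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)          # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used entry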

Key Algorithms

Middle Finder

Fast-slow pointers. O(n) time, O(1) space.

List Palindrome Checker

Find middle, reverse second half, compare. Restore if needed.

LRU Cache

Hash map + doubly-linked list. All operations O(1).

Partition List

Split nodes into sublists by value range, then splice. O(n) time, O(1) extra space when existing nodes are relinked rather than copied.

Detailed Technique Explanations

Fast-Slow Pointers in Depth

The fast-slow pointer technique is elegant and applies to multiple problems. The core idea: move one pointer twice as fast as another. They'll "collide" at meaningful points.

Finding the Middle:

  • If list has odd length (1,3,5,7), slow stops at center: [1,2,3,4,5]
  • If list has even length (2,4,6), slow stops at first of second half: [1,2,3,4]
  • Use case: split list for merge sort
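
As a sketch of the merge-sort use case above (assuming the ListNode class from the Practical Example below; split_in_half is an illustrative name):

def split_in_half(head):
    """Split the list into (first_half, second_half) at the middle node."""
    if not head or not head.next:
        return head, None
    prev = None
    slow = fast = head
    while fast and fast.next:
        prev = slow
        slow = slow.next
        fast = fast.next.next
    prev.next = None      # cut the list just before the middle
    return head, slow     # the second half starts at the middle node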

Cycle Detection:

  • Start both at head
  • Fast moves 2 steps, slow moves 1
  • If they meet, cycle exists
  • To find cycle start: reset slow to head, move both 1 step; meeting point is cycle start
  • Why? Let a be the distance from head to cycle start and b the distance from cycle start to the meeting point. Slow has traveled a + b and fast 2(a + b), so a + b is a multiple of the cycle length; hence a equals the distance from the meeting point forward to the cycle start (plus whole laps), and the two pointers meet exactly there
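
A minimal sketch of the full procedure (assuming the ListNode class from the Practical Example below; find_cycle_start is an illustrative name):

def find_cycle_start(head):
    """Return the node where the cycle begins, or None if the list is acyclic."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:                 # pointers met inside the cycle
            slow = head                  # phase 2: reset slow to the head
            while slow is not fast:
                slow = slow.next
                fast = fast.next
            return slow                  # both now stand on the cycle start
    return None                          # fast fell off the end: no cycle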

Palindrome Checking:

  • Find middle
  • Reverse second half in-place
  • Compare first half with reversed second half
  • Space complexity: O(1) extra space; restoring the list afterward just means reversing the second half again, which adds another pass but no additional memory

Partitioning Patterns

Classic Partition (like Quicksort): Rearrange so elements less than x come before elements >= x.

Input: 3 -> 5 -> 8 -> 5 -> 10 -> 2 -> 1 (partition by x=5)
Output: 3 -> 2 -> 1 -> 5 -> 8 -> 5 -> 10
(less than 5) -> (greater than or equal)

Implementation: Build a "less than" list and a "greater or equal" list in one pass, then splice them together.

LRU Cache Deep Dive

LRU (Least Recently Used) cache is a perfect interview problem combining multiple data structures.

Core Challenge: Every operation must be O(1)

  • Get: find entry (O(1)) and mark as recently used (O(1))
  • Put: add entry (O(1)), remove oldest if full (O(1))

The Trick: Hash map for O(1) lookup, doubly-linked list for O(1) reordering

  • Get(key): Hash map lookup finds node, move node to front (O(1) with doubly-linked list)
  • Put(key, val): Create new node, add to front, update hash map, evict LRU from tail if needed

Practical Example

class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

# Find middle element
def find_middle(head):
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    return slow

# Check if palindrome
def is_palindrome(head):
    if not head or not head.next:
        return True

    # Find middle
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next

    # Reverse second half
    prev = None
    while slow:
        next_temp = slow.next
        slow.next = prev
        prev = slow
        slow = next_temp

    # Compare
    left, right = head, prev
    while right:
        if left.val != right.val:
            return False
        left = left.next
        right = right.next
    return True

# Partition list by value
def partition(head, x):
    less_dummy = ListNode(0)
    greater_dummy = ListNode(0)
    less = less_dummy
    greater = greater_dummy

    while head:
        if head.val < x:
            less.next = head
            less = less.next
        else:
            greater.next = head
            greater = greater.next
        head = head.next

    greater.next = None                  # null-terminate to avoid a cycle
    less.next = greater_dummy.next       # splice the two lists together
    return less_dummy.next

# LRU Cache
class LRUCache:
    def __init__(self, capacity):
        self.cache = {}                      # key -> node
        self.capacity = capacity
        # Sentinel head/tail simplify insertion and removal at the ends.
        # ListNode gains prev and key attributes dynamically (Python allows this).
        self.head = ListNode(0)
        self.tail = ListNode(0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        prev = node.prev
        next_node = node.next
        prev.next = next_node
        next_node.prev = prev

    def _add_to_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.cache:
            return -1
        node = self.cache[key]
        self._remove(node)
        self._add_to_front(node)             # mark as most recently used
        return node.val

    def put(self, key, value):
        if key in self.cache:
            self._remove(self.cache[key])
        elif len(self.cache) == self.capacity:
            lru = self.tail.prev             # least recently used node
            self._remove(lru)
            del self.cache[lru.key]

        node = ListNode(value)
        node.key = key                       # store the key for eviction lookup
        self.cache[key] = node
        self._add_to_front(node)

lru = LRUCache(2)
lru.put(1, 1)
lru.put(2, 2)
print(lru.get(1)) # 1

Common Pitfalls

warning

Not handling odd/even list lengths: Middle finder behaves differently. For even length, middle is first of second half. Handle both cases.

Modifying list structure without cleanup: Partition uses dummy head nodes and relinks existing nodes. Make sure the tail of the second list is null-terminated, or the result can contain a cycle.

LRU capacity off-by-one: Evict when size == capacity BEFORE inserting, not after. Otherwise exceeds capacity.

Not restoring list after palindrome check: List is reversed after check. Restore if caller expects original structure.

Missing pointer updates: Any rearrangement requires updating multiple pointers. One missing update breaks structure.

Interview Patterns and Variations

Pattern: Dummy Node

Always use a dummy node pointing to the head for partition and reversal operations. This simplifies edge cases:

dummy = ListNode(0)
dummy.next = head
# Now you can safely manipulate dummy.next without special-casing the head
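
As one example, a hypothetical remove_values helper shows how the dummy node removes any special-casing when the head itself must be deleted:

def remove_values(head, target):
    """Delete every node whose value equals target, including the head."""
    dummy = ListNode(0)
    dummy.next = head
    current = dummy
    while current.next:
        if current.next.val == target:
            current.next = current.next.next    # unlink the matching node
        else:
            current = current.next
    return dummy.next                           # new head (may differ from the original)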

Pattern: Two-Pass Algorithms

Many linked list problems require two passes:

  1. First pass: gather information (size, middle, cycle info)
  2. Second pass: manipulate based on information

Example: finding the nth node from the end with the counting approach takes one pass to measure the length and a second pass to walk to position length - n. (Partition, by contrast, separates in a single pass and finishes with a constant-time splice.)
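
A minimal sketch of that counting approach (assuming the ListNode class from the Practical Example; nth_from_end_two_pass is an illustrative name):

def nth_from_end_two_pass(head, n):
    """Two passes: measure the length, then walk to the target node."""
    length = 0
    node = head
    while node:                      # pass 1: count the nodes
        length += 1
        node = node.next
    if n > length:
        return None
    node = head
    for _ in range(length - n):      # pass 2: advance to position length - n
        node = node.next
    return node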

Variation: Reverse Within Range

Instead of reversing the entire list, reverse only nodes between positions m and n.

Original: 1 -> 2 -> 3 -> 4 -> 5
Reverse(2, 4): 1 -> 4 -> 3 -> 2 -> 5
(reverse positions 2-4)

Approach:

  1. Find node before position m
  2. Reverse m to n
  3. Connect back to original list
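
A minimal sketch of this approach (assuming the ListNode class from the Practical Example and 1-indexed positions; reverse_between is an illustrative name):

def reverse_between(head, m, n):
    """Reverse the nodes from position m to position n (1-indexed, m <= n)."""
    dummy = ListNode(0)
    dummy.next = head
    before = dummy
    for _ in range(m - 1):           # step 1: find the node just before position m
        before = before.next
    prev = None
    current = before.next            # first node of the sublist
    for _ in range(n - m + 1):       # step 2: reverse positions m..n
        next_temp = current.next
        current.next = prev
        prev = current
        current = next_temp
    before.next.next = current       # step 3: old sublist head -> node after position n
    before.next = prev               #         node before m -> new sublist head
    return dummy.next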

Advanced Applications

Skip List (Fast Search in Sorted Linked List)

Standard linked list: O(n) search. Skip list adds "express lanes":

Level 2:  1 --------> 5 --------> 9 (fewer nodes)
Level 1: 1 -> 2 -> 3 -> 4 -> 5 -> 6 -> 7 -> 8 -> 9 (all nodes)

Search 7: jump from 1 to 5 on level 2, then step to 6 and 7 on level 1. Total: 3 hops vs 6 hops in the standard list.
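
A minimal two-level sketch mirroring the diagram (SkipNode and skip_search are illustrative names; a real skip list assigns levels randomly and supports insertion):

class SkipNode:
    def __init__(self, val):
        self.val = val
        self.next = None       # level 1: every node
        self.express = None    # level 2: occasional "express lane" node

def skip_search(head, target):
    """Ride the express lane as far as possible, then finish on the local lane."""
    node = head
    while node.express and node.express.val <= target:
        node = node.express
    while node and node.val < target:
        node = node.next
    return node if node and node.val == target else None

# Build the diagram's list: 1..9 on level 1, express stops at 1 -> 5 -> 9.
nodes = [SkipNode(v) for v in range(1, 10)]
for a, b in zip(nodes, nodes[1:]):
    a.next = b
nodes[0].express = nodes[4]
nodes[4].express = nodes[8]
print(skip_search(nodes[0], 7).val)  # 7, reached in 3 hops: 1 -> 5 -> 6 -> 7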

LRU Cache with Doubly-Linked List

The "move to front" operation requires removing from middle and adding to front. Only doubly-linked list does this in O(1).

class DNode:
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.prev = None
        self.next = None

# Helper methods of the LRUCache class shown earlier (self.head is its front sentinel):
def _remove(self, node):
    # O(1) removal requires the previous pointer
    node.prev.next = node.next
    node.next.prev = node.prev

def _add_to_front(self, node):
    # O(1) insertion at the front requires the head sentinel
    node.next = self.head.next
    self.head.next.prev = node
    self.head.next = node
    node.prev = self.head

Self-Check

  1. How does LRU cache achieve O(1) for all operations? What would break if only hash map used?

    • Answer: Hash map finds entries fast, but doesn't maintain order. Doubly-linked list maintains LRU order and allows O(1) reordering.
  2. Explain why finding middle with fast-slow pointers works. What's the meeting point for even/odd lengths?

    • Answer: Fast pointer moves 2x speed of slow. When fast reaches end, slow is halfway. For odd: dead center. For even: first of second half.
  3. Why does palindrome check require reversal? How would you check without modifying list?

    • Answer: Reversal enables two-pointer comparison from both ends. Without modifying: recursion with stack (compare during unwind) or space to store first half.
  4. What's the space complexity of LRU cache Get vs Put operations?

    • Answer: Both O(1) space—just hash map lookups and pointer manipulations. No temporary data structures.
  5. How would you detect the start of a cycle in O(n) time and O(1) space?

    • Answer: After meeting point found, reset slow to head, move both 1 step. They'll meet at cycle start.

One Takeaway

info

Advanced linked list techniques combine basic operations (traversal, reversal, partition) with clever pointer patterns. Fast-slow pointers solve multiple problems. LRU cache demonstrates how combining data structures (hash map + doubly-linked list) enables efficient solutions. Master the foundation, then recognize when to combine techniques creatively.
