07 · Heap & Priority Queue

Binary max-heap · same data, two views (tree above, array below) · plus heapsort vs quicksort.

Heap operations insert · extract-max · O(log n)

Heapsort vs Quicksort

Same starting array (28 random values), two algorithms in their own panels. Both finish in O(n log n) on average — but heapsort gives a worst-case guarantee, while quicksort is faster on typical inputs at the cost of an O(n²) worst case. The "ops" counter counts comparison and swap events; it's a fair efficiency proxy that doesn't depend on wall-clock time or browser scheduling.

What you're watching

A binary heap is a complete binary tree in which every parent is ≥ both of its children (max-heap). The clever bit: because the tree is complete, you can store it in an array with no pointers — node i's children sit at 2i + 1 and 2i + 2, and its parent sits at ⌊(i − 1) / 2⌋ (0-based indices). The two views above are the same data, just drawn differently.
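A minimal sketch of that index arithmetic in Python (function names are mine, not the visualizer's):

```python
def parent(i):
    # Parent of node i in a 0-based array-backed heap.
    return (i - 1) // 2

def left(i):
    # Left child of node i.
    return 2 * i + 1

def right(i):
    # Right child of node i.
    return 2 * i + 2

# A valid max-heap, drawn flat: every parent >= its children.
heap = [9, 7, 8, 3, 5]
assert all(heap[parent(i)] >= heap[i] for i in range(1, len(heap)))
assert left(0) == 1 and right(0) == 2
```

Note there is no "null pointer" anywhere: completeness guarantees the array has no holes, which is exactly why the formulas work.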

Insert (sift-up) and extract-max (sift-down)

Insert   O(log n)

Append the new value at the end of the array (= bottom-right of the tree). Then "sift up": compare it to its parent; if it's larger, swap; repeat until it's in place or it reaches the root. At most ⌈log₂ n⌉ swaps because each swap moves the value up one level.
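The insert path above can be sketched in a few lines of Python (a hedged illustration, not the page's actual code):

```python
def sift_up(heap, i):
    # Bubble heap[i] up while it is larger than its parent (max-heap).
    while i > 0:
        p = (i - 1) // 2
        if heap[i] <= heap[p]:
            break                       # heap property restored
        heap[i], heap[p] = heap[p], heap[i]
        i = p                           # one level up per swap

def insert(heap, value):
    heap.append(value)                  # bottom-right slot of the tree
    sift_up(heap, len(heap) - 1)

h = []
for v in [3, 9, 5, 7]:
    insert(h, v)
assert h[0] == 9                        # max is always at the root
```

The loop body runs at most once per level, which is where the ⌈log₂ n⌉ bound comes from.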

Extract-max   O(log n)

The root holds the max. Remove it; move the very last element into the root slot; then "sift down": compare it to its children and swap it with the larger child whenever that child is bigger; repeat until neither child is larger and the heap property holds again.
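Extract-max and sift-down, sketched under the same assumptions (illustrative Python, names mine):

```python
def sift_down(heap, i, size):
    # Push heap[i] down within heap[:size], always swapping
    # with the larger child, until the heap property holds.
    while True:
        l, r, largest = 2 * i + 1, 2 * i + 2, i
        if l < size and heap[l] > heap[largest]:
            largest = l
        if r < size and heap[r] > heap[largest]:
            largest = r
        if largest == i:
            return                      # neither child is larger
        heap[i], heap[largest] = heap[largest], heap[i]
        i = largest

def extract_max(heap):
    top = heap[0]
    heap[0] = heap[-1]                  # last element into the root slot
    heap.pop()
    if heap:
        sift_down(heap, 0, len(heap))
    return top

h = [9, 7, 5, 3]
assert extract_max(h) == 9
assert h[0] == 7                        # next-largest rises to the root
```

Taking the `size` bound as a parameter (instead of always `len(heap)`) is what lets the same routine drive heapsort's shrinking-heap phase later.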

SUBTLE BUT IMPORTANT
Building a heap from scratch (heapify) is O(n), not O(n log n). The bottom-up construction starts at index ⌊n/2⌋ − 1 (the last non-leaf) and sifts down every non-leaf, working back to the root. Most nodes sit near the bottom, where sift-down is cheap; the few nodes near the top are the only ones that can travel far. Summing (nodes at height h) × h over all heights gives a series that converges to O(n). This is the foundation under heapsort's first phase.

Heapsort step-by-step

  1. Heapify the input array in O(n) (the build-heap step you don't see directly — happens before the first visual swap).
  2. Swap arr[0] (the max) with arr[n − 1]. Now the largest is in its final position; the heap excludes that index.
  3. Sift-down arr[0] within the now-smaller heap. This restores the max-heap property and the new root is the next-largest.
  4. Repeat steps 2–3 with a progressively smaller heap range. After n − 1 iterations, the array is sorted in place.
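The four steps above fit in one short function — a hedged Python sketch of in-place heapsort, not the panel's actual implementation:

```python
def heapsort(arr):
    # In-place heapsort: build a max-heap, then repeatedly
    # swap the max to the end and shrink the heap range.
    def sift_down(i, size):
        while True:
            l, r, largest = 2 * i + 1, 2 * i + 2, i
            if l < size and arr[l] > arr[largest]:
                largest = l
            if r < size and arr[r] > arr[largest]:
                largest = r
            if largest == i:
                return
            arr[i], arr[largest] = arr[largest], arr[i]
            i = largest

    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):     # step 1: heapify, O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):         # steps 2–4
        arr[0], arr[end] = arr[end], arr[0] # max into its final slot
        sift_down(0, end)                   # restore heap on the shrunk range

import random
data = [random.randrange(100) for _ in range(28)]  # mirrors the 28-value demo
heapsort(data)
assert data == sorted(data)
```

Note that nothing is allocated beyond a few indices: the sorted suffix and the shrinking heap share the one array, which is heapsort's O(1) extra-space selling point.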
WHY HEAPSORT WHEN QUICKSORT EXISTS
Quicksort's average case is faster (better cache behaviour, smaller constant factors). Heapsort's draw is the worst-case guarantee: O(n log n) regardless of input, no adversarial-input pathology. The Linux kernel's generic in-kernel sort (lib/sort.c) is a heapsort for exactly this reason, and several language standard libraries ship introsort — quicksort that switches to heapsort when recursion goes too deep — as their "fall-back when quicksort goes pathological" branch.

What I learned writing this
