Pick any three from eight algorithms. Quicksort, mergesort, and heapsort are O(n log n); shellsort sits between O(n log n) and O(n²) depending on its gap sequence. Insertion, selection, and bubble sort are O(n²). Radix sort is O(d·n) for d-digit keys. Click an algorithm name to cycle it.
The faster algorithms get there by different strategies: quicksort partitions around a pivot, mergesort divides and conquers with a stable merge step, heapsort maintains a binary heap invariant, and shellsort uses diminishing gap sequences to move elements long distances early (its exact bound depends on the gap sequence). The O(n²) algorithms are simpler but degrade with scale, doing redundant work on every pass.
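The pivot-partitioning step that drives quicksort can be sketched in a few lines. This is a minimal illustration using the Lomuto partition scheme with a last-element pivot; the visualizer's own implementation may choose pivots differently.

```python
def quicksort(a, lo=0, hi=None):
    """Sort a in place by recursively partitioning around a pivot."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)   # sort elements left of the pivot
        quicksort(a, p + 1, hi)   # sort elements right of the pivot

def partition(a, lo, hi):
    """Lomuto partition: move everything < pivot left of it, return pivot index."""
    pivot = a[hi]                 # last element as pivot (a naive choice)
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]     # place pivot in its final position
    return i
```

Note that a fixed first- or last-element pivot is exactly the naive choice that degrades on already-sorted input.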
Radix sort avoids comparisons entirely by distributing elements into buckets digit-by-digit, making it linear in the number of digits times the input size. It only works on integers or fixed-length keys, but when applicable it outperforms comparison-based sorts.
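The digit-by-digit bucket distribution can be sketched as a least-significant-digit (LSD) radix sort over non-negative base-10 integers; the base and digit order here are illustrative assumptions, not necessarily what the visualizer uses.

```python
def radix_sort(nums):
    """LSD radix sort for non-negative integers, base 10."""
    if not nums:
        return nums
    exp = 1
    while max(nums) // exp > 0:
        buckets = [[] for _ in range(10)]
        for n in nums:
            buckets[(n // exp) % 10].append(n)   # stable distribution by current digit
        nums = [n for b in buckets for n in b]   # collect buckets in order
        exp *= 10                                # move to the next digit
    return nums
```

Each pass is O(n), and there are d passes for d-digit keys, giving the O(d·n) total with no element-to-element comparisons.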
Try reversed input with bubble sort vs quicksort, or nearly-sorted with insertion sort. Nearly-sorted input is a best case for insertion sort (nearly linear) but a worst case for naive quicksort that pivots on the first or last element (quadratic without median-of-three pivot selection). Each bar is one element. The counter tracks comparisons and swaps. Same input, different strategies.
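The comparison counter's behavior on nearly-sorted input can be reproduced offline. This sketch instruments insertion sort the same way the visualizer's counter presumably does (the counting convention is an assumption): on already-sorted input it does about n comparisons, on reversed input about n²/2.

```python
def insertion_sort(a):
    """Insertion sort; returns (sorted copy, comparison count)."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1
            if a[j - 1] <= a[j]:
                break                         # element already in place: adaptive exit
            a[j - 1], a[j] = a[j], a[j - 1]   # shift the element one slot left
            j -= 1
    return a, comparisons

# Sorted input: one comparison per element (nearly linear).
# Reversed input: every element walks to the front (quadratic).
```

Running it on `range(100)` versus its reversal makes the gap concrete: 99 comparisons against 4950.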