Sorting Algorithms Visualized: From Bubble to Quicksort

Simulator · Beginner · ~8 min
Bubble sort: ~1,225 comparisons for 50 elements

Bubble sort on 50 random elements requires about n²/2 = 1,225 comparisons. Merge sort would need only about 282 (n log₂ n) — more than 4× fewer.
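These figures follow directly from the formulas; a quick sanity check in Python:

```python
import math

n = 50
bubble = n * (n - 1) // 2        # pairwise comparisons: n(n-1)/2 = 1225
merge = round(n * math.log2(n))  # divide-and-conquer: ~n log2(n) = 282
print(bubble, merge, round(bubble / merge, 1))  # 1225 282 4.3
```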

Formula

Bubble/Selection/Insertion worst case: T(n) = O(n²)
Merge/Quick/Heap average case: T(n) = O(n log n)
Lower bound: Ω(n log n) for comparison-based sorting

The Fundamental Problem of Computer Science

Sorting — arranging elements in order — is among the most studied problems in computer science. A classic estimate from the mainframe era held that roughly a quarter of all computing cycles were spent sorting data. From database queries to rendering graphics, from search engines to DNA sequencing, efficient sorting is essential. The quest to sort faster has produced some of the most elegant algorithms ever devised.

O(n²) vs O(n log n)

The simplest sorting algorithms — bubble sort, selection sort, insertion sort — compare elements pairwise and require O(n²) operations. This means doubling the input quadruples the time. Advanced algorithms like merge sort, quicksort, and heapsort achieve O(n log n) by dividing the problem into subproblems. For 1,000 elements, that's roughly 10,000 vs 1,000,000 operations — a 100× difference.
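The divide-and-conquer idea behind the O(n log n) family is easiest to see in a minimal merge sort (a sketch for illustration, not a production implementation):

```python
def merge_sort(a):
    """Split in half, sort each half recursively, then merge: O(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves in a single linear pass.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```

Each level of recursion does O(n) merging work, and there are log₂ n levels, which is where the n log n total comes from.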

Watching the Algorithms Work

The visualization shows array elements as vertical bars, with height proportional to value. Red highlights show the elements currently being compared, green shows elements being swapped, and cyan marks the sorted portion. Watch how bubble sort methodically pushes large elements to the right, while quicksort rapidly partitions the array around pivots. The difference in speed is dramatic even at small sizes.

Choosing the Right Algorithm

No single algorithm is best for all situations. Quicksort is fastest on average but degrades to O(n²) on already-sorted data if a naive pivot (such as the first or last element) is chosen. Merge sort is reliably O(n log n) but requires O(n) extra memory. Insertion sort excels on small or nearly-sorted arrays. Real-world implementations like Timsort (Python, Java) combine these strategies, using insertion sort for small runs and merge sort to combine them.
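A toy hybrid in the spirit of Timsort — insertion sort on fixed-size runs, then pairwise merging — might look like the sketch below. The names RUN, hybrid_sort, and merge are illustrative, not Timsort's actual machinery:

```python
RUN = 32  # cutoff below which insertion sort wins; a typical Timsort min-run size

def insertion_sort(a, lo, hi):
    """Sort a[lo:hi] in place — fast when the slice is small or nearly sorted."""
    for i in range(lo + 1, hi):
        key, j = a[i], i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def merge(left, right):
    """Merge two sorted lists in one linear pass."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def hybrid_sort(a):
    """Insertion-sort RUN-sized chunks, then merge chunks of doubling width."""
    n = len(a)
    for lo in range(0, n, RUN):
        insertion_sort(a, lo, min(lo + RUN, n))
    width = RUN
    while width < n:
        for lo in range(0, n, 2 * width):
            mid, hi = min(lo + width, n), min(lo + 2 * width, n)
            a[lo:hi] = merge(a[lo:mid], a[mid:hi])
        width *= 2
    return a
```

Real Timsort additionally detects natural runs already present in the data and uses galloping merges, but the division of labor — insertion sort for the small scale, merging for the large scale — is the same.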

FAQ

Why is bubble sort so slow?

Bubble sort compares every pair of adjacent elements and swaps them if they're in the wrong order, repeating until the array is sorted. This requires O(n²) comparisons in the average and worst case. Each pass only guarantees one element reaches its final position.
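Counting comparisons makes the quadratic cost concrete; this sketch counts every adjacent-pair comparison bubble sort performs:

```python
def bubble_sort(a):
    """Return (sorted copy, number of adjacent-pair comparisons performed)."""
    a = list(a)
    comparisons = 0
    for i in range(len(a) - 1):
        # After pass i, the largest remaining element is in its final position.
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

_, count = bubble_sort(list(range(50, 0, -1)))
print(count)  # 1225 — fixed by the loop structure, independent of the input order
```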

Which sorting algorithm is fastest?

For general-purpose sorting, quicksort is typically fastest in practice thanks to its cache-friendly access pattern and low constant factors, with O(n log n) average time. Merge sort guarantees O(n log n) in the worst case. For nearly-sorted data, insertion sort runs in O(n) time. The best choice depends on the data's characteristics.

What does it mean for a sort to be stable?

A stable sort preserves the relative order of elements with equal values. Merge sort, insertion sort, and bubble sort are stable. Quicksort and heapsort are not stable by default. Stability matters when sorting by multiple criteria sequentially.
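Python's built-in sorted() uses Timsort and is therefore stable, which makes the multi-criteria case easy to demonstrate:

```python
# Records as (name, grade); names are already in their original input order.
students = [("Ada", "B"), ("Bob", "A"), ("Cy", "B"), ("Di", "A")]

# Sorting by grade keeps the original order among equal grades: Bob before Di,
# Ada before Cy — that preservation is exactly what stability means.
by_grade = sorted(students, key=lambda s: s[1])
print(by_grade)  # [('Bob', 'A'), ('Di', 'A'), ('Ada', 'B'), ('Cy', 'B')]
```

Sorting sequentially — by the secondary key first, then stably by the primary key — yields records ordered by primary key with ties broken by the secondary key.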

Why can't we sort faster than O(n log n)?

Any comparison-based sorting algorithm requires Ω(n log n) comparisons in the worst case. There are n! possible orderings of the input, and each comparison can, at best, rule out half of the remaining candidates, so at least log₂(n!) ≈ n log₂ n comparisons are needed to narrow down to one. Non-comparison sorts (radix, counting) can beat this bound because they inspect keys directly instead of comparing elements.
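The bound log₂(n!) can be checked numerically; for n = 1000 it lands in the same order of magnitude as n log₂ n, just as the approximation promises:

```python
import math

n = 1000
log2_factorial = sum(math.log2(k) for k in range(2, n + 1))  # exact log2(n!)
n_log_n = n * math.log2(n)                                   # the approximation
print(round(log2_factorial), round(n_log_n))  # both on the order of 10^4
```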
