Sorting Algorithm Visualizer -- Watch 10 Algorithms Sort Step by Step

Animated visualizations of bubble sort, merge sort, quick sort, and more

Sorting Algorithm Visualizer

Select an algorithm, adjust the array size and speed, then hit Play to watch it sort in real time. Each bar represents a value. Colors show what the algorithm is doing at each step.


All 10 Algorithms at a Glance

Quick reference for time complexity, space usage, and when to use each algorithm.

Bubble Sort

Repeatedly swaps adjacent elements that are out of order. Simple but slow. Good only for tiny arrays or nearly sorted data.

O(n^2) avg Stable O(1) space
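The passes described above can be sketched in a few lines (Python for brevity; the visualizer itself runs JavaScript). The early-exit flag is what makes bubble sort adaptive on nearly sorted input:

```python
def bubble_sort(a):
    """Sort the list a in place; early exit gives O(n) on sorted input."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):       # the last i elements are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                  # a full pass with no swaps: done
            break
    return a
```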

Selection Sort

Finds the minimum element and places it at the front, then repeats for the rest. Always O(n^2) regardless of input order.

O(n^2) avg Unstable O(1) space
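A minimal sketch; the long-range swap on the last line of the loop is what makes selection sort unstable:

```python
def selection_sort(a):
    """Sort the list a in place by repeatedly selecting the minimum."""
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):        # scan the unsorted suffix for the minimum
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]          # long-range swap: can reorder equal keys
    return a
```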

Insertion Sort

Builds a sorted portion one element at a time by inserting each into its correct position. Efficient for small or nearly sorted arrays.

O(n^2) avg Stable O(1) space
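The insert step can be sketched as shifting larger elements right until the gap for the new key opens up:

```python
def insertion_sort(a):
    """Sort the list a in place by growing a sorted prefix."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:     # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                   # drop the key into the opened gap
    return a
```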

Merge Sort

Divides the array in half, recursively sorts each half, then merges them. Guaranteed O(n log n) but requires extra space.

O(n log n) avg Stable O(n) space
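The divide-and-merge structure, sketched with list slicing for clarity (a production version would merge into a preallocated buffer instead of allocating per call):

```python
def merge_sort(a):
    """Return a new sorted list; stable because ties take the left element."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps equal keys in original order
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```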

Quick Sort

Picks a pivot, partitions elements around it, and recurses on each partition. Fast in practice, but worst case is O(n^2).

O(n log n) avg Unstable O(log n) space
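A sketch using the Lomuto partition scheme with the last element as pivot (the same choice this visualizer animates; other pivot strategies change only the partition step):

```python
def quick_sort(a, lo=0, hi=None):
    """Sort the list a in place via recursive partitioning."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quick_sort(a, lo, p - 1)
        quick_sort(a, p + 1, hi)
    return a

def partition(a, lo, hi):
    pivot = a[hi]                        # last element as pivot (Lomuto scheme)
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:                 # move smaller elements before boundary i
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]            # place pivot at its final position
    return i
```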

Heap Sort

Builds a max heap, then repeatedly extracts the maximum. Guaranteed O(n log n) and in-place, but not stable.

O(n log n) avg Unstable O(1) space
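A sketch with an iterative sift-down helper; the same routine both builds the heap and restores it after each extraction:

```python
def heap_sort(a):
    """Sort the list a in place: build a max heap, then extract the max n times."""
    n = len(a)

    def sift_down(root, end):
        while (child := 2 * root + 1) < end:
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1               # pick the larger child
            if a[root] >= a[child]:
                return                   # heap property restored
            a[root], a[child] = a[child], a[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):  # heapify from the last internal node
        sift_down(i, n)
    for end in range(n - 1, 0, -1):      # move current max to the sorted suffix
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```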

Shell Sort

Generalization of insertion sort that compares elements far apart, then progressively reduces the gap. Faster than plain insertion sort.

O(n^1.3) avg Unstable O(1) space
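A sketch using the simple gap-halving sequence (other sequences, such as Ciura's, give better constants in practice):

```python
def shell_sort(a):
    """Sort the list a in place with gapped insertion sort, halving the gap."""
    n = len(a)
    gap = n // 2
    while gap > 0:
        for i in range(gap, n):          # insertion sort over elements gap apart
            key, j = a[i], i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 2                        # final pass (gap == 1) is plain insertion sort
    return a
```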

Counting Sort

Counts occurrences of each value, then reconstructs the sorted array. Linear time, but only practical for integers in a small known range.

O(n+k) avg Stable O(k) space
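A histogram-based sketch for plain non-negative integers; when sorting records by an integer key, a prefix-sum variant is used instead so the output stays stable:

```python
def counting_sort(a):
    """Return a sorted copy of a list of non-negative integers."""
    if not a:
        return a
    counts = [0] * (max(a) + 1)          # one counter per possible value (size k)
    for x in a:
        counts[x] += 1
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)      # emit each value count times, in order
    return out
```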

Radix Sort

Sorts digit by digit from least significant to most significant using counting sort as a subroutine. Linear for fixed-width integers.

O(d*n) avg Stable O(n+k) space
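An LSD sketch in base 10, using stable bucket passes in place of an explicit counting sort subroutine:

```python
def radix_sort(a):
    """Return a sorted copy of a list of non-negative integers (LSD, base 10)."""
    if not a:
        return a
    exp = 1
    while max(a) // exp > 0:             # one pass per digit, least significant first
        buckets = [[] for _ in range(10)]
        for x in a:                      # stable: equal digits keep their order
            buckets[(x // exp) % 10].append(x)
        a = [x for bucket in buckets for x in bucket]
        exp *= 10
    return a
```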

Cocktail Shaker Sort

Bidirectional bubble sort -- sweeps left-to-right then right-to-left. Slightly better than bubble sort on nearly sorted data.

O(n^2) avg Stable O(1) space
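A sketch of the two alternating passes, shrinking the unsorted window from both ends:

```python
def cocktail_shaker_sort(a):
    """Sort the list a in place with alternating forward and backward passes."""
    lo, hi = 0, len(a) - 1
    swapped = True
    while swapped:
        swapped = False
        for i in range(lo, hi):          # left-to-right: bubble max toward the end
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        hi -= 1
        for i in range(hi, lo, -1):      # right-to-left: bubble min toward the front
            if a[i - 1] > a[i]:
                a[i - 1], a[i] = a[i], a[i - 1]
                swapped = True
        lo += 1
    return a
```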

How Sorting Algorithms Work

A sorting algorithm takes a list of elements and rearranges them into a specific order -- usually ascending or descending. The efficiency of a sorting algorithm is measured by its time complexity (how many operations it performs relative to the input size) and space complexity (how much extra memory it needs).

Sorting algorithms are broadly divided into comparison-based sorts (which determine order by comparing pairs of elements) and non-comparison sorts (which exploit the structure of the data, like integer values). Comparison sorts have a theoretical lower bound of Ω(n log n) comparisons in the worst case, while non-comparison sorts like counting sort and radix sort can achieve linear O(n) time for specific data types.

Key Properties

  • Stability: A stable sort preserves the original relative order of elements with equal keys. This matters when sorting by multiple criteria.
  • In-place: An in-place sort uses only O(1) extra memory beyond the input array. Bubble sort, selection sort, and heap sort are in-place.
  • Adaptive: An adaptive sort runs faster on partially sorted input. Insertion sort and bubble sort (with early termination) are adaptive.
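Stability in action: because Python's built-in sort is stable, sorting twice (secondary key first, then primary) yields a correct multi-criteria order, exactly the use case the stability bullet describes. The records here are made up for illustration:

```python
# Sort (name, score) records by score, breaking ties alphabetically.
records = [("alice", 3), ("bob", 1), ("carol", 3), ("dave", 1)]

by_name = sorted(records)                       # secondary key: name
by_score = sorted(by_name, key=lambda r: r[1])  # primary key: score (stable)

# Equal scores keep alphabetical order from the first pass:
# [("bob", 1), ("dave", 1), ("alice", 3), ("carol", 3)]
```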

When to Use Which Algorithm

Scenario | Best Algorithm | Why
Small arrays (n < 20) | Insertion Sort | Low overhead, simple, fast for small n
Nearly sorted data | Insertion Sort / Bubble Sort | Adaptive -- runs in O(n) when almost sorted
General purpose | Quick Sort / Merge Sort | O(n log n) average; quicksort is cache-friendly
Guaranteed worst case | Merge Sort / Heap Sort | Always O(n log n), no degenerate cases
Memory constrained | Heap Sort | O(1) extra space, O(n log n) guaranteed
Integer data in known range | Counting Sort / Radix Sort | Linear time O(n) for bounded integers
Stability required | Merge Sort / Insertion Sort | Preserves equal-element ordering

Complexity Comparison Table

Algorithm | Best | Average | Worst | Space | Stable
Bubble Sort | O(n) | O(n^2) | O(n^2) | O(1) | Yes
Selection Sort | O(n^2) | O(n^2) | O(n^2) | O(1) | No
Insertion Sort | O(n) | O(n^2) | O(n^2) | O(1) | Yes
Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes
Quick Sort | O(n log n) | O(n log n) | O(n^2) | O(log n) | No
Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No
Shell Sort | O(n log n) | O(n^1.3) | O(n^2) | O(1) | No
Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(k) | Yes
Radix Sort | O(d*n) | O(d*n) | O(d*n) | O(n+k) | Yes
Cocktail Shaker | O(n) | O(n^2) | O(n^2) | O(1) | Yes

n = number of elements. k = range of input values. d = number of digits in the largest number.

Frequently Asked Questions

What sorting algorithms does this visualizer support?

This visualizer supports 10 algorithms: Bubble Sort, Selection Sort, Insertion Sort, Merge Sort, Quick Sort, Heap Sort, Shell Sort, Counting Sort, Radix Sort, and Cocktail Shaker Sort. Each animates in real time with color-coded comparisons and swaps.

What is the fastest sorting algorithm?

For general-purpose sorting, Quick Sort is typically fastest in practice due to cache efficiency, averaging O(n log n). Merge Sort and Heap Sort also guarantee O(n log n) worst-case. For integer data in a known range, Counting Sort and Radix Sort achieve O(n) linear time.

What is the difference between stable and unstable sorting?

A stable sort preserves the relative order of elements with equal keys. For example, if two records have the same value, a stable sort keeps them in their original order. Merge Sort, Insertion Sort, Bubble Sort, and Counting Sort are stable. Quick Sort, Heap Sort, and Selection Sort are not.

Why is Bubble Sort taught if it is inefficient?

Bubble Sort is the simplest sorting algorithm to understand and implement. It introduces fundamental concepts like comparisons, swaps, and nested loops. While it is too slow for real-world use on large datasets, it is an effective teaching tool for understanding algorithmic thinking.

How does the pivot choice affect Quick Sort?

A poor pivot choice (such as always picking the smallest or largest element) causes Quick Sort to degrade to O(n^2). Strategies like median-of-three or random pivot selection help avoid this. This visualizer uses the last element as the pivot for clarity.
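One common remedy is median-of-three selection, sketched here as a standalone helper (illustrative only; as noted above, this visualizer deliberately uses the last element instead):

```python
def median_of_three(a, lo, hi):
    """Return the index of the median of a[lo], a[mid], a[hi].

    Using this as the pivot avoids the O(n^2) degradation on already
    sorted or reverse-sorted input that a fixed first/last pivot causes.
    """
    mid = (lo + hi) // 2
    trio = sorted([(a[lo], lo), (a[mid], mid), (a[hi], hi)])
    return trio[1][1]                    # index belonging to the median value
```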

Does this tool store any data?

No. All visualizations run entirely in your browser using JavaScript. No data is sent to any server, and nothing is stored.

Privacy

This tool runs entirely in your browser. No array data or interaction history is sent to any server. Nothing is stored.

Related Tools

View all tools

Sorting Algorithm Visualizer FAQ

What is Sorting Algorithm Visualizer?

Sorting Algorithm Visualizer is a free educational tool that lets you watch 10 sorting algorithms animate step by step with pseudocode, comparisons, and swaps.

How do I use Sorting Algorithm Visualizer?

Select an algorithm, adjust the array size and speed, then press Play to run the animation or Step to advance one operation at a time. The visualization and statistics update live in your browser.

Is Sorting Algorithm Visualizer private?

Yes. The animations run locally in your browser. Array data is not uploaded to a server, and refreshing the page clears session data.

Does Sorting Algorithm Visualizer require an account or installation?

No. You can use this tool directly in your browser without sign-up or software installation.

How accurate are results from Sorting Algorithm Visualizer?

The animations follow standard textbook implementations of each algorithm, and the comparison and swap counts are exact for the run you watch. The complexity figures shown are asymptotic bounds, so actual running times vary with input size and hardware.

Can I save or share outputs from Sorting Algorithm Visualizer?

You can bookmark this page and copy outputs manually. Array states are not saved between sessions and are not embedded in the URL.

Request a New Tool
Improve This Tool