Comparing efficiency
Hopefully you saw in the last section that timing an algorithm is a good way to test its efficiency. The results showed bubble sort to be quite inefficient, insertion sort somewhat less so, and merge sort took the top prize. In most cases, merge sort will sort a list much more quickly than the other two. The only exception is when a list is already sorted or nearly sorted, in which case insertion sort would be the quickest, and even bubble sort would finish faster than merge sort. Both bubble and insertion sort have built-in ways to reduce their comparisons when a list is already sorted, unlike merge sort, which performs the same operations regardless of the state of the list.
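To illustrate that built-in shortcut, here is a minimal bubble sort sketch (the function name and details are my own, not from the earlier section) with the early-exit check: if a full pass makes no swaps, the list must already be sorted, so the algorithm stops straight away.

```python
def bubble_sort(items):
    """Bubble sort with an early exit for already-sorted input."""
    items = list(items)  # work on a copy so the original is untouched
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                # Neighbours are out of order, so swap them
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:
            # A whole pass made no swaps: the list is sorted, stop early
            break
    return items
```

On an already-sorted list this makes a single pass and exits, which is why bubble and insertion sort can beat merge sort in that one special case.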
Computer scientists and mathematicians have developed a way of describing the efficiency of an algorithm, called Big O notation. This is a fairly advanced topic, so I won’t go into too much depth, but the three algorithms can be described as follows:
Bubble sort’s efficiency is O(n²)
Insertion sort’s efficiency is also O(n²)
Merge sort’s efficiency is O(n log n)
What does this mean?
For bubble and insertion sort, the time taken to complete a sort is proportional to the square of the size of the list. That means doubling the size of a list will cause the number of operations that need to be performed to quadruple. This is called quadratic time.
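You can check the quadrupling claim by counting comparisons directly. The sketch below (my own illustration, not from the earlier section) runs bubble sort on a reversed list, its worst case, and counts every comparison made.

```python
def bubble_comparisons(n):
    """Count comparisons made by bubble sort on a reversed (worst-case) list."""
    items = list(range(n, 0, -1))  # e.g. [5, 4, 3, 2, 1] for n = 5
    comparisons = 0
    for end in range(len(items) - 1, 0, -1):
        for i in range(end):
            comparisons += 1
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
    return comparisons

print(bubble_comparisons(100))  # 4950
print(bubble_comparisons(200))  # 19900 — roughly four times as many
```

Doubling the list from 100 to 200 items takes the comparison count from 4,950 to 19,900, almost exactly quadruple, which is quadratic time in action.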
For merge sort, the O(n log n) means that it runs in what is called linearithmic time. You can see from the graph below that, although the number of operations does increase as the size of the list to be sorted increases, the rate of increase is significantly smaller than for quadratic time.
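The gap the graph shows can also be seen with a quick back-of-the-envelope calculation, sketched here with a base-2 logarithm (the base only shifts the numbers by a constant factor):

```python
import math

# Compare the rough number of operations for quadratic (n²)
# versus linearithmic (n log n) algorithms as n grows
for n in (100, 1_000, 10_000):
    quadratic = n * n
    linearithmic = n * math.log2(n)
    print(f"n = {n:>6}: n² = {quadratic:>11,}  n log n ≈ {linearithmic:>9,.0f}")
```

For a list of 10,000 items, n² is 100,000,000 while n log n is only around 133,000, which is why merge sort pulls so far ahead as lists get longer.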