Wednesday, 1 August 2018

Data Structure Sorting Algorithms



Algorithm | Best-case | Worst-case | Average-case | Space complexity | Stable?
 | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes
 | O(n) | O(n²) | O(n²) | O(1) | Yes
 | O(n) | O(n²) | O(n²) | O(1) | Yes
 | O(n log n) | O(n²) | O(n log n) | O(log n) best, O(n) avg | Usually not*
 | O(n log n) | O(n log n) | O(n log n) | O(1) | No
 | O(k + n) | O(k + n) | O(k + n) | O(k + n) | Yes


Comparison Sorts
Comparison sorts compare elements at each step of the algorithm to determine if one element should be to the left or right of another element.
Comparison sorts are usually more straightforward to implement than integer sorts, but they are limited by a lower bound of Ω(n log n), meaning that, on average, comparison sorts cannot be faster than Ω(n log n). A lower bound for a problem is the worst-case running time of the best possible algorithm for that problem. The "on average" part here is important: many algorithms run very quickly when the input list is already sorted, or has some other very particular (and overall unlikely) property.
Since only one of the n! possible orderings of a list is sorted, the chance that the input is already sorted is very small, and on average the list will be far from sorted.
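As a sketch of how a comparison sort works, here is insertion sort in Python (the function name and structure are illustrative, not taken from the table above). Every decision it makes comes from comparing one element against another, which is exactly why the Ω(n log n) lower bound applies to it:

```python
def insertion_sort(items):
    """Sort a list in place by comparing each element against the
    sorted prefix to its left and shifting larger elements right."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Each iteration of this loop is one comparison between elements.
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items
```

On an already-sorted list the inner loop never runs, which is where the O(n) best case comes from; on a reversed list every element shifts past every earlier one, giving the O(n²) worst case.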


Integer Sorts
Integer sorts are sometimes called counting sorts (though there is also a specific integer sort algorithm called counting sort). Integer sorts do not make comparisons, so they are not bound by the Ω(n log n) lower bound. Instead, an integer sort determines, for each element x, how many elements are less than x. If there are 14 elements less than x, then x is placed in the 15th slot. This information is used to place each element into the correct slot immediately, with no need to rearrange lists.
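A minimal counting-sort sketch in Python, assuming the inputs are non-negative integers below a known bound k (names here are illustrative). The prefix-sum step computes, for each value, how many elements are less than or equal to it, which is precisely the slot information described above:

```python
def counting_sort(items, k):
    """Sort non-negative integers < k without any element comparisons."""
    counts = [0] * k
    for x in items:
        counts[x] += 1
    # Prefix sums: counts[v] becomes the number of elements <= v,
    # i.e. the slot just past the last copy of v in the output.
    for v in range(1, k):
        counts[v] += counts[v - 1]
    output = [0] * len(items)
    # Walk the input backwards so equal elements keep their relative
    # order, which is what makes this sort stable.
    for x in reversed(items):
        counts[x] -= 1
        output[counts[x]] = x
    return output
```

Because every step is a counting or indexing operation rather than a comparison, the total work is O(k + n), matching the last row of the table.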


A lower bound of Ω(n log n) means the algorithm cannot run faster than n log n time.
However, the running time of an algorithm is usually discussed in terms of big O rather than big Ω.
For example, if an algorithm has a worst-case running time of O(n log n), then it is guaranteed never to be slower than O(n log n); and if an algorithm has an average-case running time of O(n²), then on average it will not be slower than O(n²).

The running time describes how many operations an algorithm must carry out before it completes. The space complexity describes how much memory must be allocated to run a particular algorithm. For example, if an algorithm takes in a list of size n and, for some reason, makes a new list of size n for each element of the input, the algorithm needs O(n²) space.
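That space-complexity example can be made concrete with a toy function (hypothetical, purely for illustration): for an input of size n it allocates n new lists of size n, so it uses O(n²) extra space even though it does nothing useful with it:

```python
def quadratic_space(items):
    """Illustration only: builds one full-size copy of the input for
    each of its n elements, allocating O(n^2) space in total."""
    copies = [list(items) for _ in items]
    return copies
```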


Read more about sorting algorithms