Sorting algorithms aren't usually tested alone. They are often paired with another problem type, and the goal is to see whether you can apply the algorithms in a wider context
O(n²)
Traverse from left to right, swapping an element with its neighbor if the neighbor is less than the element
Repeat till no more swaps are needed
Invariant: after iteration j, the j largest elements occupy the last j positions, arranged in ascending order
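A minimal sketch of the pass-and-swap loop described above (the function name and the in-place, return-the-array convention are my own assumptions):

```python
def bubble_sort(arr):
    # Repeat full passes until a pass makes no swaps.
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(arr) - 1):
            # Swap adjacent elements that are out of order.
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
    return arr
```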
O(n²)
Traverse from left to right, maintaining a "sorted" section at the front of the array
Every iteration, we find the smallest element in the "unsorted" section and swap it into the next position of the sorted section
Invariant: after iteration j, the front j elements are the j smallest elements, sorted in ascending order
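A minimal sketch of selection sort as described above (again, the function name and in-place convention are assumptions):

```python
def selection_sort(arr):
    n = len(arr)
    for j in range(n):
        # Find the smallest element in the unsorted section arr[j:].
        min_idx = j
        for i in range(j + 1, n):
            if arr[i] < arr[min_idx]:
                min_idx = i
        # Swap it into position j, extending the sorted prefix by one.
        arr[j], arr[min_idx] = arr[min_idx], arr[j]
    return arr
```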
O(n²)
Traverse from left to right
For each element, move backwards through the sorted prefix, shifting every element greater than it one position to the right
Stop once a smaller (or equal) element is found, and insert the current element just after it
Invariant: after iteration j, the front j elements are arranged in ascending order (no guarantee that they are the smallest j elements)
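A minimal sketch of the shift-and-insert step described above (names and in-place convention are assumptions):

```python
def insertion_sort(arr):
    for j in range(1, len(arr)):
        current = arr[j]
        i = j - 1
        # Shift elements of the sorted prefix that are greater than
        # the current element one slot to the right.
        while i >= 0 and arr[i] > current:
            arr[i + 1] = arr[i]
            i -= 1
        # Insert the current element just after the first smaller element.
        arr[i + 1] = current
    return arr
```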
O(n log n)
T(n) = 2T(n/2) + O(n)
Break the array into halves and merge the sorted halves at the end
Stop breaking it up once only 1 element is left
Invariant: the leftmost set of elements will be sorted before the rightmost
```python
# Sort arr over the half-open range [left, right) and return a new sorted list
def merge_sort(arr, left, right):
    # Base case: zero or one element is already sorted
    if right - left <= 1:
        return arr[left:right]
    mid = left + (right - left) // 2
    left_merged = merge_sort(arr, left, mid)    # [left, mid)
    right_merged = merge_sort(arr, mid, right)  # [mid, right)
    return merge(left_merged, right_merged)

def merge(a, b):
    c = []
    i, j = 0, 0
    # Repeatedly take the smaller head element of the two sorted lists
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            c.append(a[i])
            i += 1
        else:
            c.append(b[j])
            j += 1
    # Append whatever remains of the list that was not exhausted
    idx, remaining = (i, a) if i < len(a) else (j, b)
    for k in range(idx, len(remaining)):
        c.append(remaining[k])
    return c
```
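For example, merge_sort([5, 2, 4, 1], 0, 4) returns [1, 2, 4, 5]; the merge step over n elements is the O(n) term in the recurrence above.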
O(n log n) on average
Pick a pivot, partition the array around the pivot value, and repeat on both halves until only 1 element remains
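A minimal in-place sketch of this idea using a Lomuto-style partition with the last element as the pivot (the pivot choice and inclusive-bounds convention are assumptions; other partition schemes work too):

```python
def quick_sort(arr, left, right):
    # Sort arr[left..right] in place (inclusive bounds).
    if left >= right:
        return arr
    # Partition around the last element as the pivot:
    # everything < pivot ends up to its left, everything else to its right.
    pivot = arr[right]
    i = left
    for j in range(left, right):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[right] = arr[right], arr[i]
    # Recurse on the two halves around the pivot's final position.
    quick_sort(arr, left, i - 1)
    quick_sort(arr, i + 1, right)
    return arr
```

For example, quick_sort([3, 1, 2], 0, 2) returns [1, 2, 3].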