8/14/2019 Lecture7 - Sorting Algorithms
COMP2230 Introduction to Algorithmics
Lecture 7
A/Prof Ljiljana Brankovic
Lecture Overview
Sorting Algorithms (Text, Chapter 6)
Next week: Greedy Algorithms (Text, Chapter 7)
Quick Sort
Quicksort uses a divide-and-conquer approach, but it is very different from merge sort.
First it picks an element on which to partition the array, then puts all elements smaller than the partition element to its left (not necessarily in sorted order), and all larger elements to its right (again not necessarily sorted).
Then it recursively sorts the two sections of the array.
Algorithm 6.2.2 Partition
This algorithm partitions the array a[i], ..., a[j] by inserting val = a[i] at the index h where it would be if the array were sorted. When the algorithm concludes, values at indexes less than h are less than val, and values at indexes greater than h are greater than or equal to val. The algorithm returns the index h.
Input Parameters: a, i, j
Output Parameters: a

partition(a, i, j) {
    val = a[i]
    h = i
    for k = i + 1 to j
        if (a[k] < val) {
            h = h + 1
            swap(a[h], a[k])
        }
    swap(a[i], a[h])
    return h
}
Algorithm 6.2.4 Quicksort
This algorithm sorts the array a[i], ..., a[j] by using the partition algorithm (Algorithm 6.2.2).

Input Parameters: a, i, j
Output Parameters: a

quicksort(a, i, j) {
    if (i < j) {
        p = partition(a, i, j)
        quicksort(a, i, p - 1)
        quicksort(a, p + 1, j)
    }
}
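The pseudocode above translates almost line for line into Python; a sketch, using Python's 0-based lists in place of the text's 1-based arrays:

```python
def partition(a, i, j):
    # Insert val = a[i] at the index h where it would sit if
    # a[i..j] were sorted; smaller values end up to the left of h.
    val = a[i]
    h = i
    for k in range(i + 1, j + 1):
        if a[k] < val:
            h += 1
            a[h], a[k] = a[k], a[h]
    a[i], a[h] = a[h], a[i]
    return h

def quicksort(a, i, j):
    # Sort a[i..j] in place: partition, then recurse on both sides.
    if i < j:
        p = partition(a, i, j)
        quicksort(a, i, p - 1)
        quicksort(a, p + 1, j)

a = [4, 10, 3, 1, 5]
quicksort(a, 0, len(a) - 1)
print(a)  # [1, 3, 4, 5, 10]
```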
Quick Sort Example
Partitioning [4, 10, 3, 1, 5] with val = 4 (h and k mark the current indexes):

4 10 3 1 5   h = 1, k = 2: a[k] = 10 ≥ 4, no swap
4 3 10 1 5   h = 2, k = 3: a[k] = 3 < 4, so h advanced and a[2], a[3] swapped
4 3 1 10 5   h = 3, k = 4: a[k] = 1 < 4, so h advanced and a[3], a[4] swapped
4 3 1 10 5   h = 3, k = 5: a[k] = 5 ≥ 4, no swap
1 3 4 10 5   final swap of a[i] and a[h] puts val = 4 at index h = 3
Quick Sort
In the worst case, the partition element is placed at either the end or the start of the array. In either case, the first time partition is called it takes n − 1 steps, the second time n − 2, the third n − 3, and so on, giving:

1 + 2 + ... + (n − 1) = n(n − 1)/2 = Θ(n²)

(Again, prove both O and Ω.) In practice, quicksort tends to perform much better than this.
Random Quicksort
We can improve quicksort by choosing the partition value at random, rather than always choosing the first value in the array.
Of course the worst-case time is still Θ(n²), but we can expect random quicksort to run in time Θ(n log n). The difference is that now the running time is independent of the order of the input.
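A minimal sketch of the random variant: swap a uniformly chosen element into the first position, then reuse the ordinary partition routine (the function names here are illustrative, not the text's):

```python
import random

def partition(a, i, j):
    # Algorithm 6.2.2: partition a[i..j] around val = a[i].
    val = a[i]
    h = i
    for k in range(i + 1, j + 1):
        if a[k] < val:
            h += 1
            a[h], a[k] = a[k], a[h]
    a[i], a[h] = a[h], a[i]
    return h

def random_partition(a, i, j):
    # Pick the partition value uniformly at random, move it to the
    # front, then fall back to the deterministic partition.
    r = random.randint(i, j)
    a[i], a[r] = a[r], a[i]
    return partition(a, i, j)

a = [4, 10, 3, 1, 5]
p = random_partition(a, 0, len(a) - 1)
# Wherever the pivot lands, everything left of p is smaller and
# everything from p + 1 onward is >= the pivot value.
print(p, a)
```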
A Lower Bound for Sorting
For any comparison-based sorting algorithm (this is not the only kind, just the most common and useful), the running time is at least as large as the number of comparisons that must be done to sort.
If we consider the decision tree that represents the running of the algorithm (where each non-leaf node is a comparison), then there must be at least h comparisons, where h is the height of the tree.
As there are n elements in the array, there are n! possible arrangements; thus there are n! leaves in the decision tree, corresponding to all sortings.
As a decision tree is a binary tree, we have that for a tree with t leaves and height h, lg t ≤ h.
A Lower Bound for Sorting
Thus we have lg(n!) ≤ h.
We have already shown that lg(n!) = Θ(n log n).
Thus there is some c such that c n log n ≤ lg(n!).
Therefore c n log n ≤ h.
Thus the number of comparisons is lower bounded by Ω(n log n).
So we can do no better than this for comparison-based sorting.
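One standard way to see the Ω(n log n) growth of lg(n!), using the bound n! ≥ (n/2)^{n/2}, compresses the argument into a single chain of inequalities:

```latex
\[
2^h \;\ge\; t \;=\; n!
\quad\Longrightarrow\quad
h \;\ge\; \lg(n!) \;\ge\; \lg\!\left(\frac{n}{2}\right)^{\!n/2}
= \frac{n}{2}\,\lg\frac{n}{2} \;=\; \Omega(n \log n).
\]
```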
Sorting Algorithms
Main algorithmic strategies used for sorting:

Brute Force: Selection sort O(n²), Bubble sort O(n²)
Divide-and-Conquer: Mergesort O(n log n); Quicksort O(n log n), worst case O(n²)
Decrease-and-Conquer: Insertion sort O(n²)
Transform-and-Conquer: Heapsort O(n log n)

Which of these algorithms are stable?
Sorting Algorithms
Straightforward algorithms are O(n²). More complex algorithms are O(n log n).
Can we do better than that?
Counting and Radix Sort
Counting sort sorts an array of integers.
It first counts the number of occurrences of each integer value in the array, and then computes the number of values less than or equal to each value.
Example

5 8 3 8 10 7

The number of occurrences of each value k in the array: c[3]=1, c[5]=1, c[7]=1, c[8]=2, c[10]=1
The number of elements less than or equal to k: c[3]=1, c[5]=2, c[7]=3, c[8]=5, c[10]=6
Then we place element 5 in position 2 and decrement c[5]; we place 8 in position 5 and decrement c[8]; and so on. Is this algorithm stable? If not, what can we change so that it becomes stable?
Stepping through all six placements for input 5 8 3 8 10 7 (c after each decrement; empty slots of b shown as _):

c[3]=1, c[5]=2, c[7]=3, c[8]=5, c[10]=6    b = _ _ _ _ _ _
c[3]=1, c[5]=1, c[7]=3, c[8]=5, c[10]=6    b = _ 5 _ _ _ _
c[3]=1, c[5]=1, c[7]=3, c[8]=4, c[10]=6    b = _ 5 _ _ 8 _
c[3]=0, c[5]=1, c[7]=3, c[8]=4, c[10]=6    b = 3 5 _ _ 8 _
c[3]=0, c[5]=1, c[7]=3, c[8]=3, c[10]=6    b = 3 5 _ 8 8 _
c[3]=0, c[5]=1, c[7]=3, c[8]=3, c[10]=5    b = 3 5 _ 8 8 10
c[3]=0, c[5]=1, c[7]=2, c[8]=3, c[10]=5    b = 3 5 7 8 8 10
Algorithm 6.4.2 Counting Sort
This algorithm sorts the array a[1], ..., a[n] of integers, each in the range 0 to m, inclusive.

Input Parameters: a, m
Output Parameters: a

counting_sort(a, m) {
    // set c[k] = the number of occurrences of value k in the array a.
    // begin by initializing c to zero.
    for k = 0 to m
        c[k] = 0
    n = a.last
    for i = 1 to n
        c[a[i]] = c[a[i]] + 1
    // modify c so that c[k] = number of elements ≤ k
    for k = 1 to m
        c[k] = c[k] + c[k - 1]
    // sort a with the result in b
    for i = n downto 1 {
        b[c[a[i]]] = a[i]
        c[a[i]] = c[a[i]] - 1
    }
    // copy b back to a
    for i = 1 to n
        a[i] = b[i]
}
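A direct Python rendering of Algorithm 6.4.2 (a sketch; 0-based indexing, so the decrement happens before the placement rather than after):

```python
def counting_sort(a, m):
    # a holds integers in the range 0..m, inclusive.
    n = len(a)
    c = [0] * (m + 1)
    for x in a:                    # c[k] = number of occurrences of k
        c[x] += 1
    for k in range(1, m + 1):      # c[k] = number of elements <= k
        c[k] += c[k - 1]
    b = [0] * n
    for i in range(n - 1, -1, -1):  # right-to-left keeps equal keys stable
        c[a[i]] -= 1
        b[c[a[i]]] = a[i]
    a[:] = b                       # copy b back to a

a = [5, 8, 3, 8, 10, 7]
counting_sort(a, 10)
print(a)  # [3, 5, 7, 8, 8, 10]
```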
Counting Sort
The complexity of counting sort is Θ(n + m), where n is the number of elements in the array, each being in the range 0 to m.
Is counting sort stable?
Algorithm 6.4.4 Radix Sort
This algorithm sorts the array a[1], ..., a[n] of integers. Each integer has at most k digits.

Input Parameters: a, k
Output Parameters: a

radix_sort(a, k) {
    for i = 0 to k - 1
        counting_sort(a, 10) // key is the digit in the 10^i's place
}
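In Python the per-digit pass can be written by inlining a stable counting sort keyed on the 10^i's-place digit; a sketch for nonnegative integers (not the text's exact code, which reuses counting_sort directly):

```python
def radix_sort(a, k):
    # Sort nonnegative integers with at most k digits by running a
    # stable counting sort on each digit, least significant first.
    for i in range(k):
        d = 10 ** i
        c = [0] * 10
        for x in a:                      # count occurrences of each digit
            c[(x // d) % 10] += 1
        for t in range(1, 10):           # prefix sums: slots per digit
            c[t] += c[t - 1]
        b = [0] * len(a)
        for x in reversed(a):            # right-to-left preserves stability
            digit = (x // d) % 10
            c[digit] -= 1
            b[c[digit]] = x
        a[:] = b

a = [329, 457, 657, 839, 436, 720, 355]
radix_sort(a, 3)
print(a)  # [329, 355, 436, 457, 657, 720, 839]
```

Stability of the inner pass is essential: each pass must preserve the order established by the previous, less significant, digits.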
Radix Sort
The complexity of radix sort is Θ(kn), where n is the number of integers and k is the maximum number of digits of the integers.
Is Radix sort stable?
Selection
Random Select uses random partition to find the kth smallest element in an array.
Algorithm 6.5.2 Random Select
Let val be the value in the array a[i], ..., a[j] that would be at index k (i ≤ k ≤ j) if the entire array were sorted. This algorithm rearranges the array so that val is at index k, all values at indexes less than k are less than val, and all values at indexes greater than k are greater than or equal to val. The algorithm uses the random-partition algorithm (Algorithm 6.2.6).

Input Parameters: a, i, j, k
Output Parameter: a

random_select(a, i, j, k) {
    if (i < j) {
        p = random_partition(a, i, j)
        if (k == p)
            return
        if (k < p)
            random_select(a, i, p - 1, k)
        else
            random_select(a, p + 1, j, k)
    }
}
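A Python sketch of random_select, bundling the partition helpers it relies on (Algorithms 6.2.2 and 6.2.6); 0-based indexes:

```python
import random

def partition(a, i, j):
    # Algorithm 6.2.2: partition a[i..j] around val = a[i].
    val = a[i]
    h = i
    for k in range(i + 1, j + 1):
        if a[k] < val:
            h += 1
            a[h], a[k] = a[k], a[h]
    a[i], a[h] = a[h], a[i]
    return h

def random_partition(a, i, j):
    # Algorithm 6.2.6: partition around a randomly chosen value.
    r = random.randint(i, j)
    a[i], a[r] = a[r], a[i]
    return partition(a, i, j)

def random_select(a, i, j, k):
    # Rearrange a so that a[k] holds the value that would be at
    # index k if a[i..j] were fully sorted; recurse on one side only.
    if i < j:
        p = random_partition(a, i, j)
        if k == p:
            return
        if k < p:
            random_select(a, i, p - 1, k)
        else:
            random_select(a, p + 1, j, k)

a = [5, 8, 3, 8, 10, 7]
random_select(a, 0, len(a) - 1, 2)
print(a[2])  # 7, the 3rd-smallest element
```

Because only one side is recursed on, the expected work forms a geometric series, which is why the average case is linear.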
Complexity of Random Select
Worst-case: Θ(n²)
Average: Θ(n)