
Sorting Efficiency - Advanced Programming Techniques - M70CDE


Faculty of Engineering and Computing

M70CDE Module
(Advanced Programming Techniques)

Sorting Efficiency

Coursework 2, 2011

By: Saud Aljaloud



CONTENTS

Introduction
1.1. What are the three sorting algorithms?
     1.1.1. Snail-Sort
     1.1.2. Selection-Sort
     1.1.3. Quick-Sort
1.2. Comparing the three sorting algorithms
     1.2.1. Snail-Sort
     1.2.2. Selection-Sort
     1.2.3. Quick-Sort
     1.2.4. The efficiency of the three sorting algorithms
2.2. The evaluation of listInSort
2.3. Big-O notation

Figures:

-Figure1: Shows how Snail-Sort works
-Figure2: Shows how Selection-Sort works
-Figure3: Shows how Quick-Sort works
-Figure4: The six tests of the snail sort



-Figure5: Growth rate of the snail sort algorithm
-Figure6: The six tests of the selection sort
-Figure7: Growth rate of the selection sort algorithm
-Figure8: The six tests of the quick sort
-Figure9: Growth rate of the quick sort algorithm
-Figure10: Table showing the number of comparisons
-Figure11: Line graph for the three sorting algorithms showing the growth rate
-Figure12: Table showing the number of comparisons for listInSort
-Figure13: Line graph for listInSort showing the growth rate
-Figure14: Table showing the algorithms in Big-O notation
-Figure15: The second table shows the Big-O values after applying the calculations
-Figure16: Graph showing the Big-O notation growth rates

Introduction

This coursework evaluates the efficiency of three sorting algorithms: Snail-Sort, Selection-Sort and Quick-Sort. To evaluate them, the number of comparisons each algorithm performs is counted: for each list size (10, 20, 40, 80, 160, 320), six test runs are carried out and the average number of comparisons is calculated. MS Excel is then used to produce graphs showing the differences between the three algorithms in terms of efficiency, and the Big-O formula for each algorithm is stated and checked against the test results.

1.1. What are the three sorting algorithms?

1.1.1. Snail-Sort

It is a simple sorting algorithm that resembles Bubble-Sort. Its average-case and worst-case complexity are O(n²). The algorithm compares the first element with the second; if the first is greater than the second, they are swapped, otherwise they are left in place. The first element is then compared with each of the remaining elements, swapping whenever a smaller one is found. The same procedure is repeated for the second element, then the third, and so on until the last element.
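Purely as an illustration, here is a minimal Java sketch of the procedure as described above, with a counter added for the comparisons (the class and method names are assumptions). A literal reading like this performs exactly n(n-1)/2 comparisons, which is fewer than the counts measured in Figure4, so the module's actual Snail-Sort presumably performs extra passes; the sketch only shows the idea.

```java
// Illustrative sketch only: a literal reading of the Snail-Sort description above.
// Each element is compared with every later element and swapped when out of order.
public class SnailSort {

    // Sorts the array in place and returns the number of comparisons performed.
    public static long sort(int[] a) {
        long comparisons = 0;
        for (int i = 0; i < a.length - 1; i++) {
            for (int j = i + 1; j < a.length; j++) {
                comparisons++;                 // one comparison per pair (i, j)
                if (a[i] > a[j]) {             // earlier element is larger: swap the pair
                    int tmp = a[i];
                    a[i] = a[j];
                    a[j] = tmp;
                }
            }
        }
        return comparisons;
    }
}
```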


Figure1: Shows how Snail-Sort works.

1.1.2. Selection-Sort

It is an in-place comparison sort with the same O(n²) complexity in the best, average and worst cases. It works by finding the minimum element and swapping it into the first position, then repeating the same step from the next position for the remaining elements of the list. The number of comparisons is therefore (n-1) + (n-2) + … + 1 = n(n-1)/2, which is O(n²).
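A minimal Java sketch of Selection-Sort with a comparison counter (the class and method names are assumptions); note that it performs exactly n(n-1)/2 comparisons whatever the input order, which is why the six test results in Figure6 below are identical.

```java
// Illustrative Selection-Sort sketch with a comparison counter.
public class SelectionSort {

    // Sorts the array in place and returns the number of comparisons performed.
    public static long sort(int[] a) {
        long comparisons = 0;
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;                       // index of the smallest element found so far
            for (int j = i + 1; j < a.length; j++) {
                comparisons++;
                if (a[j] < a[min]) {
                    min = j;
                }
            }
            int tmp = a[i];                    // swap the minimum into position i
            a[i] = a[min];
            a[min] = tmp;
        }
        return comparisons;
    }
}
```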

Figure2: Shows how the selection-Sort works.

1.1.3. Quick-Sort

It is a divide-and-conquer algorithm whose main idea is partitioning. The list is divided into three parts: the first part is a single element in its sorted position, called the pivot; the second part is a sub-list whose elements are smaller than the pivot; the third part is a sub-list whose elements are larger than the pivot. Applying the same idea recursively to the smaller and larger sub-lists sorts the whole list.

In the average case Quick-Sort requires O(n log n) comparisons, whereas in the worst case it requires O(n²). It can be implemented as an in-place sort needing only O(log n) additional space for the recursion.
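A minimal Java sketch of an in-place Quick-Sort with a comparison counter (the names and the choice of the last element as the pivot are assumptions; the coursework implementation may choose its pivot differently, which affects the exact counts):

```java
// Illustrative in-place Quick-Sort sketch with a comparison counter.
public class QuickSort {

    private static long comparisons;

    // Sorts the array in place and returns the number of comparisons performed.
    public static long sort(int[] a) {
        comparisons = 0;
        quickSort(a, 0, a.length - 1);
        return comparisons;
    }

    private static void quickSort(int[] a, int low, int high) {
        if (low < high) {
            int p = partition(a, low, high);   // the pivot ends up in its sorted position
            quickSort(a, low, p - 1);          // sort the sub-list smaller than the pivot
            quickSort(a, p + 1, high);         // sort the sub-list larger than the pivot
        }
    }

    // Lomuto partition scheme using the last element as the pivot.
    private static int partition(int[] a, int low, int high) {
        int pivot = a[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            comparisons++;
            if (a[j] < pivot) {
                i++;
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            }
        }
        int tmp = a[i + 1]; a[i + 1] = a[high]; a[high] = tmp;
        return i + 1;
    }
}
```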

Figure3: Shows how Quick-Sort works.

1.2. Comparing the three sorting algorithms

The coursework requires testing these sorting algorithms by running comparison-count tests on different amounts of data. A sketch of a possible test harness is shown below, followed by the results of the tests.
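As a sketch of what such a test could look like (reusing the hypothetical SnailSort, SelectionSort and QuickSort classes sketched earlier), each list size is filled with random values six times and the comparison counts are averaged:

```java
import java.util.Random;

// Illustrative test harness: averages the comparison counts over six runs per list size.
public class SortingEfficiencyTest {

    public static void main(String[] args) {
        int[] sizes = {10, 20, 40, 80, 160, 320};
        Random random = new Random();

        for (int n : sizes) {
            long total = 0;
            for (int test = 0; test < 6; test++) {
                int[] data = random.ints(n, 0, 10_000).toArray();   // random list of size n
                total += QuickSort.sort(data);   // swap in SnailSort.sort or SelectionSort.sort
            }
            System.out.println("n = " + n + ": average comparisons = " + total / 6);
        }
    }
}
```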

1.2.1. Snail-Sort:

After running six tests and averaging the number of comparisons, it is clear that Snail-Sort is the worst of the three sorting algorithms for large amounts of data; it requires O(n²) comparisons. Between list sizes 10 and 40 the counts grow fairly steadily as the data increases, but from 80 to 320 the number of comparisons rises dramatically. The number of comparisons therefore depends strongly on the amount of data.

N      Test1     Test2     Test3     Test4     Test5     Test6     Average
10        87       132       107        72        86       128         102
20       628       806       757       661       766       566         697
40      3178      4674      5259      4846      4950      3627        4422
80     29960     31408     32656     32139     29642     38664       32411
160   239124    232885    216207    253641    250611    203828      232716
320  1849343   1918710   1719916   1771351   1995151   1867102     1853595

Figure4: The six tests of the snail_sort.


Figure5: growth rate of snail sort algorithm

1.2.2. Selection-Sort:

As the graphs show, Selection-Sort is better than Snail-Sort, although it still requires O(n²) comparisons in every case (best, average and worst). It is simple and easy to implement, like Bubble-Sort. Looking at the graph, growth is modest between list sizes 10 and 80 and then increases steeply from 160 to 320 as the data grows, so the number of comparisons again depends on n (the data size). The counts are identical in all six tests because Selection-Sort always performs exactly n(n-1)/2 comparisons regardless of the input order: 10·9/2 = 45, 20·19/2 = 190, 40·39/2 = 780, 80·79/2 = 3160, 160·159/2 = 12720 and 320·319/2 = 51040, matching the table below.

N      Test1    Test2    Test3    Test4    Test5    Test6    Average
10        45       45       45       45       45       45         45
20       190      190      190      190      190      190        190
40       780      780      780      780      780      780        780
80      3160     3160     3160     3160     3160     3160       3160
160    12720    12720    12720    12720    12720    12720      12720
320    51040    51040    51040    51040    51040    51040      51040

Figure6: The six tests of the selection sort.



Figure7: growth rate of selection sort algorithm

1.2.3. Quick-Sort:

Quick-Sort requires O(n log n) comparisons in the best and average cases and O(n²) in the worst case, but it usually runs faster than the other sorting algorithms. It is efficient on large lists and does not require much extra memory.

N      Test1    Test2    Test3    Test4    Test5    Test6    Average
10        22       29       31       36       21       26         27
20        94       64       89       87       69       61         77
40       199      185      177      181      208      183        188
80       468      516      455      508      455      434        472
160     1262     1095     1251     1184     1294     1154       1206
320     2751     3035     2625     2567     3151     2781       2818

Figure8: The six tests of the quick sort.



Figure9: growth rate of quick sort algorithm

1.2.4. The efficiency of the three sorting algorithms:

Looking at the table and the graph below, Snail-Sort is the least efficient of the three algorithms and Quick-Sort is the most efficient. For example, with a data size of 10, Snail-Sort requires 102 comparisons on average, while Selection-Sort requires 45 and Quick-Sort 27.

N       SnailSort    SelectionSort    QuickSort
10            102               45           27
20            697              190           77
40           4422              780          188
80          32411             3160          472
160        232716            12720         1206
320       1853596            51040         2818

Figure10: Table shows the number of comparisons.



Figure11: Line graph for the three sorting algorithms showing the growth-rate

2.2. The evaluation of listInSort:

The listInSort method, which is similar to insertion sort, works as in the example below; a code sketch follows the example.

Assume the array is empty and we want to add {5, 3, 2, 8, 1} to it in ascending order, so that the array stays sorted.

1- Add the first element, 5, to the array: {5}.
2- Before adding the next element, 3, compare it with the elements already in the array. Since 3 is smaller than 5, it is inserted before it; if it were greater than the previous elements it would be added at the end. The array is now {3, 5}.
3- Repeat the step above for the rest of the elements until the array becomes {1, 2, 3, 5, 8}.

From the example above, each insertion compares the new element with at most the elements already in the array: at most one comparison for the second insertion, at most two for the third, and so on, up to (N-1) comparisons for the last insertion. The total is therefore at most 1 + 2 + … + (N-1) = N(N-1)/2 comparisons; the measured counts below are roughly half of that maximum, as would be expected for randomly ordered input. Nevertheless, Quick-Sort remains the best of the sorting algorithms compared here.
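A minimal Java sketch of this idea (names are assumptions, and an ArrayList is used here in place of whatever list class the coursework actually provides): each new value is compared with the already sorted elements from the front and inserted at the first position whose element is larger.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of inserting elements into a sorted list (listInSort-style).
public class ListInSortExample {

    private static long comparisons;

    // Inserts value into the already sorted list, keeping ascending order.
    static void insertSorted(List<Integer> sorted, int value) {
        int pos = 0;
        while (pos < sorted.size()) {
            comparisons++;
            if (value < sorted.get(pos)) {
                break;                          // first element larger than value found
            }
            pos++;
        }
        sorted.add(pos, value);                 // later elements shift one place right
    }

    public static void main(String[] args) {
        List<Integer> sorted = new ArrayList<>();
        for (int value : new int[] {5, 3, 2, 8, 1}) {
            insertSorted(sorted, value);
        }
        // Prints: [1, 2, 3, 5, 8] sorted with 6 comparisons
        System.out.println(sorted + " sorted with " + comparisons + " comparisons");
    }
}
```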

N      ListInSort
10             32
20            117
40            430
80           1684
160          6744
320         26125

Figure12: Table shows the number of comparisons for listInSort.

Figure13: Line graph for listInSort algorithms showing the growth-rate

2.3. Big-O notation

Big O notation is used in Computer Science to describe the performance or complexity of an algorithm. Big O specifically describes the worst-case scenario, and can be used to describe the execution time required or the space used (e.g. in memory or on disk) by an algorithm. (A Beginner's Guide to Big O Notation, Rob Bell, 2009)

Below are the growth functions associated with these sorting algorithms, together with a line graph illustrating their growth rates. As can be seen, n² grows faster than n log n, n log n grows faster than n, and n grows faster than log n.

Name             Best       Average    Worst
Snail-Sort       n          n²         n²
Selection-Sort   n²         n²         n²
Quick-Sort       n log n    n log n    n²

Figure14: Table shows the algorithms in Big-O notation.

n      O(log N)    O(N)    O(N log N)    O(N²)
10     1.00        10      10.00         100
20     1.30        20      26.02         400
40     1.60        40      64.08         1600
80     1.90        80      152.25        6400
160    2.20        160     352.66        25600
320    2.51        320     801.65        102400

Figure15: The second table shows the value of each Big-O function after applying the calculations.
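The values in Figure15 appear to have been calculated with a base-10 logarithm; a short Java sketch that reproduces them:

```java
// Reproduces the Figure15 values: log N, N, N log N and N squared (base-10 logarithm assumed).
public class BigOTable {

    public static void main(String[] args) {
        int[] sizes = {10, 20, 40, 80, 160, 320};
        System.out.println("n\tO(log N)\tO(N)\tO(N log N)\tO(N^2)");
        for (int n : sizes) {
            double log = Math.log10(n);
            System.out.printf("%d\t%.2f\t%d\t%.2f\t%d%n",
                    n, log, n, n * log, (long) n * n);
        }
    }
}
```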

Figure16: the graph shows the Big-O notation growth-rate.
