Monday, 31 March 2014

Week 11 - Sorting and Efficiency

Sorting algorithms seem to be the go-to example for runtime analysis and discussions of a program's efficiency. It makes sense, I suppose, since they give concrete examples of runtimes ranging from n log(n) to n^2, where n is the length of the list. Of course, other kinds of algorithms can have all sorts of runtimes, but sorting makes for a natural, self-contained comparison.

The most basic sorting algorithms are bubblesort, insertion sort, and selection sort. All three use doubly-nested loops, each of which runs at most n steps, which results in a worst-case runtime of approximately n^2 steps. As such, they run into trouble on large lists, but they are simple to code and work decently well for (very) small inputs.
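As a sketch of what the doubly-nested-loop structure looks like, here is one possible insertion sort in Python (the function name and in-place design are just my choices for illustration):

```python
def insertion_sort(lst):
    """Sort lst in place in ascending order.

    The outer loop runs n - 1 times; the inner while loop can shift
    up to i elements, giving roughly n^2 steps in the worst case
    (a reverse-sorted list).
    """
    for i in range(1, len(lst)):
        key = lst[i]
        j = i - 1
        # Shift every larger element one slot to the right,
        # then drop key into the gap that opens up.
        while j >= 0 and lst[j] > key:
            lst[j + 1] = lst[j]
            j -= 1
        lst[j + 1] = key
```

On an already-sorted list the inner loop exits immediately, so insertion sort is actually linear in that best case, which is part of why it holds up fine on tiny or nearly-sorted inputs.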

Far better sorting algorithms are mergesort and quicksort, both recursive algorithms that run in n log(n) time: mergesort in the worst case, quicksort on average. Quicksort's worst case is actually n^2 (for example, a naive "first element" pivot on an already-sorted list), but this can be largely avoided by selecting pivot points at random, unless RNG hates you.
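A minimal sketch of quicksort with a random pivot might look like this (returning a new list rather than sorting in place, which is just a simplifying choice on my part):

```python
import random

def quicksort(lst):
    """Return a new sorted list.

    Picking the pivot at random makes the n^2 worst case very
    unlikely on any particular input, giving n log(n) expected time.
    """
    if len(lst) <= 1:
        return lst[:]
    pivot = random.choice(lst)
    # Partition into elements smaller than, equal to, and
    # larger than the pivot, then recursively sort the sides.
    smaller = [x for x in lst if x < pivot]
    equal = [x for x in lst if x == pivot]
    larger = [x for x in lst if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)
```

The recursion splits the list roughly in half on average, giving about log(n) levels of recursion with n steps of partitioning work per level, which is where the n log(n) comes from.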
