Monday 31 March 2014

Week 11 - Sorting and Efficiency

Sorting algorithms seem to be the go-to example when doing runtime analysis and talking about a program's efficiency. It makes sense, I suppose, since they provide examples of runtimes ranging from n log n to n^2, where n is the length of the list. Of course, there are other types of algorithms with all sorts of runtimes, but those are not the focus this week.

The most basic sorting algorithms are bubblesort, insertion sort, and selection sort. All three of these algorithms have doubly-nested loops, each of which runs for up to n steps. This results in a worst-case runtime of approximately n^2 steps. As such, these algorithms run into trouble on large lists, but they are simple to code and work decently well for (very) small inputs.
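For illustration, here is a quick selection sort sketch (my own, not course code) showing the doubly-nested loop structure that gives the n^2 behaviour:

```python
def selection_sort(lst: list) -> list:
    """Sort lst in place by repeatedly selecting the smallest remaining item."""
    for i in range(len(lst)):             # outer loop: n passes
        min_index = i
        for j in range(i + 1, len(lst)):  # inner loop: up to n comparisons
            if lst[j] < lst[min_index]:
                min_index = j
        lst[i], lst[min_index] = lst[min_index], lst[i]
    return lst
```

Two loops of up to n steps each, so roughly n * n = n^2 steps in the worst case.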

Far better sorting algorithms are mergesort and quicksort, which are both recursive algorithms that run in n log n time. Well, actually, quicksort's worst case is n^2 if the pivots are chosen badly, but this can be largely avoided by selecting pivot points randomly, unless RNG hates you.
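A rough mergesort sketch (again my own illustration): split the list in half, sort each half recursively, then merge. There are about log n levels of splitting, and the merging at each level takes about n steps, which is where n log n comes from.

```python
def mergesort(lst: list) -> list:
    """Return a sorted copy of lst."""
    if len(lst) <= 1:
        return lst[:]
    mid = len(lst) // 2
    left, right = mergesort(lst[:mid]), mergesort(lst[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge step: linear in n
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]    # one side may have leftovers
```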

Sunday 23 March 2014

Week 10 - Near the end

One of the more difficult things in programming appears to be finding good test cases for one's functions. Good testing is, of course, essential for ensuring that one's program functions properly and behaves the way that it should. The issue, then, is deciding how to test it. The basic idea is rather simple: test cases should cover all the different types of inputs or situations. Yet like many things, this is far easier said than done. There are situations and inputs that one may not have thought of, or a certain case may be difficult to actually test. All of this makes testing sound far easier than it actually is.
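To make "cover all the different types of inputs" concrete, here is a toy example (the function and the cases are entirely my own invention): even for something as simple as finding the largest item in a list, there are several distinct situations worth checking.

```python
def my_max(lst: list) -> int:
    """Return the largest item in a non-empty list."""
    result = lst[0]
    for item in lst[1:]:
        if item > result:
            result = item
    return result

# Cases chosen to cover different "types" of input, not just the easy one:
assert my_max([3]) == 3            # single element
assert my_max([1, 2, 3]) == 3      # maximum at the end
assert my_max([3, 2, 1]) == 3      # maximum at the start
assert my_max([-5, -2, -9]) == -2  # all negative
assert my_max([2, 2, 2]) == 2      # duplicates
```

And even then, I have not tested the empty list, which this function would crash on. Which is exactly the point.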

Also, couldn't quite figure out the lab this week. Need to think more about this.

Monday 17 March 2014

Week 9

Exercise 3 went by rather easily as long as one thinks recursively, although it took a bit of thinking, particularly in the case of a two-element tree. In that situation, I resorted to a somewhat convoluted, semi-manual strategy.

Assignment 2 part 2 also ended up a little convoluted when it came to working with bar and dot regexes (is that the proper plural?), although it was not too complicated. It helped that two totally separate functions ended up with essentially the same structure. In total, I spent about 5 hours on it.

Monday 10 March 2014

Week 8 - No Longer Considered Late (by definition)

From now on, work posted on the Monday of the following week will no longer be considered late, by my new definition of the term.

The focus of the past week has been part of an assignment based around the creation of a tree structure, and an exercise to create and analyze trees. Both went by without considerable difficulty. The tree exercises did, however, emphasize the importance of thinking about recursive problems recursively, rather than trying to find a closed-form solution for them.

Monday 3 March 2014

Week 7 - Recursion (Slightly late again)

The topic of this week is recursion. To sum it up simply, recursion is the process of a function calling itself, although it can more accurately be described as repeatedly performing the same action on smaller and smaller pieces of a certain object. In computer science, recursion and loops are generally seen as two ways of doing the same job in most cases. Unlike with loops, however (unless somebody uses "while True"), one must be careful with recursion to make sure there is a stop condition in the function. If one doesn't, they run the risk of running out of stack space, which, suffice to say, may cause issues. For some situations, such as when one is going over a nested list of unknown depth, recursively going over the list is far simpler than using loops.
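The nested-list case can be sketched in just a few lines (my own example, not from lecture):

```python
def nested_sum(obj) -> int:
    """Return the sum of all numbers in a possibly-nested list."""
    if isinstance(obj, list):
        # Recursive case: sum the result of recursing on each item.
        return sum(nested_sum(item) for item in obj)
    # Base case: a plain number stops the recursion.
    return obj
```

Doing the same with loops would require manually managing a stack of lists still to be visited, which is far messier.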

To demonstrate recursion: if I need to find out the minimum number of moves needed to complete a Tower of Hanoi game with n blocks and 4 pegs, I can write:

def best_split(n: int) -> tuple:
    """Return the minimum number of moves for n blocks, along with the
    optimal value to split the tower at."""
    if n == 1:
        return (1, 1)
    choices = list(range(1, n))
    # For each split i: move the top n - i blocks aside and back (recursively),
    # plus 2 ** i - 1 moves for the bottom i blocks three-peg style.
    num_moves = [2 * best_split(n - i)[0] + 2 ** i - 1 for i in choices]
    optimal_i = choices[num_moves.index(min(num_moves))]
    return (min(num_moves), optimal_i)

Note that this function also returns the best value to split the tower at, but that is largely irrelevant to the current discussion. As can be seen, the function calls itself inside the list comprehension, and the n == 1 base case stops the recursion. Both are necessary for a recursive function.
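One aside of my own, not part of the exercise: the naive version recomputes the same subproblems over and over, so it slows down quickly as n grows. Caching the results fixes that. A sketch of the same recurrence using functools.lru_cache:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def best_split_cached(n: int) -> tuple:
    """Return (minimum moves, optimal split) for n blocks on 4 pegs,
    caching results so each subproblem is solved only once."""
    if n == 1:
        return (1, 1)
    choices = list(range(1, n))
    num_moves = [2 * best_split_cached(n - i)[0] + 2 ** i - 1 for i in choices]
    optimal_i = choices[num_moves.index(min(num_moves))]
    return (min(num_moves), optimal_i)
```

With the cache, there are only n distinct calls instead of an exponential number of repeated ones.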

Recursion has, at this point, permeated so much into my life that I think about iterative problems recursively.

Sunday 23 February 2014

Week 7 - Nothing to report

Reading week. Have a test and an assignment coming up, but those can wait. All quiet on the western front. Got an interesting question from a younger friend regarding breadth-first searching, though. Looking it up, it seems rather pointless. Will need to investigate further.
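For the record, a minimal sketch of what breadth-first search looks like (my own toy example, graph and all): visit neighbours level by level using a queue.

```python
from collections import deque

def bfs_order(graph: dict, start):
    """Return the order in which BFS visits nodes reachable from start.

    graph maps each node to a list of its neighbours.
    """
    visited = [start]
    queue = deque([start])
    while queue:
        node = queue.popleft()  # the queue gives level-by-level order
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.append(neighbour)
                queue.append(neighbour)
    return visited

# e.g. bfs_order({'a': ['b', 'c'], 'b': ['d'], 'c': ['d']}, 'a')
# visits both of a's neighbours before reaching d.
```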

Saturday 15 February 2014

Week 6 (apparently)

The topic once again falls back to recursion. More specifically, Assignment 1. After experimenting with a few different ways to solve the Tower of Hanoi with four stools in the fewest moves possible (with varying levels of success), I managed to find, with great difficulty, the minimum number of moves and the optimal split value i recursively. Unfortunately, however, in this configuration I was not able to actually move the cheeses in any stacks totalling more than ten blocks without attempting an illegal move, something I was able to accomplish with more basic splitting methods. It certainly did not help that my computer decided to die on the due date, but fortunately I had already submitted the files beforehand. And people wonder why I hate computers....
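In hindsight, the structure I was aiming for looks something like this (a sketch of the standard approach, not my submitted code; the stool names and the fixed split are my own simplification): move the top n - i cheeses aside using all four stools, move the bottom i with the classic three-stool method, then move the n - i back on top.

```python
def hanoi3(n, source, dest, spare, moves):
    """Classic three-stool Tower of Hanoi; appends (source, dest) pairs."""
    if n == 0:
        return
    hanoi3(n - 1, source, spare, dest, moves)
    moves.append((source, dest))
    hanoi3(n - 1, spare, dest, source, moves)

def hanoi4(n, source, dest, spare1, spare2, moves):
    """Four-stool move sketch with a naive fixed split (only optimal for
    small n; the assignment pairs this with the computed optimal split)."""
    if n == 1:
        moves.append((source, dest))
        return
    i = 1 if n == 2 else 2                # naive split value
    hanoi4(n - i, source, spare1, spare2, dest, moves)  # top n - i aside
    hanoi3(i, source, dest, spare2, moves)              # bottom i, 3 stools
    hanoi4(n - i, spare1, dest, source, spare2, moves)  # top n - i back

def count_moves(n):
    """Helper: number of moves hanoi4 makes for an n-cheese stack."""
    moves = []
    hanoi4(n, 'A', 'D', 'B', 'C', moves)
    return len(moves)
```

With the fixed split of 2 this matches the optimal counts for small stacks (1, 3, 5, 9 moves for n = 1 to 4), but for larger n the split value really does need to come from the recursive calculation.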