Friday, November 29, 2013

Sorting Algorithms

There are many different sorting algorithms that can be coded in a program, and each offers its own advantages and disadvantages. Efficiency is a measure of how a program's running time grows as the size of the input (n) increases, for example as a list gets longer.

First there is Selection Sort. This sorting method repeatedly selects the minimum value from the unsorted part of the list and swaps it into the first index, then finds the next minimum and swaps it into the second index, and so on. This sort becomes very inefficient as lists grow, because no matter what order the list is in, even if it is partially sorted, it still has to scan through all the remaining values on every pass. The efficiency of this sorting method is O(n^2).
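In Python, a minimal sketch of this idea might look like the following (the function name `selection_sort` is just my own choice):

```python
def selection_sort(lst):
    """Sort lst in place by repeatedly swapping the minimum of the
    unsorted tail into position. O(n^2) comparisons regardless of
    the input's initial order."""
    for i in range(len(lst)):
        # Find the index of the smallest value in lst[i:].
        min_index = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[min_index]:
                min_index = j
        # Swap it into position i.
        lst[i], lst[min_index] = lst[min_index], lst[i]
    return lst
```

Notice that the inner loop always scans the whole unsorted tail, which is exactly why a partially sorted input does not help.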

Secondly there is Quick Sort. This sorting method takes a divide and conquer approach: a pivot is picked (often at random), the elements smaller than the pivot go to its left and the larger ones to its right, and then each side is sorted recursively. This method is usually very efficient: its worst case is O(n^2), the same as Selection Sort, but its average and best cases are O(n log n), which is significantly faster!
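A simple sketch of that divide and conquer idea in Python could be (here I just take the first element as the pivot for simplicity, even though a random pivot is safer on already-sorted input):

```python
def quick_sort(lst):
    """Return a new sorted list using the divide and conquer approach:
    partition around a pivot, then recursively sort each side."""
    # Base case: lists of length 0 or 1 are already sorted.
    if len(lst) <= 1:
        return lst
    pivot = lst[0]
    # Elements smaller than the pivot go left, the rest go right.
    smaller = [x for x in lst[1:] if x < pivot]
    larger = [x for x in lst[1:] if x >= pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(larger)
```

With this fixed pivot choice, an already-sorted list triggers the O(n^2) worst case, which is why picking the pivot at random helps in practice.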

Lastly there is Merge Sort. This sorting method first breaks the list down into single elements, then merges them into sorted groups of two (L[0] + L[1], L[2] + L[3], etc.). After all the elements are merged into sorted groups of two, those groups are merged in the same way into sorted groups of four, and this keeps happening until only two lists are left, which are merged together into one. This method has a worst case of O(n log n), which makes it faster than Quick Sort in the worst case, since Quick Sort's worst case is O(n^2).
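The splitting and merging described above can be sketched in Python like this (`merge` and `merge_sort` are just my own names):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    result = []
    i = j = 0
    # Repeatedly take the smaller front element of the two lists.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One of the lists is exhausted; append whatever is left of the other.
    return result + left[i:] + right[j:]

def merge_sort(lst):
    """Split the list in half, sort each half recursively, then merge."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:])
)
```

Because the list is always split exactly in half, the O(n log n) bound holds no matter what order the input starts in.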

Thursday, November 21, 2013

Big O Notation

After learning a couple of the different sorts available in Python, at first I wondered why we were learning this, since it seemed obvious that we could choose any of them and they would work perfectly! But then came the reason: even though 1 second is not a big difference to us right now when running our code, when you have a lot of different parts each taking that 1 second, and all the other functions rely on them, it builds up considerably.

Dr. Heap then showed us different sorts and how they are labelled with Big O Notation, and it finally made sense when people online refer to their code using O(n) or something like that. Now I understand the importance of not only solving problems logically but also ensuring that their running time is kept to a minimum as the input size increases. Like the lab comparing Queues and Stacks.

Assignment Reflection

Finally completed the assignment that was due, and handed it in on time! Having read the instructions and listened to Dr. Heap, I was very scared approaching this assignment, as I had no idea what topic it was covering or how to approach it in the slightest. Looking back on it, however, I am very pleased with how it went, and having learned thoroughly about recursion helped significantly.

At the beginning, I compiled a list of cases from my own examples, as well as some from Piazza that were contributed by Dr. Heap and some from Chance. After I had these, I proceeded to write test cases for the program, beginning with the RegexTree construction. Once I approached the problem, it led to itself just being typed out; it seemed fairly straightforward, and I was actually surprised by how quickly I was able to code it up.

Then came the match section. That proved to be the challenging portion of the assignment, but I approached it the same way: thinking about cases and typing them up so I could track my progress from easy to expert examples. I approached the problem by first solving very simple RegexTrees, a.k.a. single RegexTreeNodes. Then I approached the intermediate areas such as single BarNodes, StarNodes and DotNodes. Lastly I made sure that all the No