# Captain's Log: Introduction to Algorithm Analysis, Spring 2014

1: January 22 (Wednesday)
• go through syllabus
• emphasize that the task of an algorithm is to implement a specification
• specify the sorting and the selection problems.
Cormen reading: Chapter 1.
2: January 24 (Friday)
Using the insertion sort method as example, we
• illustrate how to design algorithms by the top-down approach
• show how to improve the implementation by applying the bottom-up technique
• give some simple complexity estimates.
Cormen reading: Sections 2.1-2.2.
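A minimal iterative insertion sort in Python (the course did not prescribe a language; the names are mine), with the outer-loop invariant noted as a comment:

```python
def insertion_sort(a):
    """Sort list a in place; returns a for convenience."""
    for j in range(1, len(a)):
        # Invariant: a[0..j-1] is sorted and contains the original
        # first j elements.
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]   # shift larger elements one slot right
            i -= 1
        a[i + 1] = key
    return a
```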
3: January 27 (Monday)
Using various implementations of insertion sort as illustrations, we address how to prove (the topic of Assignment 1) the correctness of
• a recursive algorithm (this requires induction), cf. Section 2.2 in Howell
• an iterative algorithm (this requires a loop invariant), cf. Cormen p.19 or Howell Sect 2.3.
4: January 29 (Wednesday)
We further discuss Assignment 1, due Thursday night:
• for the recursive implementation, we do a correctness proof for a slightly different problem;
• for the iterative implementation, we encourage exiting a loop only through its test becoming false.
We then finish the correctness proof of iterative insertion sort, treating also the inner loop for which it is not obvious how to find an invariant that is strong enough.
5: January 31 (Friday)
• Discuss model solutions for Assignment 1 (uploaded on K-State Online).
• Introduce the "Dutch National Flag" problem, cf. Section 2.4 in Howell, argue it is useful in a number of contexts, and develop a linear-time constant-space algorithm.
• Motivate asymptotic analysis, cf. Cormen Chapter 3.
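The Dutch National Flag partition sketched in Python (one possible formulation, not the lecture's exact code): one pass, constant extra space.

```python
def dutch_flag(a, pivot):
    """Partition a in place into three regions: < pivot, == pivot, > pivot.

    Returns the boundaries (lo, hi) of the == region.
    """
    lo, mid, hi = 0, 0, len(a)
    while mid < hi:
        if a[mid] < pivot:
            a[lo], a[mid] = a[mid], a[lo]
            lo += 1
            mid += 1
        elif a[mid] == pivot:
            mid += 1
        else:
            hi -= 1
            a[mid], a[hi] = a[hi], a[mid]   # unexamined element swapped in
    return lo, hi
```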
6: February 3 (Monday)
We introduce the theory of asymptotic notation, cf. Cormen Chapter 3, and
• introduce the "Big-O" symbol
• reason about the properties of Big-O
• introduce "Big-Omega" and "Big-Theta".
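The three symbols can be stated compactly (standard formulations, cf. Cormen Chapter 3):

```latex
f(n) \in O(g(n)) \iff \exists c > 0,\ n_0:\ \forall n \ge n_0,\ f(n) \le c \cdot g(n)
f(n) \in \Omega(g(n)) \iff \exists c > 0,\ n_0:\ \forall n \ge n_0,\ f(n) \ge c \cdot g(n)
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n))
```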
February 5 (Wednesday)
University closed due to snow.
7: February 7 (Friday)
Finish the theory of asymptotic notation:
• Reason more about the properties of Big-O (and Big-Theta)
• Introduce "little-o" and "little-omega", and reason about how the various symbols interact
Discuss Assignment 2, due Monday night (where this web site may come in handy).
8: February 10 (Monday)
We address the complexity of iterative algorithms where we need to estimate sums:
• Upper approximations are straightforward
• Lower approximations require that the functions are smooth.
• Also introduce the notion of a "barometer instruction" which is executed at least as frequently as any other instruction and therefore may serve to easily estimate running time.
• Discuss Assignment 3, due later this week.
Assignment 2 is due tonight.
Last day for 100% refund.
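The "barometer instruction" idea can be illustrated on a small made-up example: in the nested loop below, the comparison runs sum over i of (n-1-i) = n(n-1)/2 times, at least as often as anything else, so the running time is Theta(n^2).

```python
def count_inversions(a):
    """Count pairs (i, j) with i < j and a[i] > a[j], by brute force."""
    count = 0
    for i in range(len(a)):
        for j in range(i + 1, len(a)):
            if a[i] > a[j]:   # this comparison is the barometer instruction
                count += 1
    return count
```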
9: February 12 (Wednesday)
• Mention various pitfalls in algorithm analysis:
• It may be that not all iterations can exhibit worst-case behavior
• Numbers may become so big that they cannot realistically be handled in constant time
• Along the way, show how to transform the recursive (and inefficient) fibonacci function into an iterative (and efficient) version.
• See more examples of how to analyze nested loops.
Discuss model solutions for Assignment 2, Q1.
Further discuss Assignment 3, due tomorrow night.
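The fibonacci transformation sketched in Python (my own rendering of the standard step): the recursive version repeats work exponentially; the iterative version carries the last two values along.

```python
def fib_rec(n):
    """Exponential-time recursion (for contrast)."""
    return n if n < 2 else fib_rec(n - 1) + fib_rec(n - 2)

def fib_iter(n):
    """Linear-time iteration over the same recurrence."""
    a, b = 0, 1          # a = F(0), b = F(1)
    for _ in range(n):
        a, b = b, a + b  # slide the window one step
    return a
```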
10: February 14 (Friday)
Address general recurrences:
• they show up when analyzing divide and conquer algorithms
• to check a proposed solution, the substitution method can be used.
Cormen reading: Section 4.3.
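A worked substitution step, for the standard example T(n) = 2T(n/2) + n with guess T(n) <= c n lg n (cf. Cormen 4.3):

```latex
T(n) = 2\,T(n/2) + n
     \le 2\,c\,(n/2)\lg(n/2) + n
     = c\,n\lg n - c\,n + n
     \le c\,n\lg n \quad \text{for } c \ge 1.
```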
11: February 17 (Monday)
We give a general recipe, the "Master Theorem", for solving recurrences:
• we give a (very) informal proof
• we illustrate by several examples
• Cormen reading: Sections 4.4 & 4.5.
Discuss the upcoming Assignment 4.
Discuss model solutions for Assignment 2, Q2 and Q3, also proving (cf. Q1) that log(n!) is in Theta(n log(n)).
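For reference, the standard statement for T(n) = a T(n/b) + f(n), with a >= 1 and b > 1 (cf. Cormen 4.5):

```latex
T(n) = \begin{cases}
  \Theta(n^{\log_b a})       & \text{if } f(n) = O(n^{\log_b a - \epsilon}) \text{ for some } \epsilon > 0 \\
  \Theta(n^{\log_b a}\lg n)  & \text{if } f(n) = \Theta(n^{\log_b a}) \\
  \Theta(f(n))               & \text{if } f(n) = \Omega(n^{\log_b a + \epsilon}) \text{ and } a\,f(n/b) \le c\,f(n) \text{ for some } c < 1
\end{cases}
```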
12: February 19 (Wednesday)
Elaborate on the Master Theorem:
• we show how recurrences can sometimes be twisted, by "change of variables", to fit the Master Theorem template.
• we give an example where Howell's Theorem 3.32 gives better precision than the Master Theorem from the slides.
Introduce graphs, cf. Cormen Appendix B.4, which
• are either directed or undirected
• may be acyclic and/or connected
13: February 21 (Friday)
• Elaborate on trees, and rooted trees, cf. Cormen Appendix B.5.
• Discuss how graphs can be represented by an adjacency matrix or by adjacency lists, cf Cormen 22.1.
• Discuss model solutions for Assignment 3.
• Argue that (comparison-based) sorting requires at least lg(n!) = Theta(n lg(n)) comparisons, so it cannot be done faster than O(n lg(n)).
Extension: Assignment 4 due tonight.
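The two representations from Cormen 22.1, sketched in Python for a small made-up directed graph:

```python
# A directed graph on vertices 0..3 with edges 0->1, 0->2, 1->2, 2->3.
n = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Adjacency matrix: O(n^2) space, O(1) edge lookup.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[u][v] = 1

# Adjacency lists: O(n + m) space, fast iteration over neighbors.
adj = [[] for _ in range(n)]
for u, v in edges:
    adj[u].append(v)
```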
14: February 24 (Monday)
Review session for the upcoming midterm exam:
• Discuss model solutions for Assignment 4
• Discuss relevant exam questions from last year (uploaded on K-State Online)
15: February 26 (Wednesday)
Exam I.
16: February 28 (Friday)
Brief summary of general principles for Data Structures:
• correctness methodology
• immutable structures (lists).
Introduce the heap data structure
• use heaps to implement priority queues
• show an efficient representation of heaps
Cormen: Sections 6.1, 6.2, 6.5.
17: March 3 (Monday)
Continue heaps:
• show how to convert an array into a heap
• and use this to develop heap sort, an efficient and in-place sorting algorithm.
Cormen: Sections 6.3 & 6.4.
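Heap sort sketched in Python (a compact 0-indexed rendering, not the lecture's exact pseudocode): build a max-heap bottom-up, then repeatedly swap the maximum to the back.

```python
def sift_down(a, i, size):
    """Restore the max-heap property at index i (children 2i+1, 2i+2)."""
    while True:
        largest, l, r = i, 2 * i + 1, 2 * i + 2
        if l < size and a[l] > a[largest]:
            largest = l
        if r < size and a[r] > a[largest]:
            largest = r
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def heap_sort(a):
    """In-place O(n lg n) sorting via a max-heap."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # build-heap: O(n) overall
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]       # move current max to the back
        sift_down(a, 0, end)
    return a
```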
Yang Xue's office hours are Tuesday, 3:30pm-5pm.
18: March 5 (Wednesday)
We introduce union/find structures (cf Cormen Chapter 21, Sections 1-3):
• they are useful for representing equivalence classes
• we can implement the operations to run in time O(lg(n)).
Further discuss Assignment 5, due tomorrow night.
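A union/find sketch in Python with union by rank and path compression (names are mine; cf. Cormen Chapter 21):

```python
class UnionFind:
    """Disjoint sets over elements 0..n-1."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x, y):
        """Merge the sets of x and y; returns False if already merged."""
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx              # attach lower-rank tree under higher
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```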
19: March 7 (Friday)
Finish Union/Find structures:
• it is possible to obtain an "amortized" cost per operation (averaged over a whole sequence of operations) which is almost constant.
Explore the Divide and Conquer paradigm, with motivating examples:
• merge sort (Cormen Section 2.3)
• quicksort (Cormen Section 7.1)
Discuss and hand back graded Exam 1.
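Merge sort, the first of the two motivating examples, as a Python sketch; the recurrence T(n) = 2T(n/2) + Theta(n) gives Theta(n lg n).

```python
def merge_sort(a):
    """Return a sorted copy of a, by divide and conquer."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```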
20: March 10 (Monday)
• Discuss the submitted Assignment 5.
• Discuss the upcoming Assignment 6.
• Discuss how to properly design Quicksort (Cormen Section 7.2), with pivot selection being the key part.
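A quicksort sketch in Python using one simple pivot strategy, a uniformly random pivot (which keeps the expected time at O(n lg n)); the partition is Lomuto-style, not necessarily the lecture's version.

```python
import random

def quicksort(a, lo=0, hi=None):
    """Sort a[lo..hi] in place; returns a for convenience."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    p = random.randint(lo, hi)
    a[p], a[hi] = a[hi], a[p]        # move random pivot to the end
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):          # Lomuto partition
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]        # pivot lands in its final position
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)
    return a
```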
21: March 12 (Wednesday)
Continue the Divide & Conquer paradigm: we see clever applications that allow
• multiplication of n-bit integers (or polynomials of degree n) in time O(n^b) where b is less than 2 (and may actually be arbitrarily close to 1);
• multiplication of n x n matrices in time O(n^b) where b is much less than 3 (Cormen Section 4.2).
Further discuss Assignment 6, due tomorrow night.
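The integer-multiplication case can be sketched as Karatsuba's algorithm: three half-size products instead of four, giving O(n^lg 3), roughly O(n^1.585) (a sketch on Python integers, base case chosen arbitrarily).

```python
def karatsuba(x, y):
    """Multiply nonnegative integers with three recursive products."""
    if x < 16 or y < 16:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)   # split x = xh * 2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)
    hh = karatsuba(xh, yh)
    ll = karatsuba(xl, yl)
    # (xh + xl)(yh + yl) - hh - ll equals the cross terms xh*yl + xl*yh.
    mid = karatsuba(xh + xl, yh + yl) - hh - ll
    return (hh << (2 * m)) + (mid << m) + ll
```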
22: March 14 (Friday)
Mention the quests to
• improve the complexity of multiplying large matrices
• improve the upper bound for the number of flips necessary to solve the "pancake problem".
One more application of the Divide and Conquer paradigm:
• the selection problem which has a clever solution that always runs in linear time (Cormen Section 9.3).
We discuss model solutions for Assignment 6, Q1 & Q2.
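The linear-time selection algorithm (median of medians, Cormen 9.3) sketched in Python; this version copies sublists for clarity rather than partitioning in place.

```python
def select(a, k):
    """Return the k-th smallest (0-indexed) element of list a."""
    if len(a) <= 5:
        return sorted(a)[k]
    # Median of each group of 5, then recursively find the median of medians.
    medians = [sorted(a[i:i + 5])[len(a[i:i + 5]) // 2]
               for i in range(0, len(a), 5)]
    pivot = select(medians, len(medians) // 2)
    lo = [x for x in a if x < pivot]
    eq = [x for x in a if x == pivot]
    hi = [x for x in a if x > pivot]
    if k < len(lo):
        return select(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return select(hi, k - len(lo) - len(eq))
```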
March 17-21
Spring break.
23: March 24 (Monday)
• Discuss model solutions for Assignment 6, Q3.
• Discuss the upcoming Assignment 7.
• Briefly motivate dynamic programming, the topic of Cormen 15, by giving examples of how to avoid duplicated computations.
24: March 26 (Wednesday)
Introduce dynamic programming, the topic of Cormen 15, which often allows an efficient implementation of exhaustive search.
• First example: giving optimal change (cf. Howell 12.1)
Briefly further discuss Assignment 7, due tomorrow night.
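The optimal-change example as a bottom-up dynamic program in Python (my formulation, minimizing the number of coins; cf. Howell 12.1):

```python
def min_coins(coins, amount):
    """Minimum number of coins summing to amount (None if impossible)."""
    INF = float("inf")
    best = [0] + [INF] * amount       # best[a] = fewest coins for amount a
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return None if best[amount] == INF else best[amount]
```

Note that a greedy strategy fails here: with coins [1, 6, 10] and amount 12, greedy takes 10 + 1 + 1 while the optimum is 6 + 6.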
25: March 28 (Friday)
One more application of Dynamic Programming:
• the binary knapsack problem (Howell 12.4), given as Exercise 16.2-2 in Cormen and with solution listed on page 50 here.
Discuss model solutions for Assignment 7.
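The binary (0/1) knapsack dynamic program sketched in Python, using a one-dimensional table updated right-to-left so each item is used at most once:

```python
def knapsack(weights, values, capacity):
    """Maximum total value of a subset of items fitting in capacity."""
    dp = [0] * (capacity + 1)         # dp[w] = best value with weight <= w
    for wt, val in zip(weights, values):
        for w in range(capacity, wt - 1, -1):   # descending: item used once
            dp[w] = max(dp[w], dp[w - wt] + val)
    return dp[capacity]
```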
26: March 31 (Monday)
Continue Dynamic Programming:
• Discuss Assignment 8, due Thursday night.
• Present Floyd's all-pair shortest paths algorithm (Cormen 25.2)
27: April 2 (Wednesday)
Continue Dynamic Programming:
• Analyze and illustrate Floyd's all-pair shortest paths algorithm (Cormen 25.2)
• Motivate the problem of chained matrix multiplication (Cormen 15.2)
Further discuss Assignment 8, due tomorrow night.
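Floyd's all-pairs algorithm is short enough to quote in full (a matrix-based Python sketch; inf marks a missing edge):

```python
def floyd(dist):
    """All-pairs shortest paths on an n x n distance matrix.

    After iteration k, dist[i][j] is the shortest i-to-j path using
    only intermediate vertices from {0, ..., k}. O(n^3) time, in place.
    """
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```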
April 4 (Friday)
Class canceled (All-University Open House)
28: April 7 (Monday)
Review session for the upcoming midterm exam:
• Discuss model solutions for Assignment 8
• Discuss exam questions from last year (uploaded on K-State Online)
Use dynamic programming to find optimal solution for chained matrix multiplication (Cormen 15.2)
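The chained matrix multiplication recurrence in Python (cost table only, without the split-point bookkeeping; cf. Cormen 15.2):

```python
def matrix_chain(dims):
    """Minimum scalar multiplications to compute A1 * ... * An,
    where Ai has dimensions dims[i-1] x dims[i]."""
    n = len(dims) - 1
    m = [[0] * n for _ in range(n)]   # m[i][j] = best cost for Ai..Aj
    for length in range(2, n + 1):    # solve shorter chains first
        for i in range(n - length + 1):
            j = i + length - 1
            m[i][j] = min(m[i][k] + m[k + 1][j]
                          + dims[i] * dims[k + 1] * dims[j + 1]
                          for k in range(i, j))
    return m[0][n - 1]
```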
29: April 9 (Wednesday)
Exam II.
30: April 11 (Friday)
We embark on greedy algorithms, the topic of Cormen 16:
• motivate the concept
• show a greedy strategy for minimizing total waiting time
• show a greedy strategy for the fractional knapsack problem (cf. Cormen Exercise 16.2-1)
• illustrate Kruskal's algorithm for constructing a minimum-cost spanning tree, cf. Cormen 23
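Kruskal's algorithm sketched in Python, with a small inline union/find (returning only the total MST weight; cf. Cormen 23.2):

```python
def kruskal(n, edges):
    """Weight of a minimum spanning tree of vertices 0..n-1.

    edges is a list of (weight, u, v) triples; the graph is assumed connected.
    """
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    total = 0
    for w, u, v in sorted(edges):           # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                        # keep edge iff it joins two trees
            parent[ru] = rv
            total += w
    return total
```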
31: April 14 (Monday)
Continue algorithms for finding minimum-cost spanning trees:
• sketch a correctness proof, cf. Cormen Theorem 23.1
• analyze the complexity of Kruskal's algorithm, cf. Cormen 23.2
• present Prim's algorithm, and analyze its complexity, cf. Cormen 23.2
Discuss the upcoming Assignment 9
32: April 16 (Wednesday)
• Present Dijkstra's algorithm for single-source shortest paths, cf. Cormen 24.3
• Further discuss Assignment 9, due tomorrow night.
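Dijkstra's algorithm sketched in Python with a binary heap (the "lazy deletion" variant, which skips stale queue entries rather than decreasing keys):

```python
import heapq

def dijkstra(adj, source):
    """Shortest distances from source, for nonnegative edge weights.

    adj[u] is a list of (v, weight) pairs; returns a distance list.
    """
    dist = [float("inf")] * len(adj)
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                        # stale entry; already improved
        for v, w in adj[u]:
            if d + w < dist[v]:             # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist
```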
33: April 18 (Good Friday)
More applications of the greedy paradigm:
• Scheduling of events with fixed start/end time (where the most obvious greedy strategy doesn't work), cf. Cormen 16.1
• Huffman codes, cf. Cormen 16.3
Discuss model solutions for Assignment 9, and motivate the upcoming Assignment 10.
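The event-scheduling example sketched in Python: the greedy strategy that works is to sort by finishing time and repeatedly take the earliest-finishing compatible event (cf. Cormen 16.1).

```python
def select_activities(intervals):
    """Maximum-size set of pairwise non-overlapping (start, finish) pairs."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda p: p[1]):
        if start >= last_finish:            # compatible with what's chosen
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```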
34: April 21 (Monday)
Introduce depth-first traversal of graphs, directed as well as undirected (Cormen 22.3).
• As a result, we can view a graph as a rooted tree together with certain kinds of extra edges.
Discuss the upcoming Assignment 10.
35: April 23 (Wednesday)
See several graph problems that can be solved by algorithms that build on depth-first traversal:
• Topological sort, cf. Cormen 22.4.
• Find "articulation points", cf. Cormen Problem 22-2.
Assignment 10 due tomorrow night.
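Topological sort via depth-first traversal, sketched in Python: reversed DFS finishing order, assuming the input is a DAG (cf. Cormen 22.4).

```python
def topological_sort(adj):
    """Order the vertices of a DAG so every edge goes left-to-right.

    adj[u] is the list of successors of vertex u.
    """
    n = len(adj)
    seen = [False] * n
    order = []
    def dfs(u):
        seen[u] = True
        for v in adj[u]:
            if not seen[v]:
                dfs(v)
        order.append(u)                     # u finishes after its successors
    for u in range(n):
        if not seen[u]:
            dfs(u)
    return order[::-1]
```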
36: April 25 (Friday)
• Discuss and hand back graded Exam 2.
• Discuss model solutions for Assignment 10.
• Illustrate with a larger example the algorithm, based on depth-first search, for finding articulation points.
37: April 28 (Monday)
Embark on Network Flow, cf. Cormen 26.1-3:
• introduce key concepts, cf. Cormen 26.1
• show that a naive algorithm doesn't always generate the maximal flow
• introduce the Ford-Fulkerson method, cf. Cormen 26.2.
Discuss how the "Nearest Neighbor" heuristics may be arbitrarily bad for the Traveling Salesman Problem even when the triangle inequality holds.
38: April 30 (Wednesday)
Continue Network Flow:
• Recall the Ford-Fulkerson method, and introduce the Edmonds-Karp optimization, cf. Cormen 26.2.
• Briefly introduce the notion of a cut in a flow network.
Discuss Question 1 in the upcoming Assignment 11.
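The Ford-Fulkerson method with the Edmonds-Karp rule (augment along a BFS-shortest path), sketched in Python on a capacity matrix that doubles as the residual network:

```python
from collections import deque

def max_flow(cap, s, t):
    """Maximum s-t flow; cap is an n x n capacity matrix, modified in place."""
    n = len(cap)
    flow = 0
    while True:
        parent = [-1] * n                   # BFS for an augmenting path
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow                     # no augmenting path remains
        bottleneck = float("inf")           # smallest residual capacity
        v = t
        while v != s:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:                       # push flow, open reverse edges
            u = parent[v]
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
            v = u
        flow += bottleneck
```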
39: May 2 (Friday)
Finish Network Flow:
• Show how to use the Ford-Fulkerson method for Maximum Bipartite Matching, cf. Cormen 26.3
• Show that the maximum flow equals the minimum cut, cf. Cormen's Theorem 26.6.
40: May 5 (Monday)
• Discuss Assignment 11, due tomorrow night.
• Mention that a subpart of an optimal solution is not necessarily an optimal solution to the subproblem.
• Discuss Q1-Q3 from Exam 1 for Spring 2012.
41: May 7 (Wednesday)
• Discuss model solutions for Assignment 11
• Discuss questions from Exams 1 (Q4), 2 and 3 for Spring 2012.
42: May 9 (Friday)
Discuss questions from the exams for
• Spring 2012: the final set
• Spring 2013: Q4 from set 2; Q4 & Q5 from final
May 15 (Thursday)
Final exam, 4:10-6:00pm

Torben Amtoft