Captain's Log: Introduction to Algorithm Analysis, Spring 2014

1: January 22 (Wednesday)


go through syllabus

emphasize that the task of an algorithm is to
implement a specification

specify the sorting and the selection problems.
Cormen reading: Chapter 1.

2: January 24 (Friday)

Using the insertion sort method as example,
we

illustrate how to design algorithms by the top-down
approach

show how to improve the implementation
by applying the bottom-up technique

give some simple complexity estimates.
Cormen reading: Sections 2.1-2.2.
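The insertion sort discussed in class can be sketched in Python (0-based indexing, whereas Cormen's pseudocode is 1-based):

```python
def insertion_sort(a):
    """Sort list a in place, as in Cormen's INSERTION-SORT (0-based)."""
    for j in range(1, len(a)):
        key = a[j]
        # Shift elements of the sorted prefix a[0..j-1] that exceed key
        # one slot to the right, then drop key into the gap.
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```

The outer loop maintains the property that a[0..j-1] is sorted, which is what the later correctness proofs build on.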

3: January 27 (Monday)

We address, and illustrate on
various implementations of insertion sort,
how to prove
(as is the topic of Assignment 1)
the correctness of

a recursive algorithm (this requires induction),
cf. Section 2.2
in Howell

an iterative algorithm (this requires a loop invariant),
cf. Cormen p. 19 or Howell Section 2.3.

4: January 29 (Wednesday)

We further discuss Assignment 1, due Thursday night:

for the recursive implementation, we do a correctness proof
for a slightly different problem;

for the iterative implementation, we
encourage that the only way to exit from a loop is
through its test being false.
We then finish the correctness proof of iterative insertion sort,
treating also the inner loop for which
it is not obvious how to find an invariant that is strong enough.

5: January 31 (Friday)


Discuss model solutions for Assignment 1 (uploaded on K-State Online).

Introduce the "Dutch National Flag" problem,
cf. Section 2.4 in
Howell,
argue it is useful in a number of contexts,
and
develop a linear-time constant-space algorithm.
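The linear-time constant-space algorithm can be sketched as a three-way partition (the function name and the [low, high] band interface are illustrative, not necessarily Howell's exact formulation):

```python
def dutch_flag(a, low, high):
    """Partition a into values < low, values in [low, high], values > high.
    One pass, constant extra space: Dijkstra's classic three-pointer scheme."""
    lo, mid, hi = 0, 0, len(a) - 1
    while mid <= hi:
        if a[mid] < low:          # belongs in the left ("red") region
            a[lo], a[mid] = a[mid], a[lo]
            lo += 1
            mid += 1
        elif a[mid] > high:       # belongs in the right ("blue") region
            a[mid], a[hi] = a[hi], a[mid]
            hi -= 1               # the swapped-in element is unexamined
        else:                     # middle ("white") band
            mid += 1
    return a
```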

Motivate asymptotic analysis, cf. Cormen Chapter 3.

6: February 3 (Monday)

We introduce the theory of asymptotic notation, cf. Cormen
Chapter 3, and

introduce the "Big-O" symbol

reason about the properties of Big-O

introduce
"Big-Omega" and
"Big-Theta".

February 5 (Wednesday)

University closed due to snow.

7: February 7 (Friday)

Finish the theory of asymptotic notation:

Reason more about the properties of Big-O
(and Big-Theta)

Introduce
"little-o" and
"little-omega", and
reason about how the various symbols interact
Discuss Assignment 2, due Monday night
(where this web site
may come in handy).

8: February 10 (Monday)

We address the complexity of iterative algorithms
where we need to estimate sums:

Upper approximations are straightforward

Lower approximations require that the
functions are smooth.

Also introduce the notion of a "barometer instruction"
which is executed at least as frequently as any other instruction
and therefore may serve to easily estimate running time.

Discuss Assignment 3, due later this week.
Assignment 2 is due tonight.
Last day for 100% refund.

9: February 12 (Wednesday)


Mention various pitfalls in algorithm analysis:

It may be that not all iterations can exhibit worst-case behavior

Numbers may become so big that they cannot realistically
be handled in constant time

Along the way,
show how to transform the recursive (and inefficient)
Fibonacci function into an iterative (and efficient) version.
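The transformation can be sketched as follows (function names are illustrative):

```python
def fib_rec(n):
    """Naive recursion: exponential time, since subproblems are recomputed."""
    return n if n < 2 else fib_rec(n - 1) + fib_rec(n - 2)

def fib_iter(n):
    """Iterative version: keep only the last two values, for a linear
    number of additions and constant space."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```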

See more examples of how to analyze nested loops.
Discuss model solutions for Assignment 2, Q1.
Further discuss Assignment 3, due tomorrow night.

10: February 14 (Friday)

Address general recurrences:

they show up when
analyzing divide and conquer algorithms

to check a proposed solution,
the substitution method can be used.
Cormen reading: Section 4.3.

11: February 17 (Monday)

We give a general recipe,
the "Master Theorem",
for solving recurrences:

we give a (very) informal proof

we illustrate by several examples

Cormen reading: Sections 4.4 & 4.5.
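For reference, the recipe in Cormen's formulation, for recurrences of the form T(n) = a T(n/b) + f(n) with a >= 1 and b > 1:

```latex
\begin{enumerate}
\item If $f(n) \in O(n^{\log_b a - \epsilon})$ for some $\epsilon > 0$,
      then $T(n) \in \Theta(n^{\log_b a})$.
\item If $f(n) \in \Theta(n^{\log_b a})$,
      then $T(n) \in \Theta(n^{\log_b a} \lg n)$.
\item If $f(n) \in \Omega(n^{\log_b a + \epsilon})$ for some $\epsilon > 0$,
      and $a\, f(n/b) \le c\, f(n)$ for some $c < 1$ and all large $n$,
      then $T(n) \in \Theta(f(n))$.
\end{enumerate}
```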
Discuss the upcoming Assignment 4.
Discuss model solutions for Assignment 2, Q2 and Q3,
also proving (cf. Q1) that log(n!) is in Theta(n log(n)).

12: February 19 (Wednesday)

Elaborate on the Master Theorem:

we show how recurrences can sometimes be twisted, by "change
of variables", to fit the Master Theorem template.

we give an example where
Howell's Theorem 3.32 gives better precision
than the Master Theorem from the slides.
Introduce graphs, cf. Cormen Appendix B.4, which

are either directed or undirected

may be acyclic and/or connected

13: February 21 (Friday)


Elaborate on trees, and rooted trees,
cf. Cormen Appendix B.5.

Discuss how graphs can be
represented by an adjacency matrix or by adjacency lists,
cf. Cormen 22.1.

Discuss model solutions for Assignment 3.

Argue that (comparison-based) sorting cannot be done
with fewer than lg(n!) comparisons, and lg(n!) is in Theta(n lg(n)).
Extension:
Assignment 4 due tonight.

14: February 24 (Monday)

Review session for the upcoming midterm exam:

Discuss model solutions for Assignment 4

Discuss relevant exam questions
from last year (uploaded on K-State Online)

15: February 26 (Wednesday)

Exam I.

16: February 28 (Friday)

Brief summary of general principles for Data Structures:

correctness methodology

immutable structures (lists).
Introduce the heap data structure

use heaps to implement priority queues

show an efficient representation of heaps
Cormen: Sections 6.1, 6.2, 6.5.

17: March 3 (Monday)

Continue heaps:

show how to convert an array into a heap

so as to develop an efficient and in-place sorting
algorithm, heap sort.
Cormen: Sections 6.3 & 6.4.
Yang Xue's office hours are Tuesday,
3:30pm-5pm.
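The conversion and the resulting sort can be sketched as follows (0-based array indexing, whereas Cormen's MAX-HEAPIFY and BUILD-MAX-HEAP are 1-based):

```python
def sift_down(a, i, n):
    """Restore the max-heap property at index i, assuming both subtrees
    of i are already heaps; children of i are 2*i+1 and 2*i+2."""
    while True:
        largest = i
        for c in (2 * i + 1, 2 * i + 2):
            if c < n and a[c] > a[largest]:
                largest = c
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def heap_sort(a):
    """In-place heapsort: build a max-heap bottom-up (O(n)), then
    repeatedly move the maximum to the end of the shrinking array."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end)
    return a
```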

18: March 5 (Wednesday)

We introduce union/find structures
(cf. Cormen Chapter 21, Sections 1-3):

they are useful for representing equivalence classes

we can implement the operations to run in time O(lg(n)).
Further discuss Assignment 5, due tomorrow night.
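A sketch of the structure (union by rank gives the O(lg(n)) bound; the path compression in find is what the amortized analysis of the next lecture exploits):

```python
class UnionFind:
    """Disjoint sets over elements 0..n-1 (cf. Cormen Chapter 21)."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path compression: make x and its ancestors point at the root.
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False
        if self.rank[rx] < self.rank[ry]:  # union by rank keeps trees shallow
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```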

19: March 7 (Friday)

Finish Union/Find structures:

it is possible to obtain an "amortized" (average)
cost per operation
which is almost in O(1).
Explore the Divide and Conquer paradigm,
with motivating examples:

merge sort (Cormen Section 2.3)

quicksort (Cormen Section 7.1)
Discuss and hand back graded Exam 1.

20: March 10 (Monday)


Discuss the submitted Assignment 5.

Discuss the upcoming Assignment 6.

Discuss how to properly design Quicksort (Cormen Section 7.2),
with pivot selection being the key part.

21: March 12 (Wednesday)

Continue the Divide & Conquer paradigm:
we see clever applications that allow

multiplication of n-bit integers
(or polynomials of degree n)
in time O(n^b) where b is less than 2 (and may actually be arbitrarily
close to 1);

multiplication of n x n matrices in time
O(n^b) where b is much less than 3
(Cormen Section 4.2).
Further discuss
Assignment 6, due tomorrow night.
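One instance of the integer-multiplication scheme is Karatsuba's: three recursive half-size products instead of four, giving b = lg 3, roughly 1.585. A sketch on Python integers:

```python
def karatsuba(x, y):
    """Multiply non-negative integers with three half-size products,
    running in O(n^lg 3) digit operations instead of the schoolbook O(n^2)."""
    if x < 10 or y < 10:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)   # split x = xh * 2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)
    hh = karatsuba(xh, yh)
    ll = karatsuba(xl, yl)
    # Both cross terms come from a single recursive product:
    mid = karatsuba(xh + xl, yh + yl) - hh - ll
    return (hh << (2 * m)) + (mid << m) + ll
```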

22: March 14 (Friday)

Mention the quests to

improve the complexity of multiplying large matrices

improve the upper bound for the number of flips necessary
to solve the "pancake problem".
One more application of the Divide and Conquer paradigm:

the selection problem
which has a clever solution that
always runs in linear time
(Cormen Section 9.3).
We discuss model solutions for Assignment 6, Q1 & Q2.

March 17-21

Spring break.

23: March 24 (Monday)


Discuss model solutions for Assignment 6, Q3.

Discuss the upcoming Assignment 7.

Briefly motivate
dynamic programming,
the topic of Cormen 15,
by giving examples of how to avoid duplicated computations.

24: March 26 (Wednesday)

Introduce
dynamic programming,
the topic of Cormen 15,
which
often allows an efficient implementation of
exhaustive search.

First example: giving optimal change
(cf. Howell 12.1)
Briefly further discuss Assignment 7, due tomorrow night.
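The change-making example can be sketched as a table over amounts (an illustrative interface; Howell's exact formulation in 12.1 may differ):

```python
def min_coins(denoms, amount):
    """Fewest coins summing to `amount`: tabulate every smaller amount once
    instead of exploring the exponential tree of choices."""
    INF = float('inf')
    best = [0] + [INF] * amount       # best[a] = fewest coins making a
    for a in range(1, amount + 1):
        for d in denoms:
            if d <= a and best[a - d] + 1 < best[a]:
                best[a] = best[a - d] + 1
    return best[amount]
```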

25: March 28 (Friday)

One more application of Dynamic Programming:

the binary knapsack problem
(Howell 12.4),
given as Exercise 16.2-2 in Cormen
and with solution listed on page 50
here.
Discuss model solutions for Assignment 7.
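The binary (0/1) knapsack recurrence can be sketched with a one-dimensional table (a common space-saving variant; the textbook versions tabulate over items and capacities in two dimensions):

```python
def knapsack(items, capacity):
    """0/1 knapsack: best[c] = maximum value achievable within capacity c.
    Iterating capacities downward ensures each item is taken at most once."""
    best = [0] * (capacity + 1)
    for weight, value in items:
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]
```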

26: March 31 (Monday)

Continue Dynamic Programming:

Discuss Assignment 8, due Thursday night.

Present Floyd's all-pairs shortest-paths
algorithm (Cormen 25.2)

27: April 2 (Wednesday)

Continue Dynamic Programming:

Analyze and illustrate Floyd's all-pairs shortest-paths algorithm
(Cormen 25.2)

Motivate the problem of chained matrix multiplication (Cormen 15.2)
Further discuss Assignment 8, due tomorrow night.
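Floyd's algorithm itself is just a triple loop over a distance matrix:

```python
def floyd(dist):
    """All-pairs shortest paths on an n x n matrix (inf = no edge), in place.
    After iteration k, dist[i][j] is the shortest path from i to j whose
    intermediate vertices all lie in {0, ..., k}."""
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```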

April 4 (Friday)

Class canceled (All-University Open House)

28: April 7 (Monday)

Review session for the upcoming midterm exam:

Discuss model solutions for Assignment 8

Discuss exam questions
from last year (uploaded on K-State Online)
Use dynamic programming to find
optimal solution for chained matrix multiplication (Cormen 15.2)

29: April 9 (Wednesday)

Exam II.

30: April 11 (Friday)

We embark on greedy algorithms,
the topic of Cormen 16:

motivate the concept

show a greedy strategy for minimizing total waiting time

show a greedy strategy for
the fractional knapsack problem
(cf. Cormen Exercise 16.2-1)

illustrate Kruskal's algorithm
for constructing a
minimum-cost spanning tree, cf.
Cormen 23
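Kruskal's algorithm can be sketched as follows (with a minimal union/find inlined; in practice one would use the full structure from Cormen Chapter 21):

```python
def kruskal(n, edges):
    """Minimum-cost spanning tree over vertices 0..n-1: scan edges in order
    of weight, keeping an edge iff its endpoints lie in different components.
    Each edge is given as a (weight, u, v) triple."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    mst, total = [], 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components: keep it
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return total, mst
```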

31: April 14 (Monday)

Continue algorithms for finding minimum-cost spanning trees:

sketch a correctness proof, cf. Cormen Theorem 23.1

analyze the complexity of Kruskal's algorithm,
cf. Cormen 23.2

present Prim's algorithm, and analyze its complexity,
cf. Cormen 23.2
Discuss the upcoming Assignment 9

32: April 16 (Wednesday)


Present Dijkstra's algorithm for single-source shortest paths,
cf. Cormen 24.3

Further discuss Assignment 9, due tomorrow night.
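A sketch of Dijkstra's algorithm using Python's heapq as the priority queue (stale queue entries are skipped lazily, instead of the decrease-key operation in Cormen's presentation):

```python
import heapq

def dijkstra(adj, source):
    """Single-source shortest paths for non-negative edge weights;
    adj[u] is a list of (v, weight) pairs."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue                  # stale entry: u was already settled
        for v, w in adj.get(u, []):
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w       # relax edge (u, v)
                heapq.heappush(pq, (dist[v], v))
    return dist
```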

33: April 18 (Good Friday)

More applications of the greedy paradigm:

Scheduling of events with fixed start/end time
(where the most obvious greedy strategy
doesn't work), cf. Cormen 16.1

Huffman codes, cf. Cormen 16.3
Discuss model solutions for Assignment 9,
and motivate the upcoming Assignment 10.

34: April 21 (Monday)

Introduce depth-first traversal of graphs, directed as well
as undirected (Cormen 22.3).

As a result, we can view a graph as a rooted tree together with
certain kinds of extra edges.
Discuss the upcoming Assignment 10.

35: April 23 (Wednesday)

See several graph problems that can be solved by
algorithms that build on depth-first traversal:

Topological sort,
cf. Cormen 22.4.

Find "articulation points",
cf. Cormen Problem 22-2.
Assignment 10 due tomorrow night.
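Topological sort via depth-first search can be sketched as follows: each vertex is appended as it finishes, and the finishing order is reversed.

```python
def topological_sort(adj):
    """Order the vertices of a DAG so every edge goes left to right
    (cf. Cormen 22.4); adj maps each vertex to its successor list."""
    order, state = [], {}          # state: 1 = on recursion stack, 2 = done
    def dfs(u):
        state[u] = 1
        for v in adj.get(u, []):
            if state.get(v) == 1:  # back edge found
                raise ValueError("cycle: no topological order exists")
            if v not in state:
                dfs(v)
        state[u] = 2
        order.append(u)            # u finishes after all its successors
    for u in adj:
        if u not in state:
            dfs(u)
    return order[::-1]
```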

36: April 25 (Friday)


Discuss and hand back graded Exam 2.

Discuss model solutions for Assignment 10.

Illustrate with a larger example
the algorithm, based on depth-first search, for finding articulation points.

37: April 28 (Monday)

Embark on Network Flow, cf. Cormen 26.1-26.3:

introduce key concepts, cf. Cormen 26.1

show that a naive algorithm doesn't always generate the maximal flow

introduce the Ford-Fulkerson method, cf.
Cormen 26.2.
Discuss how the "Nearest Neighbor" heuristic
may be arbitrarily bad for the Traveling Salesman Problem
even when the triangle inequality holds.

38: April 30 (Wednesday)

Continue Network Flow:

Recall the Ford-Fulkerson method, and
introduce the Edmonds-Karp optimization, cf.
Cormen 26.2.

Briefly introduce the notion of a cut
in a flow network.
Discuss Question 1 in the upcoming Assignment 11.
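The method with the Edmonds-Karp choice of augmenting paths (shortest first, found by breadth-first search) can be sketched on a residual-capacity matrix:

```python
from collections import deque

def max_flow(cap, s, t):
    """Ford-Fulkerson with BFS augmenting paths (the Edmonds-Karp rule).
    cap[u][v] is the residual capacity from u to v, mutated in place."""
    n = len(cap)
    total = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:      # BFS for a shortest augmenting path
            u = q.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total                  # no augmenting path: flow is maximal
        # Find the bottleneck capacity along the path, then push that much.
        bottleneck = float('inf')
        v = t
        while v != s:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:
            cap[parent[v]][v] -= bottleneck
            cap[v][parent[v]] += bottleneck   # add residual back-edge
            v = parent[v]
        total += bottleneck
```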

39: May 2 (Friday)

Finish Network Flow:

Show how to use the Ford-Fulkerson method for Maximum Bipartite Matching,
cf. Cormen 26.3

Show that the maximum flow equals the minimum cut,
cf. Cormen's Theorem 26.6.

40: May 5 (Monday)


Discuss Assignment 11, due tomorrow night.

Mention that a subpart of an optimal solution
is not necessarily an optimal solution to the subproblem.

Discuss Q1-Q3 from Exam 1 for Spring 2012.

41: May 7 (Wednesday)


Discuss model solutions for Assignment 11

Discuss questions from Exams 1 (Q4), 2 and 3 for Spring 2012.

42: May 9 (Friday)

Discuss questions from the exams for

Spring 2012: the final set

Spring 2013: Q4 from set 2; Q4 & Q5 from final

May 15 (Thursday)

Final exam, 4:10-6:00pm
Torben Amtoft