Captain's Log: Introduction to Algorithm Analysis, Spring 2016

1: January 20 (Wednesday)


go through syllabus

emphasize that the task of an algorithm is to
implement a specification

specify the sorting and the selection problems.
Cormen reading: Chapter 1.

2: January 22 (Friday)

Using the insertion sort method
(Cormen Sections 2.1–2.2)
as example, we

illustrate how to design algorithms by the top-down
approach

show how to improve the implementation
by applying the bottom-up technique

give some simple complexity estimates.
Embark on how to prove the correctness of algorithms:

for recursive algorithms, this requires induction,
cf. Section 2.2
in Howell.
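The top-down design and its induction-based correctness argument might be sketched as follows (an illustration, not the course's code; the function name is mine):

```python
def insertion_sort(a):
    """Recursive insertion sort: sort a[0..n-2] first, then insert a[-1].

    Correctness is proved by induction on len(a): the base case is a
    list of length <= 1 (trivially sorted); the inductive step may
    assume the recursive call returns a sorted prefix.
    """
    if len(a) <= 1:
        return a
    prefix = insertion_sort(a[:-1])  # induction hypothesis: prefix is sorted
    x = a[-1]
    i = len(prefix)
    while i > 0 and prefix[i - 1] > x:  # find insertion point for x
        i -= 1
    return prefix[:i] + [x] + prefix[i:]
```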

3: January 25 (Monday)

We address, and illustrate on examples,
how to prove the correctness of

recursive algorithms (continued from last time)

iterative algorithms (this requires a loop invariant),
cf. Cormen p.19 or Howell Section 2.3.
Discuss the upcoming Assignment 1.

4: January 27 (Wednesday)


Do the correctness proof of iterative insertion sort,
where for the inner loop
it is not obvious how to find an invariant that is strong enough.
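One way to make the invariants concrete (a sketch, not the lecture's proof; the assertion spells out the outer invariant, while the inner one is recorded as a comment):

```python
def insertion_sort(a):
    """In-place iterative insertion sort with its loop invariant checked."""
    for j in range(1, len(a)):
        # Outer invariant: a[0..j-1] is sorted (and is a permutation
        # of the original first j elements).
        assert all(a[k] <= a[k + 1] for k in range(j - 1))
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            # Inner invariant (the "strong enough" part): a[i+1..j]
            # holds the old a[i..j-1], every element of which is > key.
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```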

Motivate the theory of asymptotic notation.

Further discuss Assignment 1, due tomorrow night.

5: January 29 (Friday)

We introduce the theory of asymptotic notation, cf. Cormen
Chapter 3 or Howell Sections 3.1–3.3, and

introduce the "Big-O" symbol

reason about the properties of Big-O

introduce "Big-Omega" and "Big-Theta"
while distinguishing between
the "best-case" and "worst-case" interpretations
(cf. Cormen p.49).
Motivate the upcoming Assignment 2
(this web site
may come in handy).

February 1 (Monday)

Class canceled (instructor out of town).

6: February 3 (Wednesday)


Finish the theory of asymptotic notation,
by introducing
"little-o" (and
"little-omega").

Introduce the notion of a "barometer instruction"

which is executed at least as frequently as any other instruction
and therefore may serve to easily estimate running time.

We address (as in Howell 3.4–3.6) the complexity of iterative algorithms
where we need to estimate sums; the "rectangle" is

trivially an upper approximation

in our initial example,
also a lower approximation, modulo a constant factor.
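For a standard example (the sum of the first n integers; an illustration, not necessarily the one used in class), the rectangle bounds read:

```latex
\sum_{i=1}^{n} i \;\le\; \underbrace{n \cdot n}_{\text{rectangle}} = n^2,
\qquad
\sum_{i=1}^{n} i \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} i
               \;\ge\; \frac{n}{2} \cdot \frac{n}{2} = \frac{n^2}{4},
```

so the sum is Theta(n^2): the rectangle overestimates by only a constant factor.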
Assignment 2 due tomorrow night.

7: February 5 (Friday)

Motivate, state and illustrate a general result (Theorem 3.28 in Howell)
that is often useful for estimating sums found in the
analysis of iterative programs:

the "rectangle" is
even a lower approximation (modulo a constant factor)
provided that the function is smooth.
Discuss graded Assignment 1.

8: February 8 (Monday)

Mention various pitfalls in algorithm analysis:

it may be that not all iterations can exhibit worst-case behavior

numbers may become so big that they cannot realistically
be handled in constant time.
Wrap up the discussion of iterative algorithms:

motivate the upcoming Assignment 3.
Address general recurrences:

they show up when
analyzing divide and conquer algorithms
(cf Cormen Section 2.3).
Last day for 100% refund

9: February 10 (Wednesday)

We address how to solve recurrences:

to check a proposed solution,
the substitution method can be used
(cf Cormen Section 4.3):

one kind of example is straightforward

for another kind of example, the presence of logarithms requires
a careful phrasing of the induction hypothesis
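A worked instance of the straightforward kind (the textbook recurrence for merge sort, shown here as an illustration): guess T(n) <= c n lg n for T(n) = 2T(floor(n/2)) + n, and verify the guess by induction:

```latex
T(n) = 2\,T(\lfloor n/2 \rfloor) + n
     \;\le\; 2\,c \lfloor n/2 \rfloor \lg(\lfloor n/2 \rfloor) + n
     \;\le\; c\,n \lg(n/2) + n
     \;=\; c\,n \lg n - c\,n + n
     \;\le\; c\,n \lg n \qquad \text{whenever } c \ge 1.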
Assignment 3 due tomorrow night.

10: February 12 (Friday)


Discuss graded Assignment 2.

Wrap up the substitution method:

for a third kind of example, one needs some creativity to
phrase the induction hypothesis

Briefly introduce a general recipe,
the "Master Theorem",
for solving recurrences.

11: February 15 (Monday)

Present a general recipe,
the "Master Theorem",
for solving recurrences:

we give a (very) informal proof

we illustrate by several examples

Cormen reading: Sections 4.4 & 4.5.
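For reference, the three cases in the standard form (as in Cormen Section 4.5):

```latex
T(n) = a\,T(n/b) + f(n), \quad a \ge 1,\ b > 1:
\qquad
T(n) = \begin{cases}
\Theta(n^{\log_b a}) & \text{if } f(n) = O(n^{\log_b a - \epsilon}) \\
\Theta(n^{\log_b a} \lg n) & \text{if } f(n) = \Theta(n^{\log_b a}) \\
\Theta(f(n)) & \text{if } f(n) = \Omega(n^{\log_b a + \epsilon})
  \text{ and } a f(n/b) \le c f(n) \text{ for some } c < 1
\end{cases}
```

For example, merge sort has a = 2, b = 2, f(n) = n, hence the second case applies and T(n) = Theta(n lg n).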
Discuss graded Assignment 3.

12: February 17 (Wednesday)

Elaborate on the Master Theorem:

we show how recurrences can sometimes be twisted, by "change
of variables" (cf. p.86 in Cormen), to fit the Master Theorem template

we give an example where
Howell's Theorem 3.32
(or Wikipedia)
gives better precision
than the Master Theorem from the slides
(whereas Cormen's version gives nothing)
Assignment 4 due tomorrow night.

13: February 19 (Friday)

Introduce the "Dutch National Flag" problem,
cf. Section 2.4 in Howell:

argue it is useful in a number of contexts

develop a linear-time, constant-space algorithm.
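Such an algorithm might be sketched as follows (a three-way partition over values 0/1/2 standing for red/white/blue; a sketch, not the course's code):

```python
def dutch_flag(a):
    """Partition a list of 0s, 1s and 2s in linear time, constant space.

    Invariant: a[:lo] are 0s, a[lo:mid] are 1s, a[hi+1:] are 2s,
    and a[mid:hi+1] is still unexamined.
    """
    lo, mid, hi = 0, 0, len(a) - 1
    while mid <= hi:
        if a[mid] == 0:
            a[lo], a[mid] = a[mid], a[lo]
            lo += 1
            mid += 1
        elif a[mid] == 1:
            mid += 1
        else:  # a[mid] == 2: swap to the back, shrink the unexamined zone
            a[mid], a[hi] = a[hi], a[mid]
            hi -= 1
    return a
```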

14: February 22 (Monday)

Review session for the upcoming midterm exam:

discuss graded Assignment 4

discuss relevant exam questions
from 2015 (uploaded on K-State Online)

15: February 24 (Wednesday)

Exam I.

16: February 26 (Friday)

Introduce graphs, cf Cormen Appendix B.4, which

are either directed or undirected

may be acyclic and/or connected

may be represented by an adjacency matrix or by adjacency lists,
cf Cormen 22.1.
An important special case is a (rooted) tree,
cf Cormen Appendix B.5.

17: February 29 (Monday)

Introduce the heap data structure, cf Cormen Chapter 6:

define the heap property

show how to maintain the heap property (Cormen 6.2)

use heaps to implement priority queues (Cormen 6.5).
Motivate and discuss the upcoming Assignment 5.

18: March 2 (Wednesday)

Conclude the treatment of heaps:

show an efficient representation (Cormen 6.1)

show how to convert an array into a heap (Cormen 6.3)

so as to develop an efficient and in-place sorting
algorithm, heap sort (Cormen 6.4).
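The two steps (maintain the heap property, then build-heap plus repeated extraction) might be sketched as follows (an illustration in the spirit of Cormen's MAX-HEAPIFY and HEAPSORT, not the course's code):

```python
def sift_down(a, i, n):
    """Restore the max-heap property at index i of a[:n], assuming the
    subtrees rooted at 2i+1 and 2i+2 are already heaps."""
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def heap_sort(a):
    """In-place heap sort: build a max-heap bottom-up (O(n)), then
    repeatedly swap the maximum to the end and shrink the heap."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end)
    return a
```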
Further discuss Assignment 5, due tomorrow night.

19: March 4 (Friday)

Introduce union/find structures
(cf. Cormen Chapter 21, Sections 1–3):

they are useful for representing equivalence classes

we can implement the operations so that they provably run in time O(lg n).
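A possible implementation achieving the O(lg n) bound via union by rank (a sketch; path compression, discussed later in the course, is deliberately omitted here):

```python
class UnionFind:
    """Disjoint-set forest with union by rank: tree height stays
    O(lg n), so find and union both run in O(lg n)."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx            # attach the shallower tree
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
```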

March 7 (Monday)

Class canceled (instructor not well)

20: March 9 (Wednesday)


Hand back, and discuss, graded Exam 1.

Discuss Assignment 6, due tomorrow night.

21: March 11 (Friday)


Discuss graded Assignment 5;

discuss (not yet graded) Assignment 6.

Introduce Ackermann's function, which quickly
assumes exorbitant values;

mention that one can implement union/find such that the
"amortized" (average)
cost per operation
is the inverse Ackermann function, which is almost O(1).

March 14–18

Spring break.

22: March 21 (Monday)

Explore the Divide and Conquer paradigm,
already motivated by merge sort (Cormen Section 2.3)
and with further examples:

quicksort (Cormen Section 7.1), where the key to
good performance (Cormen Section 7.2)
is proper pivot selection.
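One standard way to get good expected performance is a random pivot (a sketch; unlike Cormen's PARTITION-based version this is not in-place, which keeps the recursion structure visible):

```python
import random

def quicksort(a):
    """Quicksort with a randomly chosen pivot: expected running time
    O(n lg n) regardless of the input order."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```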
Motivate, and discuss in detail, the upcoming Assignment 7.

23: March 23 (Wednesday)

We see some clever applications of the Divide & Conquer paradigm:

after recalling from primary school how to multiply big numbers
given that we know how to multiply small numbers,
we show how to multiply n-bit integers
(or polynomials of degree n, cf Howell Section 10.1)
in time O(n^b) where b is less than 2 (and may actually be arbitrarily
close to 1);

multiplication of n x n matrices in time
O(n^b) where b is much less than 3
(Cormen Section 4.2).
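The integer-multiplication idea (Karatsuba's trick of three half-size products instead of four, giving b = lg 3, about 1.585) might be sketched as:

```python
def karatsuba(x, y):
    """Multiply nonnegative integers with three recursive half-size
    multiplications instead of four: T(n) = 3T(n/2) + O(n),
    hence O(n^(lg 3)) by the Master Theorem."""
    if x < 10 or y < 10:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)   # x = xh * 2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)
    hh = karatsuba(xh, yh)
    ll = karatsuba(xl, yl)
    # the trick: (xh+xl)(yh+yl) - hh - ll = xh*yl + xl*yh
    mid = karatsuba(xh + xl, yh + yl) - hh - ll
    return (hh << (2 * m)) + (mid << m) + ll
```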
Further discuss Assignment 7, due tomorrow night.

24: March 25 (Good Friday)

Introduce
dynamic programming,
the topic of Cormen 15,
which often allows an efficient implementation of
exhaustive search:

motivate by examples of how to avoid duplicated computations

show how to apply it to the problem of giving optimal change
(cf. Howell 12.1).
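The change-giving dynamic program might be sketched as follows (the denominations in the test are illustrative, not from the assignment):

```python
def optimal_change(coins, amount):
    """Minimum number of coins summing to `amount` (assumed reachable),
    bottom-up: best[v] = 1 + min over coins c <= v of best[v - c].
    Each subproblem is computed once, avoiding duplicated work."""
    INF = float('inf')
    best = [0] + [INF] * amount
    for v in range(1, amount + 1):
        for c in coins:
            if c <= v and best[v - c] + 1 < best[v]:
                best[v] = best[v - c] + 1
    return best[amount]
```

Note that for denominations like {1, 3, 4} and amount 6, the greedy choice (4+1+1) is beaten by the dynamic program (3+3), which is one reason exhaustive search is needed here.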
Motivate Assignment 8.

25: March 28 (Monday)

Continue dynamic programming:

apply it to the binary knapsack problem
(Howell 12.4),
given as Exercise 16.2-2 in Cormen
and with solution listed on page 50
here;

further discuss the upcoming Assignment 8, illustrated by a simple example.
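The binary knapsack recurrence might be rendered as follows (a sketch using the space-saving one-dimensional table; the item data in the test is illustrative):

```python
def knapsack(items, capacity):
    """0/1 ("binary") knapsack by dynamic programming in O(n * capacity).
    `items` is a list of (weight, value) pairs, each usable at most once;
    best[w] = maximum value achievable within weight limit w."""
    best = [0] * (capacity + 1)
    for weight, value in items:
        # iterate w downwards so that each item is used at most once
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]
```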

26: March 30 (Wednesday)

Continue Dynamic Programming:

Present Floyd's all-pairs shortest-paths
algorithm (Cormen 25.2)

mention that a subpart of an optimal solution
is not necessarily an optimal solution to the subproblem.
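Floyd's algorithm might be sketched as follows (a minimal version returning only distances, not the paths):

```python
def floyd(dist):
    """All-pairs shortest paths (Floyd-Warshall) in O(n^3).
    `dist` is an n x n matrix with dist[i][j] the direct edge weight
    (float('inf') if absent, 0 on the diagonal); updated in place."""
    n = len(dist)
    for k in range(n):            # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```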

Further discuss Assignment 8, due tomorrow night.

27: April 1 (Friday)

One more application of the Divide and Conquer paradigm:

the selection problem
which has a clever (deterministic) solution that
always runs in linear time
(Cormen Section 9.3).
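The deterministic linear-time selection (median-of-medians pivot, as in Cormen 9.3) might be sketched as follows (a non-in-place version, for clarity):

```python
def select(a, k):
    """Return the k-th smallest (0-indexed) element of list a.
    The pivot is the median of the medians of groups of 5, which
    guarantees a constant-fraction split and hence O(n) worst case."""
    if len(a) <= 5:
        return sorted(a)[k]
    groups = [a[i:i + 5] for i in range(0, len(a), 5)]
    medians = [sorted(g)[len(g) // 2] for g in groups]
    pivot = select(medians, len(medians) // 2)
    lo = [x for x in a if x < pivot]
    hi = [x for x in a if x > pivot]
    num_eq = len(a) - len(lo) - len(hi)
    if k < len(lo):
        return select(lo, k)
    if k < len(lo) + num_eq:
        return pivot
    return select(hi, k - len(lo) - num_eq)
```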
Motivate the chained matrix multiplication problem (Cormen 15.2).

April 4–8 (Monday–Friday)

Classes canceled (instructor out of town).

28: April 11 (Monday)

Review session for the upcoming midterm exam:

Discuss relevant exam questions
from 2015 (uploaded on K-State Online)

Discuss model solutions for Assignments 7 and 8.

29: April 13 (Wednesday)

Exam II.

April 15 (Friday)

Class canceled (All-University Open House)

30: April 18 (Monday)

We embark on greedy algorithms,
the topic of Cormen 16:

motivate the concept

show a greedy strategy for minimizing total waiting time
for jobs with given durations

discuss how to extend the above strategy when jobs have priorities,
as in the upcoming Assignment 9

show a greedy strategy for
the fractional knapsack problem
(cf. Cormen Exercise 16.2-1).
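The fractional knapsack strategy might be sketched as follows (a sketch; the item data in the test is illustrative):

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items in decreasing order of
    value density (value/weight); the last item taken may be a
    fraction. `items` is a list of (weight, value) pairs."""
    total = 0.0
    for weight, value in sorted(items, key=lambda wv: wv[1] / wv[0],
                                reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)   # whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total
```

Unlike the binary version, the greedy choice here is provably optimal, since any leftover capacity can always be filled with the densest remaining item.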

31: April 20 (Wednesday)

We address how to construct a
minimum-cost spanning tree, cf.
Cormen 23:

present and illustrate Kruskal's algorithm,
and analyze its complexity,
cf. Cormen 23.2.
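Kruskal's algorithm might be sketched as follows (a sketch with a small inlined union/find; the edge data in the test is illustrative):

```python
def kruskal(n, edges):
    """Kruskal's MST algorithm: scan edges by increasing weight and
    keep each edge that joins two different components (tracked with
    union/find). `edges` is a list of (weight, u, v) triples over
    vertices 0..n-1; returns the chosen edges. Running time
    O(E lg E), dominated by the sort."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two components
            parent[ru] = rv
            mst.append((w, u, v))
    return mst
```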
Discuss and hand back graded Exam 2.
Assignment 9 due tomorrow night.

32: April 22 (Friday)

Introduce depth-first traversal of graphs, directed as well
as undirected (Cormen 22.3):

show how to compute a spanning tree;

show that we can view a graph as a rooted tree together with
certain kinds of extra edges (back, forward, cross);

show how to augment the depth-first algorithm
to do topological sort, cf. Cormen 22.4.
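The augmentation for topological sort might be sketched as follows (a sketch that records each vertex at finishing time and reverses, as in Cormen 22.4; it assumes the input is acyclic):

```python
def topological_sort(graph):
    """Topological sort of a DAG via depth-first traversal.
    `graph` maps each vertex to a list of successors."""
    visited, order = set(), []

    def dfs(u):
        visited.add(u)
        for v in graph.get(u, []):
            if v not in visited:
                dfs(v)
        order.append(u)   # u finishes only after all its successors

    for u in graph:
        if u not in visited:
            dfs(u)
    return order[::-1]    # reverse order of finishing times
```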
Motivate Question 2 in the upcoming Assignment 10.

33: April 25 (Monday)

Continue algorithms based on depth-first traversal:

show how to find
"articulation points",
cf. Cormen Problem 22-2

motivate the notion of "bridges", cf
Question 1 in the upcoming Assignment 10.

34: April 27 (Wednesday)

Continue the construction of minimum-cost spanning trees:

present and illustrate Prim's algorithm, and analyze its complexity,
cf. Cormen 23.2

briefly mention a general correctness proof, cf. Cormen Theorem 23.1.
Continue the detection of shortest paths:

present Dijkstra's algorithm for single-source shortest paths,
cf. Cormen 24.3 (illustrated by Fig 24.6 there)

discuss how to use Dijkstra's algorithm, and Floyd's,
to find shortest paths for single/multiple sources
and single/multiple destinations.
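Dijkstra's algorithm with a binary heap as the priority queue might be sketched as follows (a sketch assuming nonnegative edge weights; stale heap entries are skipped rather than decreased in place):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for nonnegative edge weights,
    O((V + E) lg V) with a binary heap. `graph` maps each vertex to
    a list of (neighbor, weight) pairs; returns a distance dict."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale queue entry, skip
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist
```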
Assignment 10 due tomorrow night.

35: April 29 (Friday)

Embark on Network Flow, cf. Cormen Sections 26.1–26.3:

introduce key concepts, cf. Cormen 26.1

show that a naive algorithm doesn't always generate the maximal flow

hence motivate the Ford-Fulkerson method,
cf. Cormen 26.2, which

uses the concept of residual networks

can be optimized by the Edmonds-Karp algorithm.
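The Ford-Fulkerson method with shortest (BFS-chosen) augmenting paths, i.e. Edmonds-Karp, might be sketched as follows (a sketch; the residual network is kept directly in the capacity table):

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp maximum flow, O(V E^2). `capacity` is a dict of
    dicts, capacity[u][v] being the capacity of edge (u, v); it is
    turned into (and updated as) the residual network."""
    # ensure every edge has a reverse entry in the residual network
    for u in list(capacity):
        for v in list(capacity[u]):
            capacity.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual network
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in capacity[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                 # no augmenting path: maximal
        # find the bottleneck along the path, then augment
        bottleneck, v = float('inf'), t
        while parent[v] is not None:
            bottleneck = min(bottleneck, capacity[parent[v]][v])
            v = parent[v]
        v = t
        while parent[v] is not None:
            capacity[parent[v]][v] -= bottleneck
            capacity[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck
```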

36: May 2 (Monday)

Continue and finish Network Flow:

show how to use the Ford-Fulkerson method for
Maximum Bipartite Matching,
cf. Cormen 26.3

introduce the notion of a cut
in a flow network

show that the maximum flow equals the minimum cut,
cf. Cormen's Theorem 26.6.

37: May 4 (Wednesday)

Wrap up Greedy Algorithms:

schedule events with fixed start/end time,
cf. Cormen 16.1

introduce Huffman codes, cf. Cormen 16.3.
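The event-scheduling strategy (activity selection, Cormen 16.1) might be sketched as follows (a sketch; the event data in the test is Cormen's standard example):

```python
def schedule(events):
    """Greedy activity selection: repeatedly pick the compatible event
    with the earliest finish time. `events` is a list of
    (start, finish) pairs; returns a maximum-size compatible subset."""
    chosen, last_finish = [], float('-inf')
    for start, finish in sorted(events, key=lambda e: e[1]):
        if start >= last_finish:       # compatible with all chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```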
Discuss graded Assignments 9 & 10.
Assignment 11 due tomorrow night.

38: May 6 (Friday)

Prepare for final exam:

discuss model solutions for Assignment 11

discuss relevant questions from the 2015 final exam

briefly discuss questions from the 2014 final exam.

May 9 (Monday)

Final exam, 4:10–6:00pm
Torben Amtoft