Captain's Log: Introduction to Algorithm Analysis, Spring 2019

1: January 23 (Wednesday)


go through the
syllabus

emphasize that the task of an algorithm is to
implement a specification

specify the problems of integer square root, and
(most importantly!) sorting.
Cormen reading: Chapter 1.

2: January 25 (Friday)


Specify two key problems:

sorting (we translate English contract into logic)

selection.

Using the insertion sort method
(Cormen Sections 2.1-2.2)
as an example, we

illustrate how to design algorithms by the top-down
approach

show how to improve the implementation
by applying the bottom-up technique

give some simple complexity estimates.

Give practice quizzes using the Top Hat system.

Assignment 1 out.

3: January 28 (Monday)

Embark on how to prove the correctness of algorithms:

for iterative algorithms, cf. Cormen p.19 or Howell Sect 2.3,
this requires a loop invariant, as illustrated by

the Fibonacci example

a program finding the last occurrence of an element in an array

a program that uses binary search to locate a number in a sorted array
(the upcoming Assignment 1)
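
The binary search example might be sketched as below (the exact phrasing in Assignment 1 may differ); the loop invariant is recorded as comments:

```python
def binary_search(a, x):
    """Return an index i with a[i] == x in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    # Invariant: if x occurs anywhere in a, then it occurs in a[lo..hi].
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid
        elif a[mid] < x:
            lo = mid + 1   # x can only be to the right of mid
        else:
            hi = mid - 1   # x can only be to the left of mid
    return -1              # invariant + empty range imply x is not in a

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # -> 3
print(binary_search([2, 3, 5, 7, 11, 13], 4))   # -> -1
```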

4: January 30 (Wednesday)

Continue the correctness of iterative algorithms: we

outline how to handle
insertion sort (Cormen p.19 or Howell Sect 2.3)
where for the inner loop
it is not obvious how to find an invariant that is strong enough

construct (in quizzes)
an algorithm for integer division

further discuss
Assignment 1, due tomorrow night.
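
A sketch of insertion sort with both invariants as comments; the phrasing of the (non-obvious) inner invariant follows one common choice, not necessarily the one used in class:

```python
def insertion_sort(a):
    """Sort list a in place, as in Cormen Section 2.1."""
    for j in range(1, len(a)):
        # Outer invariant: a[0..j-1] is sorted and is a permutation
        # of the original first j elements.
        key = a[j]
        i = j - 1
        # Inner invariant: a[0..i] followed by a[i+2..j] is sorted,
        # and every element of a[i+2..j] is greater than key.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```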

5: February 1 (Friday)


Show how to develop an iterative algorithm together with
its proof:
integer division, cf. the quiz last time.

Discuss how to prove
(using induction) the correctness of recursive algorithms,
cf. Section 2.2
in Howell

we apply the techniques to InsertionSort, in particular
InsertLast

Briefly motivate the upcoming Assignment 2.
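
A minimal sketch of the division-by-repeated-subtraction idea, developed together with its invariant (the version in the quiz may differ in details):

```python
def int_divide(n, d):
    """Compute (q, r) with n == q*d + r and 0 <= r < d, for n >= 0, d > 0."""
    q, r = 0, n
    # Invariant: n == q*d + r and r >= 0; holds initially since n == 0*d + n.
    while r >= d:
        q += 1
        r -= d   # invariant preserved: q*d + r is unchanged
    # Invariant plus the negated guard give n == q*d + r with 0 <= r < d.
    return q, r
```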

6: February 4 (Monday)

Embark on the theory of asymptotic notation,
cf. Cormen Chapter 3

Motivate, cf Howell Section 3.1

Introduce the "Big-O" symbol,
and reason about its properties, cf. Howell Section 3.2
Discuss the upcoming Assignment 2.

7: February 6 (Wednesday)

Continue (almost finish) asymptotic notation:

list additional properties of "Big-O"

introduce "Big-Omega" and "Big-Theta"
(cf. Howell Section 3.3)
while distinguishing between
the "best-case" and "worst-case" interpretations
(cf. Cormen p.49)

introduce
"little-o" (and
"little-omega"), cf. Howell 3.10 or Cormen pp.50-51
Assignment 2 due tomorrow night.

8: February 8 (Friday)

Wrap up asymptotic notation:

explain how little-o, big-O, big-Theta, big-Omega, little-omega
are somewhat similar to
<, <=, =, >=, >.
We next address (as in Howell 3.4-3.6)
the complexity of iterative algorithms
where we need to estimate sums;
the "rectangle" is

trivially an upper approximation

in our initial example,
also a lower approximation, modulo a constant factor
Briefly motivate the upcoming Assignment 3.

9: February 11 (Monday)

We finish the estimate of sums:
as captured by Howell's Theorem 3.28,
the "rectangle" is

trivially an upper approximation

but in general, may not be a lower approximation
(even modulo a constant factor)
if the function "grows too fast"

yet is a lower approximation (modulo a constant factor)
if the function is smooth.
Discuss the upcoming Assignment 3.

10: February 13 (Wednesday)

Mention various pitfalls in algorithm analysis:

it may be that not all iterations can exhibit worst-case behavior

numbers may become so big that they cannot realistically
be handled in constant time

what is and what is not a "barometer instruction".
We motivate the need for recurrences

which show up when
analyzing divide and conquer algorithms
(such as merge sort),
cf Cormen Section 2.3
Further discuss Assignment 3, due tomorrow night.

11: February 15 (Friday)

We address how to solve recurrences;
to check a proposed solution,
the substitution method
(Cormen Section 4.3)
can be used:

we illustrate by one kind of example which
is rather straightforward

for another kind of example, the presence of logarithms requires
a careful phrasing of the induction hypothesis
(two approaches are possible)
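
For the straightforward kind of example, the check might look as follows (this particular recurrence, T(n) = 2T(n/2) + n with guess O(n lg n), is an illustration, not necessarily the one used in class):

```latex
% Guess T(n) \le c\,n\lg n for the recurrence T(n) = 2T(n/2) + n.
% Substitute the guess into the right-hand side:
\begin{aligned}
T(n) &\le 2\bigl(c\,(n/2)\lg(n/2)\bigr) + n \\
     &=   c\,n\lg(n/2) + n \\
     &=   c\,n\lg n - c\,n + n \\
     &\le c\,n\lg n \quad\text{whenever } c \ge 1.
\end{aligned}
```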

12: February 18 (Monday)

Wrap up the substitution method:

for a third kind of example, one needs some creativity to
phrase the induction hypothesis.
Present a general recipe,
the "Master Theorem"
(cf., e.g., Cormen Sections 4.4 & 4.5)
for solving recurrences:

there are 3 cases, depending on how
the degree of the "overhead" compares to log_b(a)
Discuss the upcoming Assignment 4.
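
The three-case recipe can be captured mechanically. A sketch, assuming the simplified form T(n) = a*T(n/b) + Theta(n^d) (the function name and string output are mine, for illustration):

```python
import math

def master_theorem(a, b, d):
    """Simplified Master Theorem for T(n) = a*T(n/b) + Theta(n^d).

    Assumes a >= 1, b > 1, d >= 0; returns a string describing Theta(T(n)).
    """
    crit = math.log(a, b)          # the critical exponent log_b(a)
    eps = 1e-9                     # tolerate floating-point fuzz
    if abs(d - crit) <= eps:       # balanced: an extra log factor appears
        return f"Theta(n^{d} * lg n)"
    if d < crit:                   # overhead grows slower: leaves dominate
        return f"Theta(n^{crit:.3g})"
    return f"Theta(n^{d})"         # overhead dominates the recursion

print(master_theorem(2, 2, 1))   # merge sort: Theta(n^1 * lg n)
print(master_theorem(8, 2, 2))   # naive D&C matrix multiply: Theta(n^3)
```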

13: February 20 (Wednesday)

Elaborate on the Master Theorem:

we illustrate by several examples

we give a (very) informal proof

we give an example where
Howell's Theorem 3.32
(or Wikipedia)
gives better precision
than the Master Theorem from the slides
(whereas Cormen's version gives nothing).
Briefly discuss Assignment 4, due tomorrow night.

14: February 22 (Friday)

Wrap up recurrences:

we show how recurrences can sometimes be twisted, by "change
of variables" (cf. p.86 in Cormen), to fit the Master Theorem template
Cover the "Dutch National Flag" problem,
cf. Section 2.4 in Howell:

argue it is useful in a number of contexts

develop a linear-time, constant-space algorithm.
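
One common way to write the one-pass, constant-space solution (using values 0/1/2 for the three colors; details may differ from the version developed in class):

```python
def dutch_flag(a):
    """Partition a list of 0s, 1s, 2s in place: one pass, constant space."""
    lo, mid, hi = 0, 0, len(a) - 1
    # Invariant: a[0..lo-1] are 0s, a[lo..mid-1] are 1s, a[hi+1..] are 2s,
    # and a[mid..hi] is still unexamined.
    while mid <= hi:
        if a[mid] == 0:
            a[lo], a[mid] = a[mid], a[lo]
            lo += 1
            mid += 1
        elif a[mid] == 1:
            mid += 1
        else:  # a[mid] == 2: swap it to the back, don't advance mid
            a[mid], a[hi] = a[hi], a[mid]
            hi -= 1
    return a
```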

15: February 25 (Monday)

Review session for the upcoming midterm exam:

discuss solutions to Assignments 1, 2, 3, 4

discuss the questions from the sample exam
(from 2017) uploaded on Canvas.

16: February 27 (Wednesday)

Exam I.

17: March 1 (Friday)

Introduce graphs, cf Cormen Appendix B.4, which

are either directed or undirected

may be acyclic and/or connected
An important special case is a (rooted) tree,
cf Cormen Appendix B.5.

18: March 4 (Monday)

Wrap up graphs:

they may be represented by an adjacency matrix or by adjacency lists,
cf Cormen 22.1

discuss in much detail the upcoming Assignment 5.

19: March 6 (Wednesday)

Introduce the heap data structure, cf Cormen Chapter 6:

motivate by discussing how to implement
a priority queue

define the heap property

show how to maintain the heap property (Cormen 6.2)

use heaps to implement priority queues (Cormen 6.5)

show an efficient representation (Cormen 6.1)

show how to convert an array into a heap (Cormen 6.3)
Assignment 5 due tomorrow night.
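
The two core heap operations can be sketched as follows; note this uses 0-based indices (children at 2i+1 and 2i+2), whereas Cormen's presentation is 1-based:

```python
def max_heapify(a, i, n):
    """Restore the max-heap property at index i (Cormen 6.2), assuming
    the subtrees rooted at its children are already max-heaps."""
    while True:
        l, r, largest = 2 * i + 1, 2 * i + 2, i
        if l < n and a[l] > a[largest]:
            largest = l
        if r < n and a[r] > a[largest]:
            largest = r
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest  # the violation may have moved down; continue there

def build_max_heap(a):
    """Convert a list into a max-heap bottom-up (Cormen 6.3)."""
    for i in range(len(a) // 2 - 1, -1, -1):  # internal nodes, last first
        max_heapify(a, i, len(a))
    return a
```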

20: March 8 (Friday)

Wrap up heaps:

Analyze the conversion of an array into a heap (Cormen 6.3):
it runs in linear time,
as one can show
by a recurrence and/or a summation.

Develop an efficient and in-place sorting
algorithm, heapsort (Cormen 6.4).
Explore the Divide and Conquer paradigm:

we have already seen one example: merge sort (Cormen Section 2.3)

today we introduce quicksort (Cormen Section 7.1)
Discuss solutions to Assignment 5.

March 11-15

Spring break.


21: March 18 (Monday)


Discuss, and give back, graded Exam 1

Discuss the upcoming Assignment 6

22: March 20 (Wednesday)

Elaborate on sorting:

analyze
quicksort (Cormen Section 7.1);
the key part to
good performance (Cormen Section 7.2)
is to properly design pivot selection

mention several reasons,
listed on p.148 in Cormen,
why one may consider sorting the most fundamental
problem in the study of algorithms

argue that any comparison-based sorting algorithm requires
\Omega(lg(n!)) = \Omega(n lg n) comparisons in the worst case
(Cormen Theorem 8.1).
Further discuss Assignment 6, due tomorrow night.
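
A sketch of quicksort with a randomized pivot, one standard way to make bad pivot sequences unlikely (cf. Cormen 7.3); this uses the Lomuto partition, whereas Cormen's Section 7.1 partition differs slightly in detail:

```python
import random

def quicksort(a, lo=0, hi=None):
    """Sort list a in place by quicksort with a random pivot."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    # Move a random pivot to the end, then partition around it (Lomuto).
    p = random.randint(lo, hi)
    a[p], a[hi] = a[hi], a[p]
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]     # pivot lands in its final position i
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)
    return a
```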

23: March 22 (Friday)

We address how to multiply n-bit integers
(or polynomials of degree n, cf Howell Section 10.1)

recall (much as in primary school)
how two n-digit numbers can be multiplied
by doing 4 multiplications of n/2-digit numbers (and then some
addition)

show that 3 multiplications actually suffice
(at the price of doing more additions, but this doesn't affect
the asymptotic complexity)

hence multiplication can be done
in time O(n^b) where b is less than 2
(and may actually be arbitrarily
close to 1, by generalizing the method).
We study the multiplication of n x n matrices (Cormen Section 4.2):

the natural implementation of the definition takes time in O(n^3)

a naive use of Divide & Conquer also takes time in O(n^3)

a clever idea (Strassen) provides for a running time that is subcubic

there is a continuing quest to further improve the running time.
Briefly motivate the upcoming Assignment 7.
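
The 3-multiplication trick for integers (Karatsuba's method) might be sketched like this, splitting each number into high and low halves:

```python
def karatsuba(x, y):
    """Multiply nonnegative integers using 3 half-size multiplications,
    giving time O(n^lg 3), roughly O(n^1.585), instead of O(n^2)."""
    if x < 10 or y < 10:                # base case: a single digit
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    x1, x0 = divmod(x, p)               # x = x1*p + x0
    y1, y0 = divmod(y, p)               # y = y1*p + y0
    hi = karatsuba(x1, y1)                              # multiplication 1
    lo = karatsuba(x0, y0)                              # multiplication 2
    mid = karatsuba(x1 + x0, y1 + y0) - hi - lo         # multiplication 3
    # x*y = hi*p^2 + (x1*y0 + x0*y1)*p + lo, and mid equals the middle term.
    return hi * p * p + mid * p + lo
```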

24: March 25 (Monday)


Discuss various items brought up in the midterm survey.

Discuss in detail the upcoming Assignment 7.

25: March 27 (Wednesday)

One more application of the Divide & Conquer paradigm:

the selection problem
which has a clever (deterministic) solution that
always runs in linear time
(Cormen Section 9.3).
Briefly further discuss Assignment 7, due tomorrow night.

26: March 29 (Friday)

Introduce
dynamic programming,
the topic of Cormen 15,
which often allows an efficient implementation of
exhaustive search:

motivate by examples of how to avoid duplicated computations

show how to apply it to the problem of giving optimal change,
cf. Howell 12.1.
Very briefly motivate the upcoming Assignment 8.
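
The optimal-change idea can be sketched as a one-dimensional table, avoiding the duplicated computations of the naive recursion (denominations and amounts here are illustrative):

```python
def min_coins(denoms, amount):
    """Fewest coins summing to amount, by dynamic programming.

    best[v] holds the fewest coins making value v; None if unreachable.
    """
    INF = float("inf")
    best = [0] + [INF] * amount
    for v in range(1, amount + 1):          # fill the table bottom-up
        for d in denoms:
            if d <= v and best[v - d] + 1 < best[v]:
                best[v] = best[v - d] + 1
    return best[amount] if best[amount] < INF else None

print(min_coins([1, 5, 10, 25], 30))   # -> 2   (25 + 5)
print(min_coins([1, 3, 4], 6))         # -> 2   (3 + 3; greedy 4+1+1 is worse)
```

The second example shows why the greedy "largest coin first" strategy can fail, motivating the table-based approach.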

27: April 1 (Monday)

Continue dynamic programming:

recall how to apply it to giving optimal change

illustrate how to apply it to the problem of optimal alignment
of sequences, as in the upcoming Assignment 8

28: April 3 (Wednesday)

Continue dynamic programming:

apply it to the binary knapsack problem
(Howell 12.4),
given as Exercise 16.2-2 in Cormen
and with solution listed on page 50
here

Present Floyd's all-pairs shortest paths
algorithm (Cormen 25.2, or Howell 12.3)
Assignment 8 due tomorrow night.
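
Floyd's algorithm is short enough to sketch in full; the matrix representation below (with float('inf') for missing edges) is one common convention:

```python
def floyd(w):
    """All-pairs shortest path distances (Floyd's algorithm, Cormen 25.2).

    w: n x n matrix of edge weights, float('inf') where there is no edge,
    0 on the diagonal. Assumes no negative cycles.
    """
    n = len(w)
    d = [row[:] for row in w]      # copy, so the input is not clobbered
    for k in range(n):             # allow intermediate vertices 0..k
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```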

April 5 (Friday)

Class canceled (All-University Open House)

29: April 8 (Monday)

Review session for the upcoming midterm exam:

wrap up Floyd's all-pairs shortest paths algorithm

discuss solutions to Assignments 6, 7, 8

discuss the questions from the sample exam
(from 2017) uploaded on Canvas.

30: April 10 (Wednesday)

Exam II.

31: April 12 (Friday)

Introduce depth-first traversal of graphs, directed as well
as undirected (Cormen 22.3):

show how to compute a spanning tree

show that we can view a graph as a rooted tree together with
certain kinds of extra edges (back, forward, cross)

show the generic depthfirst traversal algorithms
from which many useful algorithms can be derived

32: April 15 (Monday)

The depth-first traversal algorithm may be augmented in several ways,
and for undirected graphs be used to

find "articulation points",
cf. Cormen Problem 22-2

find "bridges",
cf. the upcoming Assignment 9.

33: April 17 (Wednesday)

The depth-first traversal algorithm
can also be augmented for directed graphs:

we show how to do topological sort, cf. Cormen 22.4
Embark on Network Flow, cf. Cormen 26.1-26.3:

introduce key concepts, cf. Cormen 26.1

show that a naive algorithm doesn't always generate the maximal flow

hence motivate the Ford-Fulkerson method,
cf. Cormen 26.2,
which uses the concept of residual networks
Assignment 9 due tomorrow night.
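
Topological sort as an augmented depth-first traversal might look like this (the adjacency-list representation as a dict is one common convention):

```python
def topological_sort(graph):
    """Topological order of a DAG via DFS finish times (Cormen 22.4).

    graph: dict mapping each vertex to a list of its successors.
    """
    seen, order = set(), []

    def dfs(u):
        seen.add(u)
        for v in graph.get(u, []):
            if v not in seen:
                dfs(v)
        order.append(u)   # u finishes only after all its descendants

    for u in graph:
        if u not in seen:
            dfs(u)
    return order[::-1]    # reverse order of finish times
```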

34: April 19 (Good Friday)


Discuss, and give back, graded Exam 2.

We see another example of how to use the Ford-Fulkerson
method to find the optimal flow in a network.

35: April 22 (Monday)

We continue the study of flow networks:

we analyze the Ford-Fulkerson method; it
can often be improved upon by the Edmonds-Karp algorithm

we introduce the notion of a cut
and demonstrate by an example that the maximum flow equals the minimum cut,
cf. Cormen's Theorem 26.6.
Discuss the upcoming Assignment 10.
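
The Ford-Fulkerson method, with BFS choosing the augmenting path (i.e., the Edmonds-Karp refinement), might be sketched as follows; the dict-of-dicts capacity representation is an assumption of this sketch:

```python
from collections import deque

def max_flow(cap, s, t):
    """Maximum s-t flow via Ford-Fulkerson with BFS augmenting paths.

    cap: dict of dicts, cap[u][v] = capacity of edge (u, v).
    """
    # Build the residual network, adding reverse edges of capacity 0.
    res = {u: dict(nbrs) for u, nbrs in cap.items()}
    for u in cap:
        for v in cap[u]:
            res.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual network.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow             # no augmenting path: flow is maximum
        # Recover the path, find its bottleneck, and augment along it.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= b
            res[v][u] += b          # residual capacity flows back
        flow += b
```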

36: April 24 (Wednesday)

Finish the study of flow networks:

we see how
Maximum Bipartite Matching,
cf. Cormen 26.3, can be solved
as a special case
We embark on greedy algorithms,
the topic of Cormen 16:

motivate the concept

show a greedy strategy for minimizing total waiting time
for jobs with given durations

show a greedy strategy for
the fractional knapsack problem
(cf. Cormen Exercise 16.2-1).
Assignment 10 due tomorrow night.

37: April 26 (Friday)

We address how to construct a
minimum-cost spanning tree, cf.
Cormen 23:

present and illustrate Kruskal's algorithm,
and analyze its complexity,
cf. Cormen 23.2

present and illustrate Prim's algorithm, and analyze its complexity,
cf. Cormen 23.2

briefly mention a general correctness proof, cf. Cormen Theorem 23.1.
Briefly motivate the upcoming Assignment 11.
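
Kruskal's algorithm is easy to sketch; here with the simplest possible union-find (the efficient union/find structure comes later in the course), and vertices numbered 0..n-1 as an assumed convention:

```python
def kruskal(n, edges):
    """Minimum spanning tree by Kruskal's algorithm (Cormen 23.2).

    n: number of vertices 0..n-1; edges: list of (weight, u, v) triples.
    Returns (total_weight, chosen_edges).
    """
    parent = list(range(n))

    def find(x):                    # walk up to the set's representative
        while parent[x] != x:
            x = parent[x]
        return x

    total, tree = 0, []
    for w, u, v in sorted(edges):           # consider edges by weight
        ru, rv = find(u), find(v)
        if ru != rv:                        # ends in different trees: safe
            parent[ru] = rv
            total += w
            tree.append((u, v))
    return total, tree
```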

38: April 29 (Monday)

Present Dijkstra's algorithm (cf. Cormen 24.3) for single-source shortest paths:

illustrate it by the example in Cormen Fig 24.6

analyze its running time

discuss how to use it, and Floyd's algorithm,
to find shortest paths for single/multiple sources
and single/multiple destinations
Discuss the upcoming Assignment 11, in particular the scheduling
problem in Question 1.
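
A sketch of Dijkstra's algorithm using a binary heap as the priority queue; the "lazy deletion" of stale queue entries below is one standard implementation choice:

```python
import heapq

def dijkstra(graph, s):
    """Single-source shortest path distances (Cormen 24.3).

    graph: dict mapping u to a list of (v, weight) pairs, weights >= 0.
    Runs in O((V + E) lg V) with a binary heap.
    """
    dist = {s: 0}
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                 # stale entry for an already-settled u
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w      # relax edge (u, v)
                heapq.heappush(pq, (dist[v], v))
    return dist
```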

39: May 1 (Wednesday)

Introduce union/find structures
(cf. Cormen Chapter 21, Sections 1-3):

they are useful for representing equivalence classes
(we recall various properties of relations: reflexive, symmetric, transitive, etc.)

we can implement the operations so that we can prove they run in time O(lg(n)).
Briefly further discuss Assignment 11 (last of the semester),
due tomorrow night.
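
A sketch of the union/find structure with both standard heuristics; either alone gives the O(lg n) bound claimed above:

```python
class UnionFind:
    """Disjoint sets with union by rank and path compression (Cormen 21.3)."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            # Path compression: point x directly at the representative.
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False                    # already in the same set
        if self.rank[rx] < self.rank[ry]:   # union by rank: shallow under deep
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```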

40: May 3 (Friday)

Mention that one can implement union/find such that the
"amortized" (averaged over any sequence of operations)
cost per operation is

"almost" in O(1), or to be more specific:

the inverse of the Ackermann function
introduced in Assignment 10, Q2.
We look at how to schedule events with fixed start/end time,
cf. Cormen 16.1:

we see that some strategies that may appear
optimal are in fact not

but we see that it's optimal to first schedule
the event with earliest finish time (or latest start time).
Discuss solutions to Assignments 10 and 11 (Q2).
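
The earliest-finish-time strategy is a few lines of code (events as (start, finish) pairs is an assumed representation):

```python
def schedule(events):
    """Greedy interval scheduling (cf. Cormen 16.1): repeatedly take the
    compatible event with the earliest finish time.

    events: list of (start, finish) pairs; returns a maximum-size subset
    of pairwise non-overlapping events.
    """
    chosen, last_finish = [], float("-inf")
    for start, finish in sorted(events, key=lambda e: e[1]):
        if start >= last_finish:        # compatible with everything chosen
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```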

41: May 6 (Monday)

Wrap up greedy algorithms:

discuss solutions for question 1 in the recent Assignment 11

introduce Huffman codes, cf. Cormen 16.3
Some general observations:

a subpart of an optimal solution
is not necessarily an optimal solution to the subproblem
(cf. Cormen 15.3)

for a chain of matrix multiplications,
the order may matter (a lot).

May 8 (Wednesday)

Class canceled

42: May 10 (Friday)

Wrap up dynamic programming:
find an optimal solution for chained matrix multiplication (Cormen 15.2)
Prepare for final exam:

discuss the questions from the 2017 final exam
Discuss solutions to Assignment 9

May 16 (Thursday)

Final exam, 4:10-6:00pm
Torben Amtoft