What is an algorithm?
A sequence of computational steps that transforms an input into an output.
When playing cards, how would you sort your hand in ascending order?
Here is a more standard implementation of insertion sort
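A standard Python sketch (the function name `insertion_sort` is ours):

```python
def insertion_sort(A):
    """Sort list A in place and return it."""
    for i in range(1, len(A)):
        key = A[i]
        j = i - 1
        # Shift elements of the sorted prefix A[:i] that are larger than key
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key
    return A
```

Each iteration inserts `A[i]` into its correct position within the already-sorted prefix `A[:i]`.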
For every algorithm, we shall study the following properties.
A property which is true before each iteration of a loop.
Loop invariants are important tools to show correctness.
The subarray A[:i] contains the first i elements of the original array in sorted order.
Assume that A has n elements. If the cost of the i-th line is c_i, what is the total running cost?
Let's consider the best and worst case scenarios
∑_{i=1}^{n} i = n(n+1)/2

We are interested in T(n), where T is the time an algorithm takes to solve a problem of size n.
We shall only keep the asymptotic term:
T(n) = 2n² − 4n + 10 ⟹ T(n) = Θ(n²)

Let f and g be positive functions.
f ∈ O(g) if there exist positive constants C and N such that

f(n) ≤ C·g(n) for all n ≥ N.

As an abuse of notation, we shall write f(n) = O(g(n)).
f ∈ Θ(g) if there exist positive constants c, C and N such that

c·g(n) ≤ f(n) ≤ C·g(n) for all n ≥ N.

As an abuse of notation, we shall write f(n) = Θ(g(n)).
Given a sorted array A of n elements, write an algorithm that searches for a given value. What's the running time?

Given an array whose elements are first increasing then decreasing, write an algorithm that finds the maximum value. What's its running time?
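Possible solution sketches for the two exercises above (function names are ours): binary search on the sorted array, and a peak search on the increasing-then-decreasing array. Both halve the search range at every step, so both run in O(log n).

```python
def binary_search(A, target):
    """Return an index of target in sorted list A, or -1 if absent."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if A[mid] == target:
            return mid
        elif A[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def find_peak(A):
    """Return the maximum of an array that increases then decreases."""
    lo, hi = 0, len(A) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        # If the slope is still rising, the peak lies strictly to the right
        if A[mid] < A[mid + 1]:
            lo = mid + 1
        else:
            hi = mid
    return A[lo]
```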
A common strategy is divide-and-conquer.
Sort the following array: [11, 6, 3, 24, 46, 22, 7]
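A minimal merge sort sketch in Python (the function name is ours): split in halves, sort each recursively, then merge the two sorted halves.

```python
def merge_sort(A):
    """Return a sorted copy of A. O(n log n) comparisons."""
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    left = merge_sort(A[:mid])
    right = merge_sort(A[mid:])
    # Merge the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```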
Show that the recurrence

T(n) = 2T(⌊n/2⌋) + Θ(n)

has solution T(n) = O(n log n).
Use the substitution method to show that each of the following recurrences has the asymptotic solution specified:
Show that Θ(n²) is the solution to the recurrence T(n) = 4T(n/2) + n.
For each of the following recurrences, sketch its recursion tree, and guess a good asymptotic upper bound on its solution. Then use the substitution method to verify your answer.
Assume that T(n) = a·T(n/b) + f(n), with a and b being integers. If f(n) = O(n^{log_b a − ε}) for some ε > 0, then T(n) = Θ(n^{log_b a}).
Find the mistake in the reasoning below:
We can check that T(n)=O(n) satisfies the recurrence relation, as
T(n) ≤ 2·O(⌊n/2⌋) + Θ(n) = O(n) + Θ(n) = O(n).

| | Insertion sort | Merge sort | Quicksort |
|---|---|---|---|
| In place | Yes | No | Yes |
| Comparison | Yes | Yes | Yes |
| Stable | Yes | Yes | No |
| Worst case runtime | O(n²) | O(n log n) | O(n²) |
| Average case runtime | O(n²) | O(n log n) | O(n log n) |
| Best case runtime | O(n) | O(n log n) | O(n log n) |
We shall see later that O(n log n) is the best we can do for comparison sorts.
Sort the following array: [11, 6, 3, 24, 46, 22, 7]
The real quicksort is in place. Here, we use the first element as pivot.
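A sketch of in-place quicksort with the first element of each range as pivot, using a Lomuto-style partition (the function name is ours):

```python
def quicksort(A, lo=0, hi=None):
    """Sort A[lo:hi+1] in place and return A."""
    if hi is None:
        hi = len(A) - 1
    if lo < hi:
        pivot = A[lo]
        # Partition: i marks the end of the "< pivot" region
        i = lo
        for j in range(lo + 1, hi + 1):
            if A[j] < pivot:
                i += 1
                A[i], A[j] = A[j], A[i]
        # Put the pivot between the two regions
        A[lo], A[i] = A[i], A[lo]
        quicksort(A, lo, i - 1)
        quicksort(A, i + 1, hi)
    return A
```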
Apply the Hoare partition scheme on [11, 6, 3, 24, 46, 22, 7]
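A Python sketch of the Hoare partition scheme, again taking the first element as pivot (the function name is ours):

```python
def hoare_partition(A, lo=0, hi=None):
    """Partition A[lo:hi+1] around pivot A[lo]. Returns an index j such
    that every element of A[lo:j+1] is <= every element of A[j+1:hi+1]."""
    if hi is None:
        hi = len(A) - 1
    pivot = A[lo]
    i, j = lo - 1, hi + 1
    while True:
        # Scan from the left for an element >= pivot
        i += 1
        while A[i] < pivot:
            i += 1
        # Scan from the right for an element <= pivot
        j -= 1
        while A[j] > pivot:
            j -= 1
        if i >= j:
            return j
        A[i], A[j] = A[j], A[i]
```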
Illustrate the partition operation on the array A = [13, 19, 9, 5, 12, 8, 7, 4, 21, 2, 6, 11].
Without sorting, implement an algorithm that finds the k-th smallest element.
The algorithm must be in place and O(nlogn) on average.
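One possible solution is quickselect (a sketch; the function name is ours). It partitions in place and, with a random pivot, runs in expected O(n), which satisfies the O(n log n) average-case requirement.

```python
import random

def quickselect(A, k):
    """Return the k-th smallest element of A (k = 1 is the minimum).
    Partitions in place; expected O(n) with a random pivot."""
    lo, hi = 0, len(A) - 1
    k -= 1  # convert to 0-based rank
    while True:
        if lo == hi:
            return A[lo]
        # Move a random pivot to the front, then partition around it
        p = random.randint(lo, hi)
        A[lo], A[p] = A[p], A[lo]
        pivot, i = A[lo], lo
        for j in range(lo + 1, hi + 1):
            if A[j] < pivot:
                i += 1
                A[i], A[j] = A[j], A[i]
        A[lo], A[i] = A[i], A[lo]
        # Recurse only into the side containing rank k
        if k == i:
            return A[i]
        elif k < i:
            hi = i - 1
        else:
            lo = i + 1
```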
The algorithm may perform poorly (e.g. on an already-sorted array, where the first element is the worst possible pivot). One idea would be to choose a random pivot.
Implement Quicksort with a random pivot.
Fast multiplication of large numbers is essential in CAS, cryptography, etc.
For a long time, people thought that the multiplication algorithm we learn in primary school was optimal.
Assume we have two numbers with n digits.
Assume a, b, c, d are digits and calculate the product (10a+b)×(10c+d). Compare the result with (a+b)(c+d).
Hence, use this to calculate 42×14 using 3 multiplications.
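With a = 4, b = 2, c = 1, d = 4, the three multiplications are ac = 4, bd = 8 and (a+b)(c+d) = 6·5 = 30, so ad + bc = 30 − 4 − 8 = 18:

```latex
42 \times 14 = 100\,ac + 10\bigl((a+b)(c+d) - ac - bd\bigr) + bd
             = 400 + 10 \cdot 18 + 8 = 588.
```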
Use Karatsuba's trick to calculate 27×32 in 3 multiplications.
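A recursive Python sketch of Karatsuba multiplication, splitting each number at half its digit count (the function name is ours):

```python
def karatsuba(x, y):
    """Multiply non-negative integers using 3 recursive multiplications."""
    if x < 10 or y < 10:
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    a, b = divmod(x, p)  # x = a*p + b
    c, d = divmod(y, p)  # y = c*p + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    mid = karatsuba(a + b, c + d) - ac - bd  # = ad + bc
    return ac * p * p + mid * p + bd
```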
Calculate the runtime of Karatsuba's algorithm.
What about if we only did divide-and-conquer without applying Karatsuba's trick?
Hint: a^{log_b c} = c^{log_b a}
Given n distinct points p_1, …, p_n ∈ ℝ², find i, j such that

dist(p_i, p_j) = min { dist(p_k, p_l) : k, l = 1, …, n, k ≠ l }.
Is it possible to do better than O(n2) for Step III?
If the closest pairs on each side have distances δ₁, δ₂, we only need to look at a distance δ := min{δ₁, δ₂} from the line.
Sort these points by their y-coordinate (O(n log n)). We now need only check each pair against their immediate neighbours.
Assume s_1, …, s_k are sorted according to their y-coordinate. If |i − j| > 11, then dist(s_i, s_j) ≥ δ.
The runtime of our closest pair algorithm is O(n log² n).
The running time of a comparison sorting algorithm is at best O(nlogn).
Hint: log n! = ∑_{i=1}^{n} log i ≈ ∫₁ⁿ log x dx = Θ(n log n)
Input array:

| 3 | 2 | 2 | 7 | 2 | 4 | 6 | 1 | 8 | 1 |
|---|---|---|---|---|---|---|---|---|---|

Count array, indexed by value:

| 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
|---|---|---|---|---|---|---|---|---|---|
The running time of counting sort is O(n+k), where k is the number of admissible elements.
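A stable counting-sort sketch in Python for integers in {0, …, k−1} (names are ours):

```python
def counting_sort(A, k):
    """Stably sort a list of integers in range(k). O(n + k)."""
    count = [0] * k
    for x in A:
        count[x] += 1
    # Prefix sums: count[v] becomes the number of elements <= v
    for v in range(1, k):
        count[v] += count[v - 1]
    out = [0] * len(A)
    # Traverse the input in reverse to keep the sort stable
    for x in reversed(A):
        count[x] -= 1
        out[count[x]] = x
    return out
```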
A sorting algorithm is called stable if it preserves the order of records with equal keys.
Which of the following sorting algorithms are stable: insertion sort, merge sort, and quicksort? Give a simple scheme that makes any comparison sort stable. How much additional time and space does your scheme entail?
How would you transform any comparison algorithm into a stable algorithm?
Write a stable algorithm that sorts integers on their last digit.
An active open problem is the conjecture that a list of integers can be sorted in O(n) time. Currently, the best known algorithm sorts in O(n log log n).
Counting sort does not settle that conjecture: it is O(n) only when k = O(n).
Use a stable counting sort to sort on the last digit, then on the penultimate, etc.
Sort the list 717, 026, 785, 636, 955, 572, 772, 304, 935, 395.

Use radix sort to sort the following words: COW, DOG, SEA, RUG, ROW, MOB, BOX, TAB, BAR, EAR, TAR, DIG, BIG, TEA, NOW, FOX.
Implement Radix sort in Python. What is the invariant? Prove the algorithm is correct. What is its runtime?
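One possible implementation (a sketch, not the only way): LSD radix sort in base 10, built on a stable counting sort of each digit. The invariant is that after pass i, the list is sorted by its last i digits.

```python
def radix_sort(A):
    """Return a sorted copy of a list of non-negative integers."""
    if not A:
        return A
    exp = 1
    while max(A) // exp > 0:
        # Stable counting sort on digit (x // exp) % 10
        count = [0] * 10
        for x in A:
            count[(x // exp) % 10] += 1
        for d in range(1, 10):
            count[d] += count[d - 1]
        out = [0] * len(A)
        for x in reversed(A):  # reverse order preserves stability
            count[(x // exp) % 10] -= 1
            out[count[(x // exp) % 10]] = x
        A = out
        exp *= 10
    return A
```

Stability of the per-digit sort is what makes the invariant (and hence correctness) go through: ties on the current digit keep the order established by the previous passes.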
How would you sort n integers in the range 0 to n³ − 1 in O(n) time?
Runtime of counting sort on {1, …, k}: O(n + k).

Runtime of radix sort on {1, …, k}: O(log₁₀ k) passes of cost O(n + 10) each, i.e. O(n log k).

Observe that for a general base b, radix sort works in O((n + b) log_b k).

We can check that this is optimal when b = n.

Prove the correctness of radix sort.
Calculate
∫₀¹ e^{i2πnt} dt.

Discuss in terms of n.
Signals often take the shape:
f(t) = ∑_{k∈ℤ} c_k e^{i2πkt}

How could we find the values c_k?
What happens if we integrate both sides?
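Multiplying both sides by e^{−i2πmt} and integrating over one period, orthogonality leaves a single term:

```latex
\int_0^1 f(t)\, e^{-i 2\pi m t}\, dt
  = \sum_{k \in \mathbb{Z}} c_k \int_0^1 e^{i 2\pi (k - m) t}\, dt
  = c_m,
```

since ∫₀¹ e^{i2πnt} dt equals 1 when n = 0 and 0 otherwise.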
Instead of an actual function, we'll often have a discretization, so that the most common definition is
X_k := ∑_{n=0}^{N−1} x_n e^{−i2πnk/N}
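A naive O(N²) Python sketch of this definition (the function name is ours):

```python
import cmath

def dft(x):
    """Discrete Fourier transform: X_k = sum_n x[n] * e^{-2*pi*i*n*k/N}."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * n * k / N) for n in range(N))
            for k in range(N)]
```

The fast Fourier transform computes the same N values in O(N log N) by divide-and-conquer.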