Week 2: Divide and Conquer Algorithms
Introduction to the divide and conquer paradigm
● Divide and conquer is a problem-solving technique that breaks a complex problem into smaller subproblems that are easier to solve.
● The general idea is to divide the problem into two or more subproblems of the same or related type, recursively solve them, and then combine their solutions to obtain a solution for the original problem.
Divide and conquer algorithms have three main steps:
1. Divide: Split the problem into smaller subproblems of the same type.
2. Conquer: Solve each subproblem recursively. If the subproblem size is small enough,
solve it directly.
3. Combine: Merge the solutions of the subproblems to obtain the solution for the
original problem.
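The three steps can be seen on a small problem, finding the maximum of a list (an illustrative Python sketch, not part of the original notes):

```python
def max_dc(arr):
    """Find the maximum of a non-empty list using divide and conquer."""
    # Conquer directly: a single element is its own maximum (base case).
    if len(arr) == 1:
        return arr[0]
    # Divide: split the list into two halves.
    mid = len(arr) // 2
    # Conquer: recursively find the maximum of each half.
    left_max = max_dc(arr[:mid])
    right_max = max_dc(arr[mid:])
    # Combine: the overall maximum is the larger of the two results.
    return left_max if left_max > right_max else right_max
```

Here a simple loop would do the same job, but the structure (divide, conquer, combine) is exactly the pattern the more substantial examples below follow.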
Recursive algorithms and their analysis
● A recursive algorithm is an algorithm that calls itself with smaller or simpler inputs until it reaches a base case, where it can be solved directly without further recursion.
● Recursive algorithms are often used to implement divide and conquer algorithms, as they can easily express the recursive structure of the problem.
To analyze the performance of a recursive algorithm, we need to consider two factors:
1. The time complexity of each recursive call, which depends on the size of the input
and the operations performed in the call.
2. The number of recursive calls, which depends on how the input is divided and how
the recursion tree grows.
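For instance, the classic factorial function illustrates both factors (a minimal sketch for illustration):

```python
def factorial(n):
    # Base case: solved directly, without further recursion.
    if n <= 1:
        return 1
    # Recursive case: call with a smaller input.
    return n * factorial(n - 1)
```

Each call performs a constant amount of work (factor 1), and the input shrinks by one per call, so there are n calls in total (factor 2); together these give O(n) time.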
Examples: Merge sort, Quick sort, and Binary search
● Merge sort is a divide and conquer algorithm that sorts an array of n elements by recursively dividing it into two halves, sorting each half, and then merging them together in linear time.
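As a quick sketch (not taken from the original notes), merge sort can be written in Python as:

```python
def merge_sort(arr):
    """Sort a list using merge sort; returns a new sorted list."""
    # Base case: lists of size 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr
    # Divide: split the list into two halves.
    mid = len(arr) // 2
    # Conquer: sort each half recursively.
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Combine: merge the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The merge step walks each half once, so combining costs linear time, which is where the $\Theta(n)$ term in the recurrence below comes from.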
● Quick sort is a divide and conquer algorithm that sorts an array of n elements by recursively partitioning it around a pivot element, such that all elements smaller than the pivot are in the left part and all elements larger than the pivot are in the right part, and then sorting each part.
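A simple three-way partitioning version in Python (an illustrative sketch, not the in-place variant usually analyzed):

```python
def quick_sort(arr):
    """Sort a list using quick sort; returns a new sorted list."""
    # Base case: nothing left to partition.
    if len(arr) <= 1:
        return arr
    # Choose a pivot; here the middle element (the choice affects
    # performance on adversarial inputs, not correctness).
    pivot = arr[len(arr) // 2]
    # Partition: smaller elements left, equal in the middle, larger right.
    left = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    # Recursively sort each part and concatenate the results.
    return quick_sort(left) + equal + quick_sort(right)
```

Unlike merge sort, the work here happens in the divide (partition) step, and the combine step is a trivial concatenation.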
● Binary search is a divide and conquer algorithm that searches for a target element in a sorted array of n elements by recursively comparing it with the middle element, and then searching in the left or right half depending on the comparison result.
Analyzing time complexity using recurrence relations
● A recurrence relation is an equation that defines a function in terms of its values at smaller inputs.
● Recurrence relations can be used to model the time complexity of divide and conquer algorithms, as they capture how the problem size changes with each recursive call.
For example, the time complexity of merge sort can be expressed by the following
recurrence relation:
$$T(n) = 2T(n/2) + \Theta(n)$$
● where $T(n)$ is the time required to sort an array of n elements, $2T(n/2)$ is the time required to sort two halves of size n/2 each, and $\Theta(n)$ is the time required to merge them together.
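The recurrence can be checked numerically (an illustrative sketch, assuming $T(1) = 1$ and n a power of two, which are not stated in the notes):

```python
def T(n):
    """Unroll T(n) = 2*T(n/2) + n for n a power of two, with T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# With this base case the exact closed form is T(n) = n*log2(n) + n,
# consistent with the Theta(n log n) asymptotic solution.
```

For example, $T(8) = 2 \cdot T(4) + 8 = 2 \cdot 12 + 8 = 32 = 8 \cdot 3 + 8$.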
Master theorem for solving recurrence relations
● The master theorem is a formula that provides asymptotic solutions for recurrence relations of the form:
$$T(n) = aT(n/b) + f(n)$$
● where $a \geq 1$, $b > 1$, and $f(n)$ is an asymptotically positive function.
● The master theorem states that there are three cases depending on how $f(n)$ compares to $n^{\log_b a}$:
● Case 1: If $f(n) = O(n^{\log_b a - \epsilon})$ for some constant $\epsilon > 0$, then $T(n) = \Theta(n^{\log_b a})$.
● Case 2: If $f(n) = \Theta(n^{\log_b a})$, then $T(n) = \Theta(n^{\log_b a} \log n)$.
● Case 3: If $f(n) = \Omega(n^{\log_b a + \epsilon})$ for some constant $\epsilon > 0$, and if $af(n/b) \leq cf(n)$ for some constant $c < 1$ and all sufficiently large $n$, then $T(n) = \Theta(f(n))$.
● For example, using the master theorem, we can solve the recurrence relation for merge sort as follows:
● In this case, we have $a = 2$, $b = 2$, and $f(n) = \Theta(n)$.
● Comparing $f(n)$ with $n^{\log_b a} = n^{\log_2 2} = n$, we see that they are equal. Therefore, we are in case 2 of the master theorem.
● Applying the formula, we get $T(n) = \Theta(n^{\log_b a} \log n) = \Theta(n \log n)$, which is the time complexity of merge sort.
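As a second worked example (not in the original notes), the same procedure applies to binary search:
● Binary search satisfies the recurrence $$T(n) = T(n/2) + \Theta(1)$$
● Here $a = 1$, $b = 2$, and $f(n) = \Theta(1)$.
● Since $n^{\log_b a} = n^{\log_2 1} = n^0 = 1$ and $f(n) = \Theta(1)$ matches it, case 2 applies, giving $T(n) = \Theta(n^0 \log n) = \Theta(\log n)$.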