Welcome to this lecture on divide and conquer algorithms. In this lecture we are going to be talking about an algorithm design strategy. What is an algorithm design strategy? It is a strategy that is common to many different algorithms, something to consider when you are looking at a new problem that hasn't been considered before, for which there are no efficient algorithms yet. Let's talk about divide and conquer as a strategy. You must have seen divide and conquer algorithms in the past. What does the strategy look like?

To motivate this, let's look at an algorithm we have already encountered: Merge Sort. We met it in the very first weeks of this specialization. What is the purpose of the Merge Sort algorithm? The purpose of Merge Sort is to sort an array, say an array of numbers, in ascending order. Given an array of n numbers, Merge Sort works as follows. You split the array into two parts; assuming n is even, each part has n/2 elements. Then you sort each part recursively, and here is where the recursive magic happens: you are recursively calling Merge Sort on arrays of size n/2, which in turn recursively calls Merge Sort on arrays of size n/4, and so on, until you hit a base case. The base case happens when n is really small, like one or two. We have been through all of this and you know how the process works, so I am not going to go much deeper into it here. Then comes the final step: you have the sorted output for the first half of the array and the sorted output for the second half, and you combine them. The combining step in this case is the merge step. When we merge, we get the sorted output for the original problem, with each of the recursive subproblems solved by a recursive call to Merge Sort.

This pattern keeps recurring, and you will see it again and again this week. You will see it, for example, when we come up with some simple algorithms for solving something called the maximum subarray problem. We will use the same pattern to come up with an efficient algorithm for multiplying two numbers, and we will see it again in something called the Fast Fourier Transform. What is this pattern? Take your original input and divide it into some number of subproblems. Solve each of these subproblems recursively. Take the outputs of the subproblems and combine them. That gives you the output of your original problem. This is the general strategy for divide and conquer.

Now, there are different kinds of divide and conquer algorithms. Typically the work is done either in dividing or in combining. Sometimes the division step needs real work, sometimes it is trivial; sometimes the combining step needs work; sometimes both steps do. As an example, in Merge Sort the division step is trivial, Theta(1), because you are simply dividing the array right across the middle into one half and the other half. That is a very simple step, a constant number of operations.
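To make the divide, recurse, and combine structure concrete, here is a minimal Python sketch of Merge Sort. It is written fresh for these notes rather than taken from the earlier module, and the names merge_sort and merge are just labels for the illustration; the comments mark where the trivial division and the Theta(n) merge happen.

```python
def merge_sort(a):
    """Sort a list of numbers in ascending order (divide and conquer)."""
    n = len(a)
    if n <= 1:                        # base case: size 0 or 1 is already sorted
        return a
    mid = n // 2                      # divide: split down the middle, Theta(1)
    left = merge_sort(a[:mid])        # recursively sort the first half
    right = merge_sort(a[mid:])       # recursively sort the second half
    return merge(left, right)         # combine: Theta(n) merge of two sorted halves

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])              # append whatever remains of either half
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 7]))    # [1, 2, 5, 7, 9]
```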
The work in Merge Sort is done in combining. When you combine, you run the merge algorithm, which, if you recall, takes Theta(n) time: it is proportional to the size of the arrays being merged. We saw that the overall complexity of the whole thing on an input of size n is Theta(n log n), and we are going to see more details on this today.

Now let me contrast this with a different divide and conquer algorithm that we also studied: the Quicksort algorithm. Quicksort has a similar structure, though not the same one. Given an array, we partition it using something called a pivot element. You choose a pivot element and partition the array according to it. If you choose the pivot at random, or if you make a clever choice (we did not go through this, but there is a clever way of choosing the pivot called median of medians), then you can choose a pivot that is guaranteed to give you a reasonable subdivision. I am not going to go deeper into that; you can choose the pivot at random, and with high probability you will get good behavior. In any case, partitioning divides your input into two parts, and this is where the work is done: Theta(n) work. Then we recurse. When we come back with the sorted outputs, the combining step is trivial, just Theta(1). That is Quicksort.

Let me give you a third example we have already seen: Binary Search, which is another divide and conquer algorithm. In Binary Search you have a sorted array and you are searching for a target element in it. Based on how the middle element compares to the target, you narrow your search from the original array of size n down to a subarray of size at most n/2. You do some work here, a few comparisons, but it is still Theta(1). Then the reduced problem is solved recursively. Even though it does not fit the pattern exactly, since there is only one subproblem and nothing to combine, you can see it is still an instance of a divide and conquer algorithm.

In this module we are going to study more divide and conquer algorithms, and we are going to see how to analyze them using something called the Master Method, which we will revisit. That is the agenda for this module, and I hope to see you in the upcoming lectures. Thank you. See you soon. Bye.
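As a supplement to the Quicksort and Binary Search discussion above, here are two minimal Python sketches of those recursions. They are simple illustrations under the assumptions noted in the comments (a list-based Quicksort with a random pivot, and a recursive Binary Search over a sorted list), not the exact implementations from earlier in the specialization.

```python
import random

def quicksort(a):
    """Sort a list in ascending order using a randomly chosen pivot."""
    if len(a) <= 1:                      # base case: nothing to sort
        return a
    pivot = random.choice(a)             # random pivot: good behavior with high probability
    left  = [x for x in a if x < pivot]  # partition step: this is the Theta(n) work
    mid   = [x for x in a if x == pivot]
    right = [x for x in a if x > pivot]
    return quicksort(left) + mid + quicksort(right)  # combining is a trivial concatenation

def binary_search(a, target, lo=0, hi=None):
    """Return an index of target in the sorted list a, or -1 if it is absent."""
    if hi is None:
        hi = len(a) - 1
    if lo > hi:                          # base case: empty range, target not present
        return -1
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    if a[mid] < target:                  # one Theta(1) comparison halves the search range
        return binary_search(a, target, mid + 1, hi)
    return binary_search(a, target, lo, mid - 1)

print(quicksort([3, 6, 1, 8, 2]))        # [1, 2, 3, 6, 8]
print(binary_search([1, 2, 3, 6, 8], 6)) # 3
```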