A common problem archetype is the following: “given $n$ tasks, order them in some optimal way” or “find an order of tasks satisfying a certain condition”. I will outline some common methods to solve these problems below.
Total Orders
In many cases, it will be possible to define a total order on the tasks; that is, it’s possible to assign each task a “priority” such that it’s always optimal to do high-priority tasks before low-priority ones.
An example of a non-total order is rock-paper-scissors: the three moves can’t be placed into a total order (e.g. rock > paper > scissors) because they beat each other cyclically.
Tasks and Deadlines
(CSES) You have to process $n$ tasks. Each task has a duration and a deadline, and you will process the tasks in some order one after another. Your reward for a task is $d - f$, where $d$ is its deadline and $f$ is your finishing time. (The starting time is $0$, and you have to process all tasks even if a task would yield negative reward.)
The first sneaky observation is that the deadline term doesn’t actually matter, since we can rewrite the total reward as $\sum_i (d_i - f_i) = \sum_i d_i - \sum_i f_i$, and note that since we have to complete all the tasks, $\sum_i d_i$ is constant over all possible orderings. Therefore, we can reduce this problem to minimizing $\sum_i f_i$ instead.
As it turns out, if we want to minimize this quantity, we should always do tasks in order of low to high duration! There are several ways to show this, but one particularly nice way is the exchange argument.
Exchange Argument
The exchange argument is an example of local analysis. Here’s how it works:
Suppose we’ve found an optimal ordering. Then, swapping any two adjacent elements in this ordering certainly cannot produce a better one; otherwise, our assumption of optimality would be contradicted. This simple, local condition can often help us greatly reduce the complexity of ordering problems.
Let’s apply this idea to the CSES problem above. Say we’ve ordered the tasks in a way such that the $i$-th task has duration $t_i$ and is completed at time $f_i$. Then, how does our exchange condition constrain the relationship between $t_1$ and $t_2$?
Consider swapping the 1st and 2nd tasks. Then, the first observation we should make is that $f_3, f_4, \ldots, f_n$ will all remain unchanged. Therefore, we only need to consider how $f_1 + f_2$ changes! This is why considering adjacent swaps in particular is so useful: in many cases, it will leave the suffix unchanged, leading to a much simpler analysis.
Going back to our problem: the original order finishes the first two tasks at times $t_1$ and $t_1 + t_2$, while the swapped order finishes them at times $t_2$ and $t_2 + t_1$. So if our order is optimal, the following must hold:
$$f_1 + f_2 = 2t_1 + t_2 \le 2t_2 + t_1,$$
which rearranges to $t_1 \le t_2$.
If we repeat this argument for other indices, we can indeed show that in an optimal order, $t_i \le t_{i+1}$ for all $i$, as desired.
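In code, the whole algorithm is just a sort and a scan. Here’s a minimal C++ sketch, assuming the CSES input format ($n$, then each task’s duration and deadline):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<pair<long long, long long>> tasks(n); // {duration, deadline}
    for (auto& [dur, ddl] : tasks) cin >> dur >> ddl;

    // Exchange argument: shorter tasks first minimizes the sum of finish times.
    sort(tasks.begin(), tasks.end());

    long long finish = 0, reward = 0;
    for (auto& [dur, ddl] : tasks) {
        finish += dur;           // finishing time f of this task
        reward += ddl - finish;  // d - f, possibly negative
    }
    cout << reward << '\n';
}
```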
Note that in this problem, this condition ends up being both necessary and sufficient: since (up to ties) there’s only one way for this condition to be satisfied, there’s no need to worry about finding a suboptimal local minimum. However, this blog has some educational examples of situations in which we need to be slightly more careful. In particular, it considers the following problem:
Task Scheduling
There are $n$ jobs, and each job consists of two parts. The first part must be done in center A, the second part must be done in center B, each center can do at most one job at a time, and the second part of the $i$-th job can be started only if the first part of the $i$-th job is already done. Find the minimum amount of time required to complete all the jobs, assuming you can arrange them in any order.
(Figure: an illustration of the scheduling procedure described above. The two parts of each job are assigned the same color; notice how the second red part can only begin once the first red part has finished.)
If we try applying the same exchange argument as above, letting $a_i$ and $b_i$ denote the time the $i$-th job (in our chosen order) spends in centers A and B respectively, we’ll end up with the condition that
$$\min(a_i, b_{i+1}) \le \min(a_{i+1}, b_i)$$
for all $i$. The exchange argument guarantees that this condition is necessary, but it ultimately turns out not to be sufficient (for more details, consult the blog). In order for this comparator to work, we have to make it slightly less “relaxed” with a tie-break: for instance, placing jobs with $a_i \le b_i$ before jobs with $a_i > b_i$.
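To see why ties are dangerous, here’s that comparator as a self-contained C++ sketch (the array names a and b are my own):

```cpp
#include <bits/stdc++.h>
using namespace std;

// a[i], b[i]: times job i needs in centers A and B.
vector<long long> a, b;

// The comparator suggested by the raw exchange argument. When
// min(a[i], b[j]) == min(a[j], b[i]), neither job compares "before"
// the other, and this tie relation is not transitive -- so the
// comparator is not a strict weak ordering, and passing it to
// std::sort without a tie-break is undefined behavior.
bool before(int i, int j) {
    return min(a[i], b[j]) < min(a[j], b[i]);
}

int main() {
    // Jobs 0 and 1 tie, jobs 1 and 2 tie, yet job 2 is strictly
    // before job 0: the "tie" relation is not transitive.
    a = {2, 1, 1};
    b = {1, 1, 2};
    cout << boolalpha
         << before(0, 1) << ' ' << before(1, 0) << '\n'   // false false (tie)
         << before(1, 2) << ' ' << before(2, 1) << '\n'   // false false (tie)
         << before(2, 0) << '\n';                         // true (strict)
}
```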
Johnson’s Rule
However, I’d like to present a cleaner line of thinking. Actually, at first glance, it’s not at all obvious that a total order will even exist. Couldn’t it be possible that $i \prec j$ and $j \prec k$, but $k \prec i$ (here, $i \prec j$ means it’s better to perform task $i$ before task $j$)?
Let’s change our perspective and instead analyze $W$, the amount of time for which only center A is operating. Center B works for exactly $\sum_i b_i$ time in total, and whenever it sits idle, center A must still be running; therefore, the total time taken, $T$, will simply be $\sum_i b_i + W$, and since $\sum_i b_i$ is constant, we can turn our attention to minimizing $W$ instead.
Now, there turns out to be a very nice way of visualizing $W$: draw a path that starts at height $0$ and, for each task in order, first rises by $a_i$ and then falls by $b_i$.
$W$ is just the maximum height reached! Algebraically:
$$W = \max_{1 \le k \le n} \left( \sum_{i=1}^{k} a_i - \sum_{i=1}^{k-1} b_i \right).$$
Take a moment to make sure you understand why.
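If you’d like to verify the identity numerically, here’s a short C++ sketch (the example values are arbitrary) that computes $T$ both by simulating the two centers and via $T = \sum_i b_i + W$:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Makespan of a fixed order, by direct simulation of the two centers.
long long simulate(const vector<long long>& a, const vector<long long>& b) {
    long long doneA = 0, doneB = 0;
    for (size_t i = 0; i < a.size(); i++) {
        doneA += a[i];                     // center A runs without gaps
        doneB = max(doneB, doneA) + b[i];  // B waits for part 1 if needed
    }
    return doneB;
}

// Makespan via T = sum(b) + W, where W is the maximum height of the path.
long long viaFormula(const vector<long long>& a, const vector<long long>& b) {
    long long height = 0, W = LLONG_MIN, sumB = 0;
    for (size_t i = 0; i < a.size(); i++) {
        height += a[i];  // rise by a_i
        W = max(W, height);
        height -= b[i];  // fall by b_i
        sumB += b[i];
    }
    return sumB + W;
}

int main() {
    vector<long long> a = {3, 1, 4}, b = {2, 5, 1};
    cout << simulate(a, b) << " == " << viaFormula(a, b) << '\n'; // 11 == 11
}
```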
Hopefully, from this perspective, it’s slightly clearer why a total order emerges. The first observation we should make is that we should put all tasks with $a_i \le b_i$ before tasks with $a_i > b_i$: visually, that means we want to build as deep a valley as possible before climbing back up.
So, we can now consider the subproblem where $a_i \le b_i$ holds for all $i$. In this case, how should we arrange our tasks?
Solution
It turns out that it is always optimal to sort in increasing order of $a_i$.
Now, how should we arrange tasks in the other subproblem, where $a_i > b_i$ holds for all $i$?
Solution
By symmetry, we should sort in decreasing order of $b_i$.
Putting it all together, we’ve just discovered Johnson’s rule!
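Here’s how the complete rule might look in C++; a minimal sketch, assuming an input format of $n$ followed by each job’s $a_i$ and $b_i$:

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<pair<long long, long long>> jobs(n); // {a_i, b_i}
    for (auto& [a, b] : jobs) cin >> a >> b;

    // Johnson's rule: jobs with a <= b first, in increasing order of a
    // (digging the valley), then jobs with a > b in decreasing order of b
    // (climbing back up).
    vector<pair<long long, long long>> valley, climb;
    for (auto& j : jobs) (j.first <= j.second ? valley : climb).push_back(j);
    sort(valley.begin(), valley.end());  // increasing a
    sort(climb.begin(), climb.end(),
         [](auto& x, auto& y) { return x.second > y.second; });  // decreasing b
    valley.insert(valley.end(), climb.begin(), climb.end());

    // Makespan: center A runs without gaps; B waits for each first part.
    long long doneA = 0, doneB = 0;
    for (auto& [a, b] : valley) {
        doneA += a;
        doneB = max(doneB, doneA) + b;
    }
    cout << doneB << '\n';
}
```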
Problems
The next two problems involve using total orders to reduce the task of finding an ordered subset to finding an unordered subset instead.