Asymptotic Upper Bounds

  • Appreciate that Big-Oh describes an upper bound on the running time of a program.

Recall this exercise from the last chapter:

Your intern is told to survey the literature for potential solutions to this computational task. Upon studying the literature, the intern finds three algorithms that solve the problem at hand. The algorithms have the following runtimes: $\Omicron(\lg n)$, $\Omicron(2^n)$, and $\Omicron(n)$. Which algorithm will you choose to implement?

Relying only on the mathematical notion of Big-Oh, all we can say is that these three algorithms might each take a logarithmic amount of time to run, because $\lg n \in \Omicron(\lg n)$, $\lg n \in \Omicron(n)$, and $\lg n \in \Omicron(2^n)$.
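To see why all three memberships hold, recall the standard definition of Big-Oh: $f(n) \in \Omicron(g(n))$ exactly when some constant multiple of $g$ eventually bounds $f$ from above. Formally,

$$f(n) \in \Omicron(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0.$$

Taking $c = 1$ and $n_0 = 1$ suffices in all three cases, since $\lg n \le n \le 2^n$ for every $n \ge 1$ (here $\lg$ denotes the base-2 logarithm).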

It would be mathematically justified, but very strange (and unhelpful), for the literature the intern studied to report three different logarithmic-time algorithms as being in $\Omicron(\lg n)$, $\Omicron(n)$, and $\Omicron(2^n)$, respectively.

The point is this: when we use Big-Oh to communicate how fast an algorithm is, we give the smallest Big-Oh time (the tightest upper bound) we can prove to be true.
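To make the looseness concrete, here is a minimal sketch in Python (an illustration, not from the source) that tabulates $\lg n$ next to the three candidate bounds from the exercise. The widening gap shows why reporting anything looser than $\Omicron(\lg n)$ for a logarithmic-time algorithm hides useful information.

```python
import math

# Each column is a valid upper bound on lg n for n >= 1,
# but only lg n itself is a tight one.
print(f"{'n':>6} {'lg n':>8} {'n':>8} {'2^n':>22}")
for n in [2, 8, 32, 64]:
    print(f"{n:>6} {math.log2(n):>8.1f} {n:>8} {2**n:>22}")
```

By $n = 64$, the bound $2^n$ exceeds $\lg n$ by nearly nineteen orders of magnitude, so it tells the reader almost nothing about how fast the algorithm actually is.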

Here is another example that uses Big-Oh to describe upper bounds!