Big O is for worst-case running time and Ω is for the best case, so why is Ω sometimes used for the worst case?


It is important to distinguish between the case and the bound.

Best, average, and worst are common cases of interest when analyzing algorithms.

Upper (O, o) and lower (Omega, omega) bounds, along with the tight bound Theta, are common bounds on functions.
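
(For reference, the standard definitions: f(n) = O(g(n)) means there are constants c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0, so g bounds f from above up to a constant factor; f(n) = Ω(g(n)) means f(n) ≥ c·g(n) for all n ≥ n0 for some such constants, a bound from below; and f(n) = Θ(g(n)) means both hold at once.)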

When we say "Algorithm X's worst-case time complexity is O(n)", we're saying that the function which represents Algorithm X's performance, when we restrict inputs to worst-case inputs, is asymptotically bounded from above by some linear function. You could equally speak of a lower bound on the worst-case behavior, or of an upper or lower bound on the average-case or best-case behavior.

Case != Bound. That said, "upper bound on the worst case" and "lower bound on the best case" are pretty sensible metrics: together they give absolute bounds on the performance of an algorithm. That doesn't mean we can't talk about other combinations of case and bound.

Edit to respond to your updated question:

The question asks you to show that Ω(lg n) is a lower bound on the worst-case behavior. In other words, on the inputs that force this algorithm to do the most work, the amount of work it does grows at least as fast as lg n, asymptotically. So your steps are the following: (1) identify the worst case for the algorithm; (2) find a lower bound on the algorithm's running time on inputs belonging to that worst case.

Here's an illustration of the way this would look for linear search:

In the worst case of linear search, the target item is not in the list, and all n items in the list must be examined to determine this. Therefore, a lower bound on the worst-case complexity of this algorithm is Ω(n).
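
For concreteness, here is a minimal Python sketch of linear search (the names are mine, not part of any original code), with comments marking the inputs that produce the best and worst cases:

    def linear_search(items, target):
        """Return the index of target in items, or -1 if it is absent."""
        for i, item in enumerate(items):
            if item == target:
                return i   # best case: target sits at index 0, one comparison
        return -1          # worst case: target is absent, all n items examined

    # Worst-case input for a list of size n: any list that does not contain
    # target. The loop then runs n times, so the worst-case running time is
    # bounded below by a linear function, i.e. it is Ω(n) (and also O(n),
    # hence Θ(n)).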

Important to note: for lots of algorithms, the complexity in most cases will be bounded from above and below by a common function, i.e. a Theta bound applies. So it may very well be that you won't get a different answer for Omega than you do for O, in any event.
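
And here is a minimal sketch of the textbook Max-Heapify for the exercise itself (0-indexed array; the names and the example values are mine), showing an input that forces the recursion to follow a root-to-leaf path:

    def max_heapify(a, i, n):
        """Sift a[i] down within a max-heap of size n (0-indexed array)."""
        left, right = 2 * i + 1, 2 * i + 2
        largest = i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest != i:
            a[i], a[largest] = a[largest], a[i]
            max_heapify(a, largest, n)   # recurse one level deeper

    # A worst-case input: the smallest value at the root of an otherwise valid
    # max-heap, e.g. a = [0, 7, 6, 5, 4, 3, 2] with n = 7. max_heapify(a, 0, 7)
    # then recurses at every node on a path from the root down to a leaf,
    # i.e. about floor(lg n) levels, so the worst-case running time is Ω(lg n).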

Comments

  • jantristanmilan almost 2 years

    I'm confused; I thought you use Big O for the worst-case running time and Ω for the best case? Can someone please explain?

    And isn't lg n the best case and n lg n the worst case? Or am I misunderstanding something?

    Show that the worst-case running time of Max-Heapify on a heap of size n is Ω(lg n). (Hint: For a heap with n nodes, give node values that cause Max-Heapify to be called recursively at every node on a path from the root down to a leaf.)

    Edit: no, this is not homework. I'm practicing and this has an answer key, but I'm confused. http://www-scf.usc.edu/~csci303/cs303hw4solutions.pdf Problem 4 (6.2-6)

    Edit 2: So I misunderstood the question, and it's not about Big O and Ω?

  • Zeta about 11 years
    "is worse than lg(n)" - maybe you should say "is (asymptotically) greater than".
  • jantristanmilan about 11 years
    So I misunderstood the question, and it's not about Big O and Ω?
  • jantristanmilan about 11 years
    @Zeta in our exam you have to say "worse", because if it's greater it's worse, right?
  • Montre about 11 years
    I was under the impression that the sensible metric was "upper bound on the average case", except the average case tends to be more difficult to determine than the worst case. (Consider quicksort, where the worst case is pathological.)
  • Patrick87 about 11 years
    @millimoose I would rather say that the sensible bound is the upper bound on the worst case, and that Quicksort is the exception (in that we prefer to ignore the pathological worst case in order to justify using it over a true O(n log n) worst-case algorithm). That said, I have no references or credentials to back up the claim that "upper bound on worst case" is more "meaningful" than "upper bound on average case", so I can clarify that this is my assessment, not generally accepted.
  • Montre about 11 years
    In any given program, are you constantly getting worst-case input? Usually not, so average-case complexity will be what determines your actual observed performance, for any algorithm, pathological or not. However, a corollary of this is that algorithms with a pathological worst-case complexity can lead to DOS exploits, like with the recent-ish Java parseDouble() bug. Another would be that it makes sense to use algorithms appropriate for the sort of input data you have, which is why timsort is awesome.
  • Patrick87 about 11 years
    @millimoose In general, I'd agree that the metric on which one bases the choice of an algorithm should depend on the circumstances. For instance, some situations might call for an algorithm with good average-case performance and tolerate occasional bad performance, while others might call for more consistently scalable performance. In theoretical contexts, upper/worst and lower/best are more important because they say things about where problems live in complexity classes.
  • tmakino over 6 years
    @Patrick87 Is "Therefore, a lower bound on the worst-case complexity of this algorithm is O(n)" a typo? Shouldn't it be "Therefore, a lower bound on the worst-case complexity of this algorithm is Omega(n)"?