Is the time complexity of the empty algorithm O(0)?


Solution 1

From Wikipedia:

A description of a function in terms of big O notation usually only provides an upper bound on the growth rate of the function.

From this description, since the empty algorithm requires 0 time to execute, its performance has an upper bound of O(0). This means it's also O(1), which happens to be a larger upper bound.

Edit:

More formally, from CLR (1st ed., p. 26):

For a given function g(n), we denote by O(g(n)) the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

The asymptotic time performance of the empty algorithm, executing in 0 time regardless of the input, is therefore a member of O(0).
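
To spell the membership out (instantiating the definition above with g(n) = 0): let f(n) = 0 be the running time of the empty algorithm. Then, with c = 1 and n0 = 1,

    0 ≤ f(n) = 0 ≤ c·g(n) = 0   for all n ≥ n0,

so f(n) ∈ O(0). Conversely, any f(n) ∈ O(0) must satisfy f(n) = 0 for all n ≥ n0, so O(0) is exactly the set of functions that are eventually zero, and the empty algorithm's running time is one of them.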

Edit 2:

We all agree that 0 is O(1). The question is, is 0 O(0) as well?

Based on the definitions, I say yes.

Furthermore, I think there's a bit more significance to the question than many answers indicate. By itself the empty algorithm is probably meaningless. However, whenever a non-trivial algorithm is specified, the empty algorithm can be thought of as lying between consecutive steps of that algorithm, as well as before and after them. It's nice to know that "nothingness" does not impact the algorithm's asymptotic time performance.

Edit 3:

Adam Crume makes the following claim:

For any function f(x), f(x) is in O(f(x)).

Proof: let S be a subset of R and T be a subset of R* (the non-negative real numbers), let f(x): S → T, and let c ≥ 1. Then 0 ≤ f(x) ≤ f(x), which leads to 0 ≤ f(x) ≤ c·f(x) for all x ∈ S. Therefore f(x) ∈ O(f(x)).

Specifically, if f(x) = 0 then f(x) ∈ O(0).

Solution 2

It takes the same amount of time to run regardless of the input; therefore it is O(1) by definition.

Solution 3

Several answers say that the complexity is O(1) because the time is a constant and the time is bounded by the product of some coefficient and 1. Well, it is true that the time is a constant and it is bounded that way, but that doesn't mean that the best answer is O(1).

Consider an algorithm that runs in linear time. It is ordinarily designated as O(n), but let's play devil's advocate. Its time is also bounded by the product of some coefficient and n^2. If we consider O(n^2) to be a set (the set of all algorithms whose running time is bounded by a constant times n^2), then linear algorithms are in that set. But that doesn't mean the best answer is O(n^2).

The empty algorithm is in O(n^2) and in O(n) and in O(1) and in O(0). I vote for O(0).
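
As a rough sketch of this point, one can spot-check the big-O condition numerically over a finite range (a real proof has to cover all n ≥ n0; this only samples). The helper bounded_by and the sample cost functions below are made up purely for illustration:

    # Finite spot-check of the big-O condition: 0 <= f(n) <= c*g(n) for n0 <= n <= n_max.
    def bounded_by(f, g, c=1.0, n0=1, n_max=10_000):
        return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

    linear = lambda n: 3 * n          # a linear-time cost, ordinarily written O(n)
    empty = lambda n: 0               # the empty algorithm: zero cost

    print(bounded_by(linear, lambda n: n * n, c=3))   # True: linear is also in O(n^2)
    print(bounded_by(linear, lambda n: 0))            # False: but it is not in O(0)
    print(bounded_by(empty, lambda n: 0))             # True: 0 is in O(0) ...
    print(bounded_by(empty, lambda n: 1))             # True: ... and in O(1), O(n), O(n^2)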

Solution 4

I have a very simple argument for the empty algorithm being O(0): For any function f(x), f(x) is in O(f(x)). Simply let f(x)=0, and we have that 0 (the runtime of the empty algorithm) is in O(0).

On a side note, I hate it when people write f(x) = O(g(x)), when it should be f(x) ∈ O(g(x)).

Solution 5

Big O is asymptotic notation. To use big O, you need a function - in other words, the expression must be parametrized by n, even if n is not used. It makes no sense to say that the number 5 is O(n); it's the constant function f(n) = 5 that is O(n).

So, to analyze time complexity in terms of big O you need a function of n. Your algorithm arguably always takes 0 steps, but without a varying parameter, talking about asymptotic behaviour makes no sense. Assume that your algorithm is parametrized by n. Only then may you use asymptotic notation. It makes no sense to say that it is O(n²), or even O(1), if you don't specify what n is (or the variable hidden in O(1))!

As soon as you settle on the number of steps, it's a matter of the definition of big O: the function f(n) = 0 is O(0).

Since this is a low-level question, the answer depends on the model of computation. Under "idealistic" assumptions, it is possible to do nothing at all. But in Python you cannot write def f(x): with an empty body, only def f(x): pass. If you assume that every instruction, even pass (a NOP), takes time, then the running time is f(n) = c for some constant c, and unless c = 0 you can only say that f is O(1), not O(0).
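
As a rough, machine-dependent illustration of that last point (a sketch, not a proof): timing a Python function whose body is just pass shows a small but strictly positive, roughly constant cost per call, which is why under this model the best claim is O(1). The names empty, calls, and per_call are made up for the example:

    import timeit

    def empty(x):
        pass  # nothing to do, but the call, the pass, and the implicit return still cost time

    # Per-call time is roughly constant and strictly positive, independent of the argument.
    calls = 1_000_000
    per_call = timeit.timeit(lambda: empty(10**9), number=calls) / calls
    print(f"~{per_call:.1e} seconds per call")  # small, but not zero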

It's worth noting that big O by itself does not have anything to do with algorithms. For example, you may say sin x = x + O(x³) when discussing Taylor expansions. Also, O(1) does not mean constant; it means bounded by a constant.
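
For instance, a quick numerical check (my own sketch) of what "sin x = x + O(x³) as x → 0" asserts, namely that |sin x − x| is bounded by a constant times x³ for small x:

    import math

    # Near 0, |sin x - x| / x**3 stays bounded (it approaches 1/6, the Taylor coefficient).
    for x in (0.1, 0.01, 0.001):
        print(x, abs(math.sin(x) - x) / x**3)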

Comments

  • jyoungdev
    jyoungdev almost 2 years

    So given the following program:


    Is the time complexity of this program O(0)? In other words, is 0 O(0)?

    I thought answering this in a separate question would shed some light on this question.

    EDIT: Lots of good answers here! We all agree that 0 is O(1). The question is, is 0 O(0) as well?

  • Tim Goodman
    Tim Goodman almost 14 years
    I just hope no one edits wikipedia to cite this post. Potential stack overflow...
  • codekaizen
    codekaizen almost 14 years
    Wouldn't it rather be a hijacked EIP?
  • Hamish Grubijan
    Hamish Grubijan almost 14 years
    I have no problem with O(1) and O(2) being "the same", but O(0) feels a little special to me; I would tread carefully here.
  • Dave O.
    Dave O. almost 14 years
    @Hamish Grubijan The problem is that we're used to the special properties of 0 regarding multiplication. In fact, O(1) is really only a representative for O(c), the whole class of functions running in CONSTANT time, regardless of the actual time span. 0 is a constant time span. Therefore O(0) = O(1) = ... = O(c) for any natural number c.
  • Dave O.
    Dave O. almost 14 years
    +1, constant time is what matters! Although your conclusion is suboptimal, giving the impression that O(1) != O(0), which is not the case.
  • jk.
    jk. almost 14 years
    it takes no time to run regardless of input, therefore it is O(0) by definition.
  • jyoungdev
    jyoungdev almost 14 years
    Good points. My intent with the program given is that it takes absolutely 0 time. Doesn't exist in real life except, maybe, an empty inline function that a compiler would optimize away altogether.
  • starblue
    starblue almost 14 years
    In the real world computers do have time steps, they are called clock cycles.
  • sdcvvc
    sdcvvc almost 14 years
    @Dave: O(1) is not O(0), please check the definition of big O. O(0) includes only the f(n)=0 function.
  • Greg Kuperberg
    Greg Kuperberg almost 14 years
    starblue, you're right that one CPU has clock cycles. But if you have a parallel computer with several CPUs, their clocks could be asynchronous. In any case the clock cycle time is extremely small. I'll edit the question to clarify.
  • andand
    andand almost 14 years
    You didn't answer the question. Does the empty algorithm have O(0) time complexity?
  • andand
    andand almost 14 years
    @Dave: O(0) may be a member of O(1), but the reverse is not true. Therefore O(0) != O(1).
  • Andreas Rejbrand
    Andreas Rejbrand almost 14 years
    I hate when people start a new sentence with a "Where" that should have been a part of the previous sentence.
  • sidgeon smythe
    sidgeon smythe almost 14 years
    It is O(1), but the question is whether it is also O(0).
  • Adam Crume
    Adam Crume almost 14 years
    O(O(0)) is not even a meaningful construct. O(0) is a set of functions, not a function.
  • Michael Foukarakis
    Michael Foukarakis almost 14 years
    I think we all know that 'is' == '=' == '∈', in this context. They've been used interchangeably for as long as I can recall.
  • Adam Crume
    Adam Crume almost 14 years
    Nothing personal, but "it's always been this way" is one of my least favorite arguments. (I heard it all the time at a previous employer using 70s era technology, but I digress.) I'm a mathematician at heart, and correctness is what matters.
  • andand
    andand almost 14 years
    The OP noted that nobody disputes that the empty algorithm is O(1). He wants to know whether it is also O(0).
  • Tim Goodman
    Tim Goodman almost 14 years
    The examples are for illustration purposes only, not intended as a formal proof of anything. Technically you could say that something O(n) is also O(5n) and still meet the strict definition, but the convention is to say O(n). I think it's better to be consistent.
  • sidgeon smythe
    sidgeon smythe almost 14 years
    Anything that's O(5n) is also O(n) and vice versa, but not everything that's O(1) is O(0) — only the zero function is. So that convention isn't perfectly relevant here.
  • sidgeon smythe
    sidgeon smythe almost 14 years
    I agree with you. But interestingly, although formally O(g(x)) is defined as a class of functions (so subset notation would be correct), informally everyone thinks of O(g(x)) not as a class but as meaning "some function such that…". And the "=" is not a symmetric equality sign, but just means "is" (Knuth explicitly points this out, along with "Aristotle is a man, but a man isn't necessarily Aristotle"). And when used in analysis, in expressions like e^x = 1 + x + O(x^2) (near 0), the addition is defined as a set again, but the informal perspective shines through. :-)
  • jyoungdev
    jyoungdev almost 14 years
    Using = for f(x) = O(g(x)) caused me much unnecessary confusion in college, especially since I don't remember any instructors or textbooks ever letting us know what = means in this context.
  • andand
    andand almost 14 years
    Except that the convention applies when there is a strictly positive time required to perform the function. That's not the case when absolutely no time is required for the algorithm, as is the case when there's nothing to do (i.e. the empty algorithm).
  • Mau
    Mau almost 14 years
    The fact is, there can't be an empty program (though there can be an empty algorithm).
  • Stewart
    Stewart almost 14 years
    @andand O(0) is a proper subset, not a member, of O(1).
  • Stewart
    Stewart almost 14 years
    You could equally argue that the empty algorithm grows like 0n^2, and therefore you should say it's O(n^2). Which is true, but equally beside the point.
  • andand
    andand almost 14 years
    @Stewart: Yes. I should have been more careful. 0 is a member of O(1) while O(0) is a subset of O(1).
  • Yinda Yin
    Yinda Yin almost 12 years
    O(0) is not meaningful for ordinary use, since the only algorithm that can fulfill it is the empty algorithm.
  • Stewart
    Stewart almost 12 years
    You could make your algorithm a function that takes a variable-length array as a dummy parameter. If you're talking about an entire program being O(0), you could consider it to be parameterised by its command line arguments.
  • andand
    andand over 11 years
    @RobertHarvey since this question is about the empty algorithm, it has meaning for this context.
  • Yinda Yin
    Yinda Yin over 11 years
    @andand It's just mental masturbation, that's all I'm sayin'.
  • Steve Zelaznik
    Steve Zelaznik over 8 years
    Unless somebody discovers a way to transmit information faster than the speed of light, O(1) will be the lower boundary of any operation.
  • TWiStErRob
    TWiStErRob over 8 years
    I agree with your point on NOP; even a C void f(void) {} takes time, because it has an implicit return and it has to be called (= stack manipulation takes place to push the return address and jump to the code), which also takes time. @sdcvvc Do we consider this stack bookkeeping part of the time to execute an algorithm? I think so. At the same time, if we consider a truly empty algorithm (not even an assembly-level NOP, i.e. no instructions at all, like the OP asked), that's O(0).
  • BD107
    BD107 almost 4 years
    Side note: unwinding definitions, $f(x) \in O(0)$ if and only if there exists some X for which $f(x) \leq 0$ so long as $x \geq X$. This is a perfectly reasonable definition.