How to determine memory and time complexity of an algorithm?


Determining time and memory complexities amounts to counting how much of these two resources are used when running the algorithm, and seeing how these amounts change as the input size (k in this case) changes.

Time is determined by how many times each instruction is evaluated, and space by how large the data structures involved need to grow in order to compute the solution.

In this particular scenario, you're looking at a recursive algorithm, so the analysis boils down to counting 1) how many recursive calls are made and 2) how much work is done in each of those calls.
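For concreteness, here is the function from the question transcribed into Python (a sketch only; I'm assuming k is a non-negative integer and reading k/2 as integer division):

def sample(k):
    # Base case: once k drops below 2 there is nothing left to halve.
    if k < 2:
        return 0
    # One comparison, one division, one addition, plus a recursive call on half the input.
    return 1 + sample(k // 2)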

Since the input is halved with each call, the sequence of calls will look something like this:

sample(k)       )
 sample(k/2)    )
  sample(k/4)   ) 
     ...        ) - n = number of calls 
    sample(4)   )
     sample(2)  )
      sample(1) )

Halving the input at each recursive call in this way leads to a logarithmic (base 2) number of calls.

n = log(k)
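If you want to convince yourself of this empirically, you can count the calls directly. Here is a rough sketch building on the Python transcription above (sample_counted and the calls list are names I made up for the instrumentation):

import math

def sample_counted(k, calls):
    calls[0] += 1                      # record this invocation
    if k < 2:
        return 0
    return 1 + sample_counted(k // 2, calls)

for k in (8, 1024, 2**20):
    calls = [0]
    sample_counted(k, calls)
    print(k, calls[0], math.log2(k))   # call count is log2(k) + 1 for powers of two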

At each call we store a constant number of variables on the call stack and do a constant amount of work (operations). This stems from the fact that the number of variables and the number of comparisons/additions/divisions in each call does not grow as k gets bigger.

The total time complexity is the number of calls multiplied by the amount of work done in each call, thus

time complexity = A*log(k) + B

for some constants A and B which reflect the actual time cost of making a recursive call and of doing the comparisons/divisions, respectively. Similarly:

space complexity = C*log(k) + D

for suitable constants C and D covering the space cost of a recursive call and of variable storage, respectively.

Now, in this kind of analysis we care mostly about the asymptotic complexity; that is, we don't really care about the constants, because they reflect details of the machine running the algorithm, and what we really want to know is the shape of the curve as k gets bigger. If you follow the rules for writing complexity using Big-Oh notation, you'll arrive at the result:

space complexity = time complexity = O(log(k))
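To see why the constants disappear, you can plug arbitrary values into A*log(k) + B and watch how the cost grows. This is purely illustrative; the values of A and B below are made up, not measured:

import math

A, B = 5.0, 20.0                       # made-up per-call and fixed costs
for k in (2**10, 2**20, 2**30, 2**40):
    cost = A * math.log2(k) + B
    print(k, cost)                     # each step multiplies k by 1024,
                                       # but only adds A*10 to the cost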

Edit: Memory complexity details

As I said before, memory complexity is determined by how large the data structures need to get to compute a solution. So you may ask: there are no data structures being used in this function, so where does the log(k) memory go?

The short answer: You have to store log(k) different values of the parameter k, one for each recursive call.

The detailed answer: there is an implicit data structure being used here by the mechanism of function calling (which we exploit via recursion), and its name is the call stack. Each time sample(k) is called, a new stack frame is created and a number of values are pushed onto the stack: the local value of the parameter k, the return address, and other implementation-dependent things. In this way each recursive call forms a 'layer' on the stack where its local information is stored. The whole picture ends up looking something like this:

----------- < top of stack
|  k = 1  |
|   ...   | < stack frame for sample(1)
|---------|
|  k = 2  |
|   ...   | < stack frame for sample(2)
|---------|

    ...

|---------|
| k = p/2 |
|   ...   | < stack frame for sample(p/2)
|---------|
|  k = p  |
|   ...   | < stack frame for sample(p)
|---------|
|         | < stack frame for main() or whatever 
              initially called sample(p) 
              (we don't count this one)

(To avoid confusion, I've used p here for the initial parameter value, as distinct from the value of k inside each recursive call.)

The main thing to note is that, since there are n = log(k) recursive calls (k being the initial input, i.e. p in the picture above), there are n such stack frames. Each stack frame has constant size, and thus the space complexity is O(log(k)).
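If you want to watch those frames being created, you can thread an explicit depth counter through the recursion. A small sketch based on the Python transcription above (sample_traced and the depth parameter are made up for illustration):

def sample_traced(k, depth=0):
    # 'depth' mirrors how many sample() frames are currently on the stack.
    print("  " * depth + f"sample({k})  -- frame {depth + 1}")
    if k < 2:
        return 0
    return 1 + sample_traced(k // 2, depth + 1)

sample_traced(16)   # prints 5 frames, for k = 16, 8, 4, 2, 1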


Comments

  • user3138212 over 1 year

    I am not good at determining time and memory complexities and would appreciate it if someone could help me out.

    I have an algorithm here, and I am not sure what its time and memory complexities would be.

    Function sample(k)
       IF k < 2
           Return 0
       Return 1 + sample(k/2)
    

    What is its time and memory complexity and why?

    Thanks

  • RichardPlunkett over 10 years
    note that with the style of tail recursion present, the space part could easily be eliminated, leaving O(log k) time and O(1) space (see the sketch after these comments).
  • user3138212 over 10 years
    @EzequielMuns Thanks for your answer. I'm still not quite sure why the space complexity is log(k). I would appreciate it if you could explain further.
  • Ezequiel Muns over 10 years
    @user3138212 See extended answer :)
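Following up on RichardPlunkett's comment above: the recursion can be replaced by a plain loop that reuses a constant number of variables, which keeps the O(log k) running time but drops the stack usage to O(1). A minimal sketch in Python (the name sample_iterative is made up):

def sample_iterative(k):
    count = 0
    while k >= 2:      # same halving as the recursive version
        k //= 2
        count += 1     # one unit of work per halving
    return count       # same result as sample(k) from the question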