Algorithms in O(n^2) vs O(n)


Solution 1

What you are asking about is a topic in computer science known as algorithm complexity analysis. It is an important topic to consider when writing algorithms, or solutions, in your programs because it relates to run-time, i.e. how fast your computations will run.

Big-Oh, written O(n), is an upper bound on the run-time of an algorithm. In this case, O(n) means that for n elements, all n elements may need to be considered for the computation to complete; the run-time grows linearly. The range for Big-Oh complexity runs from constant time, O(1), up to O(n^n) for very large and very slow computations. Also, consider equations such as the following:

y=10n+1
y=5n+10

These are both O(n) complexity because as the number of elements grows, the result grows in proportion to it. We neglect the constants because the growth is driven by the variable term rather than by the constant values, which never change. Compare that with an equation such as the following:

y=10n^2+5n+5 

The complexity here is O(n^2) because 10n^2 is the fastest-growing term and dominates the equation. We drop the constants and keep only the fastest-growing term when evaluating complexity.
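As an illustrative sketch of that point (the class and method names here are hypothetical, and the formula is just a stand-in for an exact step count), doubling n roughly quadruples a 10n^2 + 5n + 5 count once n is large, so the constants and the lower-order terms stop mattering:

```java
public class GrowthDemo {
    // Stand-in for an algorithm whose exact step count is 10n^2 + 5n + 5.
    static long steps(long n) {
        return 10 * n * n + 5 * n + 5;
    }

    public static void main(String[] args) {
        // The ratio steps(2n)/steps(n) approaches 4 as n grows,
        // which is exactly the O(n^2) behaviour.
        for (long n : new long[]{10, 100, 1000}) {
            System.out.println("n=" + n + "  steps=" + steps(n)
                    + "  ratio steps(2n)/steps(n)=" + (double) steps(2 * n) / steps(n));
        }
    }
}
```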

Big-Omega, by contrast, is the lower bound in algorithm complexity analysis. For example, an algorithm might be bounded below by Omega(n) and above by O(n^2). This kind of analysis is common for sorting and searching algorithms. (Note that upper and lower bounds are a separate concept from best/average/worst case: each case can be given its own bounds.)

In practice, we want to write algorithms that are efficient, especially when quicker run times matter for the program.

The code example you provided is O(n) because it uses a single for loop to iterate over n elements. Now consider a double for loop, where a second loop runs inside your current for loop. That is O(n^2), because in the worst case it iterates over n*n elements.

Java pseudo-code for an O(n^2) run-time initializing an n-by-m matrix with zeros:

int[][] result = new int[n][m];
for (int i = 0; i < n; ++i) {
    for (int j = 0; j < m; ++j) {
        result[i][j] = 0;
    }
}

Notice that it uses two nested loops, giving an O(n^2) run-time (strictly n*m assignments, which is n^2 when m equals n).
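For contrast, here is a hedged sketch of the other end of the range mentioned above: a constant-time O(1) operation next to a linear O(n) one (the class and method names are hypothetical):

```java
public class ComplexityDemo {
    // O(1): a single array access, independent of the array's length.
    static int firstElement(int[] a) {
        return a[0];
    }

    // O(n): must visit every element once.
    static int sum(int[] a) {
        int total = 0;
        for (int x : a) {
            total += x;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] a = {3, 1, 4, 1, 5};
        System.out.println(firstElement(a)); // 3
        System.out.println(sum(a));          // 14
    }
}
```

No matter how large the array gets, firstElement does the same amount of work, while sum's work grows with the array.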

(The original answer included a graph showing how these equations grow as n increases, and how fast they grow.)
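Your question also asks about O(n log n). A hedged sketch of where that shape comes from: a loop that doubles its counter runs about log2(n) times, and nesting it inside a linear loop gives roughly n * log2(n) steps, which is the pattern behind algorithms like merge sort (the class and method names here are hypothetical):

```java
public class NLogNDemo {
    // Counts iterations of a nested loop: the outer loop runs n times,
    // the inner loop doubles j each step and so runs about log2(n) times,
    // for roughly n * log2(n) iterations total.
    static long countSteps(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 1; j < n; j *= 2) {
                steps++;
            }
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(countSteps(8));    // 8 * 3 = 24
        System.out.println(countSteps(1024)); // 1024 * 10 = 10240
    }
}
```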

Solution 2

O(n) denotes an upper bound on the number of iterations, calculations or steps needed (in the worst case) for the algorithm to reach its end-state, n being the number of objects given at the start of the algorithm.

Suppose you've got an array of 5 elements, and a sorting algorithm that has O(n^2) complexity. If you apply the sort to the array, it will need on the order of 5^2 = 25 steps to reach its end-state (up to a constant factor; see the comments below).
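As a hedged illustration of that caveat (selection sort is used here as one concrete O(n^2) algorithm; the class and method names are hypothetical), sorting a 5-element array with selection sort makes exactly n(n-1)/2 = 10 comparisons, not 25 — the O(n^2) bound only describes the shape of the growth, not the exact count:

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Selection sort, an O(n^2) algorithm: for each position, scan the
    // rest of the array for the minimum. Returns the exact comparison count.
    static long sort(int[] a) {
        long comparisons = 0;
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                comparisons++;
                if (a[j] < a[min]) {
                    min = j;
                }
            }
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 1, 4, 2};
        long c = sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5]
        System.out.println(c);                  // 4+3+2+1 = 10, not 25
    }
}
```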

Also read: What is the difference between O, Ω, and Θ?

Author: Admin

Updated on January 03, 2020

Comments

  • Admin
    Admin about 4 years

    I'm new to computer science and just started with pseudocode. It's my third week in the semester and most of it is self-study. I have some questions:

    What is the difference between an O(n^2) and an O(n) algorithm? - Similarly, what is O(n log n)? - and Ω(n^2)?

    So far, I have written:

    horner = 0;
    for (i = n; i >= 0; i--)
        horner = x * horner + a[i];
    

    But found out it is O(n). How do I transform it?

    What is the run time? - I know assignment on first line is 1 operation

    And how does it look like in an actual, say C#, algorithm?

  • Dimitris Sfounis
    Dimitris Sfounis about 9 years
    I made so many mistakes and edits writing that, that I think I need an O(n^3) notation for my commentwriting as well.
  • T.C.
    T.C. about 9 years
    "it will need at most 5^2=25 steps to reach its end-state" No. It can take 3n^2, 10n^2, 1000000n^2, or n^2+10^100 steps, and it's still O(n^2).
  • T.C.
    T.C. about 9 years
    O/Θ/Ω has nothing to do with best/average/worst case. Completely different concepts.
  • Lucas Crawford
    Lucas Crawford about 9 years
    You are right, I made an edit to exchange the usage of best/average/worse case for upper bound and lower bound. Sorry about this confusion.
  • mip
    mip about 9 years
    @T.C. good point. Even n + log n + n log n + n^2 counts as O(n^2).
  • Ray
    Ray almost 7 years
    Big O notation is about the order of growth. If you run the algorithm with 10 elements, it will take some time. When you double the number, how much longer will it take? Double the time means it is O(n); quadruple the time means it is O(n^2). You can look into the "Master theorem" for more information on how to derive the big O notation from time measures.
  • Juan Sebastian Lozano
    Juan Sebastian Lozano over 4 years
    @Ray Yes, but it's about the asymptotic order of growth - so if you double 10 it might very well be more than quadruple the time, but that ratio approaches 4 as the original quantity gets bigger and bigger.