From what I've gathered, big O notation is used to describe how long an operation takes. However, that's as far as I've gotten. What exactly is big O notation, and what do the most common ones (O(n), O(2n), O(n^2), O(log n), etc.) mean?

- How exactly is google failing you? I mean what can we provide that is not already supplied better elsewhere? – Telastyn Jul 05 '13 at 03:28
- @Telastyn I don't really understand the subject matter. It's like trying to learn something with no prior knowledge. All the sites you visit use jargon you don't understand. You need school to lead you in somewhat. That's what's happening here. I don't know anything regarding algorithms. Another thing is I don't know what the base of the logarithm would be. – Cole Tobin Jul 05 '13 at 03:53
- Here is a pretty good resource for beginners: http://rob-bell.net/2009/06/a-beginners-guide-to-big-o-notation/ – Rayshawn Jul 05 '13 at 04:08
- @MichaelT that in itself is a duplicate *facepalm* – Cole Tobin Jul 05 '13 at 04:12
- Your best bet is a good CS textbook. Also, big O notation is about estimates. The base of the logarithm in O(log n) doesn't really matter. (Though it is often 2.) This graph from a comment in @RayEatmon's link is a nice visual representation: http://science.slc.edu/~jmarshall/courses/2002/spring/cs50/BigO/index.html – Gort the Robot Jul 05 '13 at 04:21
- The base does not matter _at all_. O(log2 N) is _exactly_ the same as O(log4 N) because O(f(x)) is the same as O(2*f(x)), and log2 N is 2 * log4 N. Big-O notation tells you about the relative complexity as N increases. – MSalters Jul 05 '13 at 07:28
1 Answer
I can understand where you are coming from here - I am learning this as well, and sometimes it is really helpful to have a more relaxed explanation of something when it is new. Please keep in mind that I am only just learning this as well, but this is how I understand big-Oh:
It is part of asymptotic notation, which is used to describe the running time of an algorithm: big-Omega gives a lower bound on the growth of the running time and big-Oh an upper bound, so we can be sure that the algorithm will take at least Omega, and at most big-Oh, time to run (these are often used to talk about best-case and worst-case behaviour)
Big-Oh is the upper bound, i.e. the slowest the algorithm will ever (successfully) run
O(n) is linear time - You can think of it as walking up a set of stairs, left-right-left-right. So you take one step on each stair (or loop through a list once)
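As a minimal sketch in Python (the function name `contains` is just illustrative), a single pass over a list is a typical O(n) operation:

```python
def contains(items, target):
    # One comparison per element: the loop body runs at most
    # len(items) times, so the work grows linearly with n -> O(n).
    for item in items:
        if item == target:
            return True
    return False
```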
O(n^2) is quadratic time - obviously this is much higher (will take longer) than O(n). You could think of this as taking n steps on each stair as you climb (or if there are 8 stairs, you would take 8 steps on each stair)
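A rough sketch of quadratic time, again in Python (`has_duplicate` is an illustrative name): checking every pair of elements takes about n^2/2 comparisons in the worst case, which is still O(n^2) since big-Oh ignores constant factors:

```python
def has_duplicate(items):
    # Nested loops over the same list: roughly n^2 / 2 comparisons
    # in the worst case, so the work grows as O(n^2).
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False
```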
O(log(n)) is logarithmic time - this is the holy grail for many algorithms (like searching: binary search on a sorted list runs in O(log(n)), and even the good sorting algorithms only manage O(n log(n))). Trying to describe this in the stair analogy gets a little tricky, and I don't think I quite understand it well enough to try :) But, the general idea is that as the stairs get longer and longer, you don't have to do much more work to climb them, because you are actually jumping up them
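Binary search is the classic O(log(n)) example, and it matches the halving idea: each step discards half of what is left. A minimal sketch, assuming the input list is already sorted (`binary_search` is just an illustrative name):

```python
def binary_search(sorted_items, target):
    # Each iteration halves the remaining range, so a list of n
    # elements needs at most about log2(n) iterations -> O(log n).
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found
```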
As to what the base of the logarithm is, it doesn't really matter for asymptotic notation (though it is often base 2), because changing the base only multiplies the logarithm by a constant factor, and big-Oh ignores constant factors. All log graphs (base >= 2) flatten out at high values, so they always win at the top end.
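A quick way to convince yourself of the constant-factor point, using Python's `math.log` (purely illustrative; the change-of-base rule says log_a(n) = log_b(n) / log_b(a)):

```python
import math

# The ratio of logs in two different bases is a constant that does
# not depend on n - here log2(n) / log4(n) is always 2
# (up to floating-point rounding).
for n in (10, 1000, 10**6):
    print(math.log(n, 2) / math.log(n, 4))
```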
There is a course on coursera, Design and Analysis of Algorithms, that has just started this week - one of the first lectures gives a high-level description of asymptotic notation, and the last question of week one's problem set made me think about running times and what beats what. It wouldn't take long to cast your eye over these.
Hope this helps a little!

- I took [the Design and Analysis of Algorithms Part 1](https://www.coursera.org/course/algo) course from coursera the first time it was run last year (2012) and recommend it 100%. – rishimaharaj Jul 05 '13 at 04:57
- Asymptotic notation is not only used to characterise running time, but also other quantities, such as amount of memory (storage). – Bart van Ingen Schenau Jul 05 '13 at 06:56
- The staircase analogy for `O(log n)` is that, on each step, you take half the remaining stairs of the staircase. So, if there are 8 stairs, you take 4 of them in your first step, 2 in the second, etc. – Bart van Ingen Schenau Jul 05 '13 at 07:00
- A common big-Oh bound you missed is **O(1)**: constant time - No matter how many stairs your staircase has, you jump up it in one big leap. – Bart van Ingen Schenau Jul 05 '13 at 07:02
- Another small point is that you can look at the complexity in terms of space as well as time – jk. Jul 05 '13 at 08:30