
I was just reading answers to the question Plain English explanation of Big O. From that I came to understand that Big-O notation is just an "upper bound" on the complexity of an algorithm. Is that correct?

But can we apply it to the other cases (i.e. the best and average cases) of an algorithm?

user3902660
    Best case is omega. There is no such thing as average case. – Ordous Jan 08 '15 at 20:42
    possible duplicate of [Please explain the statement that the function an+b belongs to O(n^2) and Θ(n)?](http://programmers.stackexchange.com/questions/158569/please-explain-the-statement-that-the-function-anb-belongs-to-on2-and-and-%ce%98) – gnat Jan 08 '15 at 20:52
  • @gnat I don't think so. The linked question is about functions (which are similar to, yet different from, algorithms), and explores different metrics on them (omega, theta and oh), while this question asks why the worst case is chosen for algorithms (because it's the most informative; a close second is best case). – Ordous Jan 08 '15 at 20:58
  • Useful: http://www.programmerinterview.com/index.php/data-structures/big-o-versus-big-omega-notations/ – Bobson Jan 08 '15 at 21:44
  • No. You will hear statements like "the worst case performance of the algorithm is O(xxx)", but you can also make statements like "the average performance is O(yyy)" and "the best case performance is O(zzz)". – gnasher729 Apr 20 '16 at 07:51

3 Answers


Yes, you can apply it to other cases.

Big-Oh basically says "no matter how you stack the deck against this algorithm, at worst its performance will scale this way compared to the input."

Omega is similar, but means "no matter how good you make its inputs, at best its performance cannot scale any better than this compared to the input."

For example, quicksort is a popular sorting algorithm. It is actually O(n²) because at worst it has quadratic performance (bad pivot choices). At best it is Ω(n log₂ n), which is the best that any general-purpose sorting algorithm can possibly achieve.
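As a rough illustration (my own sketch, not part of the original answer), here is a toy quicksort that counts comparisons. It deliberately uses a naive first-element pivot, so the quadratic worst case is easy to trigger with already-sorted input:

```python
def quicksort(xs):
    """Sort xs, returning (sorted_list, number_of_comparisons)."""
    if len(xs) <= 1:
        return list(xs), 0
    pivot, rest = xs[0], xs[1:]              # naive first-element pivot
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    ls, lc = quicksort(left)
    rs, rc = quicksort(right)
    # len(rest) comparisons to partition, plus the recursive counts
    return ls + [pivot] + rs, len(rest) + lc + rc

# Already-sorted input is the worst case for this pivot choice:
_, worst = quicksort(list(range(100)))               # 99 + 98 + ... + 1 = 4950
# A scrambled permutation of the same values does much better:
_, typical = quicksort([37 * i % 100 for i in range(100)])
print(worst, typical)
```

On the sorted input every partition is maximally lopsided, giving the quadratic comparison count; the scrambled input stays close to the n log n behavior.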

  • General-purpose comparison-based algorithm. Things may be different if you don't have to do comparisons to sort. – Vatine Apr 20 '16 at 15:54

Big-Oh describes the growth rate of a function by comparing it to the growth rate of another function. What that function *means* is totally irrelevant to Big-Oh. It could be a function describing:

  • the worst-case time complexity of an algorithm,
  • the best-case time complexity of an algorithm,
  • the average-case time complexity of an algorithm,
  • the amortized worst-case time complexity of an algorithm,
  • the worst-case step complexity of an algorithm,
  • the worst-case space complexity of an algorithm,
  • the number of humans in the world, or
  • the amount of money a movie makes in relation to its production cost.
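To make that concrete (my own illustration, not part of the original answer): take linear search. Its best-case cost and its worst-case cost are two *different* functions of the input size, and Big-Oh can describe either one:

```python
def linear_search(xs, target):
    """Return (index, comparisons) for target in xs, or (-1, comparisons) if absent."""
    comparisons = 0
    for i, x in enumerate(xs):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

n = 1000
xs = list(range(n))
_, best = linear_search(xs, 0)     # best case: target is first, 1 comparison -> O(1)
_, worst = linear_search(xs, -1)   # worst case: target absent, n comparisons -> O(n)
print(best, worst)
```

Same algorithm, two functions of n; which one you hand to Big-Oh is your choice.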

Jörg W Mittag
  • Completely agree; describing asymptotic running time and categorizing running time are two different things. – CodeYogi Apr 20 '16 at 04:48

Sure. The three main notations are:

"Big-Oh", meaning your function is always <= c*f(x), for some constant c and all values greater than some x0.

"Big-Omega", meaning your function is always >= c*f(x), for some constant c and all values greater than some x0.

"Big-Theta", which is used to describe a tight bound, meaning the function is both Big-Oh and Big-Omega of f(x). To show Big-Theta, you show that the same f(x) works as both a Big-Oh and a Big-Omega bound.

And then, on top of all this, you will sometimes hear about the "worst case of the best case", the "best case of the worst case", etc.

And to the person above who talks about quicksort: what about smoothsort?

JBurk94