2

Given for example this code:

for (int i = 1; i < n; i++)
    for (int j = 1; j < sqrt(i); j++)
        foo(); // foo takes constant time

can someone explain to me how to calculate the computational complexity ("Big O") of this kind of program?

idan di
  • No. I see people like to downvote for nothing. I explained that I read articles about it, and for this specific problem I need help. – idan di Apr 17 '16 at 13:17
  • While I agree gnat is completely off-base here, I can't understand the question well enough to answer it myself. "runs until sqrt(1)" sounds like "runs exactly once" to me, and I can't tell what "runs until srt(n)" would mean other than running the sqrt() function n times, which is clearly not what you're after. Also, you only appear to be asking for a summation formula, which by itself has nothing to do with big-O and is probably a better fit for Math.SE. It might help if you posted actual pseudocode of the algorithm you're trying to calculate a big-O for. – Ixrec Apr 17 '16 at 13:31
  • I'm voting to close this question as off topic because it's not about *software development concepts*. Please look at the [help/on-topic] to understand what kinds of questions are acceptable here. While you encountered this problem while trying to calculate Big-O (which is on topic here), your problem is purely mathematical. It may be better to ask the question about the sum of square roots on [math.se] instead, but make sure to show what you've tried. – amon Apr 17 '16 at 13:33
  • @Ixrec, I added a simple code example. – idan di Apr 17 '16 at 13:42
  • @idandi I see, so what you're actually after is a summation formula for `floor(sqrt(1)) + floor(sqrt(2)) + ... + floor(sqrt(n))`. That's definitely a better fit for Math.SE. If you make a post over there, be sure to delete this one. – Ixrec Apr 17 '16 at 13:46
  • http://mathforum.org/library/drmath/view/65309.html – axel22 Jan 28 '17 at 01:25

2 Answers

16

This is basically a math question; it might be a better fit for the Computer Science Stack Exchange or even Stack Overflow. But I guess you can expect programmers to know it too, and Programmers is not only for "soft" questions, so maybe it is well placed here.

Anyways.

Based on the answers so far, I believe that Phillip Kendall has misread your question. He thinks you are asking "what is the computational complexity of evaluating this summation", whereas you are actually asking "what is the growth rate of this summation".

There are several techniques for estimating the growth rate of a series. A quite general one is the integral method, which basically turns it into a calculus problem. The idea is that summing a series whose terms come from some analytic function, like f(n) = n or f(n) = n^{0.5}, is the same as integrating a "step" version of that function over the (continuous) range from 1 to n. Because the step function is sandwiched above and below by (shifted copies of) the smooth function, and the smooth function is easy to integrate (or at least, when it is easy to integrate), this gives a clean answer.

For instance, you know that the indefinite integral of x dx is 1/2 x^2, so the sum 1 + 2 + ... + n is O(n^2).

In your case, you care about the sum of square roots up to sqrt(n). The indefinite integral of sqrt(x) dx is 2/3 x^{3/2}, so the sum 1 + sqrt(2) + ... + sqrt(n) is O(n^{3/2}).
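To see that bound numerically, here is a small check (just a rough sketch in C++; the particular values of n are arbitrary) comparing the sum sqrt(1) + sqrt(2) + ... + sqrt(n) against (2/3) n^{3/2}:

    #include <cmath>
    #include <cstdio>

    int main() {
        // Compare sqrt(1) + sqrt(2) + ... + sqrt(n) with (2/3) * n^(3/2)
        for (long n = 1000; n <= 1000000; n *= 10) {
            double sum = 0.0;
            for (long i = 1; i <= n; i++)
                sum += std::sqrt((double)i);
            double integral = (2.0 / 3.0) * std::pow((double)n, 1.5);
            std::printf("n = %8ld   sum = %16.1f   (2/3)n^1.5 = %16.1f   ratio = %.4f\n",
                        n, sum, integral, sum / integral);
        }
        return 0;
    }

The ratio tends to 1 as n grows, which is exactly the n^{3/2} behaviour the integral predicts.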

However, many people have forgotten their calculus. In many cases you can get away with simple heuristics to estimate these things.

For instance, in the sum 1 + sqrt(2) + ... + sqrt(n), you know that sqrt(n) is the largest number in the sum, and there are n numbers in the sum. So you can easily see that the sum can't exceed n * sqrt(n).

Now, you'd like to see that the sum can't be much less. That's harder to see because the entries at the beginning are small. However, what is the value of a typical entry? Let's consider all the numbers in the second half. The last of those is sqrt(n), and the middle entry is sqrt(n/2). But sqrt(n/2) can also be rewritten as sqrt(n) / sqrt(2). In other words, all the entries in the second half of the sequence are within a factor of sqrt(2) of each other, so as far as Big O is concerned they are "the same".

Concretely, all numbers in the second half of the sum are at least sqrt(n) / sqrt(2), and there are n/2 of those numbers. So the sum is at least n * sqrt(n) / (2 sqrt(2)), which also grows like n^{3/2}.

Thus we know, without calculus, that the growth rate is within a constant factor of n^{3/2}.
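If you want to double-check the sandwich argument itself, a quick experiment along these lines (again just a sketch; the values of n are arbitrary) shows the sum sitting between n * sqrt(n) / (2 sqrt(2)) and n * sqrt(n):

    #include <cmath>
    #include <cstdio>

    int main() {
        // Check: n*sqrt(n)/(2*sqrt(2)) <= sqrt(1)+...+sqrt(n) <= n*sqrt(n)
        for (long n = 100; n <= 100000; n *= 10) {
            double sum = 0.0;
            for (long i = 1; i <= n; i++)
                sum += std::sqrt((double)i);
            double upper = n * std::sqrt((double)n);
            double lower = upper / (2.0 * std::sqrt(2.0));
            std::printf("n = %7ld   lower = %14.1f   sum = %14.1f   upper = %14.1f\n",
                        n, lower, sum, upper);
        }
        return 0;
    }

Both bounds are constant multiples of n^{3/2}, so the sum is squeezed into the n^{3/2} growth rate without any integration.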


I like this way of approaching the problem because it promotes good engineering thinking. When a problem is hard to solve directly, it's good to focus on the typical case. Often, if you make something that works well typically, that is good enough in practice. The same is true in science: sometimes the exact constant in some formula matters a lot and you need powerful tools like calculus to find it, but more commonly people only care about the big picture, and calculus may obscure that with its focus on closed-form solutions and exact constants. Many times, if you can understand the typical case, that gives you enough information to solve the overall problem.

Chris Beck
  • I wish I could upvote, many thanks, exactly what I needed. Thanks! – idan di Apr 17 '16 at 13:54
  • @idandi Since this is an answer to your question, if you think it answers your question well, you should be able to acknowledge the answer with the green tick even if you can't upvote the answer. – Lawrence Apr 17 '16 at 15:16
0

The complexity is quite obviously the sum of sqrt(i), for i = 1 to n. That kind of sum is usually quite hard to calculate in closed form. As an approximation, you don't calculate the sum but rather the integral of sqrt(x) over an interval of width n (since you were summing n values), and if you have no good reason to choose anything different, you would take the integral of sqrt(x) for x from 0.5 to n + 0.5.

Now you need to remember your maths: the integral of x^k is x^(k+1) / (k + 1), and the integral of sqrt(x) = x^(1/2) is x^(3/2) / 1.5, so the integral from 0.5 to n + 0.5 is about (n + 0.5)^(3/2) / 1.5, making it O(n^(3/2)).

(This isn't really a proof, but with some experience you know that you could prove it if you had to. And it gives not only the O(n^(3/2)) bound but also a decent estimate of how often foo() will be called: about n^1.5 / 1.5 times.)
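As a quick sanity check of that estimate, here is a small sketch (the loop is the one from the question with foo() replaced by a counter; the values of n are arbitrary):

    #include <cmath>
    #include <cstdio>

    int main() {
        // Count how often foo() would be called and compare with n^1.5 / 1.5
        for (long n = 1000; n <= 1000000; n *= 10) {
            long long calls = 0;
            for (long i = 1; i < n; i++)
                for (long j = 1; j < std::sqrt((double)i); j++)
                    calls++;                       // stands in for foo()
            double estimate = std::pow((double)n, 1.5) / 1.5;
            std::printf("n = %8ld   calls = %12lld   n^1.5/1.5 = %14.1f   ratio = %.3f\n",
                        n, calls, estimate, calls / estimate);
        }
        return 0;
    }

The ratio approaches 1 as n grows, so the integral really does give a usable estimate of the call count, not just the growth rate.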

gnasher729