
Often, when I'm doing an operation that compares every element of an array against every other element, I write code along these lines:

function operation (array) {
  for (let i = 0; i < array.length; i++) {
    for (let j = i + 1; j < array.length; j++) {
      // do something with array, i, and j
    }
  }
}

While this code (starting j at i + 1) clearly runs faster than a full double-nested loop, I'm struggling to figure out whether that's just a matter of a smaller coefficient or whether it should be represented by a different big O.

Is there a better big O for the above function than n squared?
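
For comparison, this is the "normal" double-nested loop I have in mind as the baseline (the function name is just for illustration); it visits every (i, j) pair, including i === j and both orderings:

function fullOperation (array) {
  for (let i = 0; i < array.length; i++) {
    for (let j = 0; j < array.length; j++) {
      // do something with array, i, and j -- runs n * n times in total
    }
  }
}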

Robbie Wxyz
  • @gnat I'm checking that out right now to see if it touches on this type of case – Robbie Wxyz Aug 07 '20 at 19:26
  • @JohnWu hmm, I'm familiar with O(n log n) in the context of sorting algorithms, I suspected that might be the complexity of this function... – Robbie Wxyz Aug 07 '20 at 19:28
  • 1
    It doesn't make sense to talk about algorithmic complexity without a cost model and a machine model. Or, in other words: it doesn't make sense to count things if you don't define what the things are that you are counting. – Jörg W Mittag Aug 07 '20 at 20:14
  • 2
    @JörgWMittag: you're putting the bar unnecessarily high. Whatever OP is doing is going to be *at least* proportional to the number of loop iterations. Possibly more, but it still makes sense to talk about that first. – Michael Borgwardt Aug 07 '20 at 20:33
  • @MichaelBorgwardt: Depending on the machine model, `array.length` might have a worst-case step complexity of Θ(#elements) pointer dereferences, for example. So, if we are counting pointer dereferences, and are in such a machine model, then the whole thing is *at least* O(n³). – Jörg W Mittag Aug 07 '20 at 21:01
  • 1
    @JohnWu No, it is still n^2. It only becomes n log n when you recurse on a fraction and not a subtraction. – Thorbjørn Ravn Andersen Aug 08 '20 at 14:41
  • 1
    You are showing nothing in the loop that depends on or takes advantage of being sorted. It is just ordinary loop nesting: n * n/2 ==> O(n^2). – Erik Eidt Aug 08 '20 at 15:20

2 Answers


It's still O(n^2), with a constant coefficient of 1/2.

The number of times the inner loop body is executed is (n-1) + (n-2) + (n-3) + ... + 2 + 1, where n is the array length. Note that the first and last terms add up to n, as do the second and second-to-last, and so on. So there are (n-1)/2 such pairs, each of which adds up to n, which makes a total of (n-1)/2 * n = n(n-1)/2, or roughly 1/2 * n^2.

This is the pairing idea famously used by an elementary-school-aged Carl Friedrich Gauss circa 1785 to outwit his teacher on a make-work assignment.
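
If you want to sanity-check the count, here's a small sketch (not from the original post; the helper name is mine) that tallies the inner-loop iterations and compares them with the closed form n(n-1)/2:

function countInnerIterations (n) {
  // Count how often the inner loop body runs for an array of length n
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = i + 1; j < n; j++) {
      count++;
    }
  }
  return count;
}

// The empirical count matches n * (n - 1) / 2 for every n,
// i.e. roughly half the n * n iterations of a full double loop.
for (const n of [10, 100, 1000]) {
  console.log(n, countInnerIterations(n), n * (n - 1) / 2);
}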

Michael Borgwardt

Of course, if you know that the data is sorted, you would most certainly use a more intelligent algorithm, such as binary search.

If you brute-force loop through the array without any consideration of its known properties, then of course you richly deserve your fate.

Mike Robinson