2

I am highly confused about calculating the time complexity of the merge sort algorithm, specifically the dividing step. In the dividing step we have to calculate the midpoint of "n", i.e. "q = n/2".

How can this take constant time? Even if we assume that a divide is an elementary operation that takes very little time, we should still include the time taken by the divides.

If we have to divide a very large number of times, that must be considered when calculating the time complexity. As I see it, the number of divides depends on "n": how many midpoints we calculate depends on the array size "n".

For example,

if n = 2, number of divides required = 1

if n = 4, number of divides required = 3

if n = 8, number of divides required = 7

if n = 16, number of divides required = 15, and so on....

According to me, defined recursively:

Let D(n) = number of divides needed for an input array of length n, with D(0) = D(1) = 0. Then D(n) = 1 + D(⌈n/2⌉) + D(⌊n/2⌋) for n ≥ 2, which gives D(n) = n - 1.

Or, in closed form when n = 2^k:

D(n) = 2^0 + 2^1 + ... + 2^(k-1) = 2^k - 1 = n - 1

So it depends on "n". So how can we take it as constant time????
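The counting pattern above can be checked with a short sketch (my own illustration, not from any textbook) that counts only the midpoint calculations and ignores the merge work entirely:

```python
# Count how many midpoint calculations ("divides") merge sort performs
# on an array of length n, ignoring the merge step completely.
def count_divides(n):
    # Arrays of length 0 or 1 need no split.
    if n <= 1:
        return 0
    mid = n // 2  # one divide step: a single O(1) integer division
    # One divide here, plus the divides done in each half.
    return 1 + count_divides(mid) + count_divides(n - mid)

for n in [2, 4, 8, 16]:
    print(n, count_divides(n))  # matches the 1, 3, 7, 15 pattern above
```

This confirms that the *total* number of divides is n - 1; each *individual* divide is still a single constant-time operation.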

Robert Harvey
Bhaskar
    Possible duplicate of [What is O(...) and how do I calculate it?](http://programmers.stackexchange.com/questions/132331/what-is-o-and-how-do-i-calculate-it) – gnat Sep 26 '16 at 13:00
  • @Ordous Then why do we write the complexity of the dividing step as O(1)? – Bhaskar Sep 26 '16 at 13:05
  • Right, sorry, I confused it with quicksort. It is constant. We would need a number of divide steps, however at each recursion level, we would need much more merge time than divide time (logn vs n), so it's discarded from final analysis as a minor term. – Ordous Sep 26 '16 at 13:12
  • We do this in the final calculation, but in every book, even in CLRS, it is specifically written that the dividing step takes constant time. – Bhaskar Sep 26 '16 at 13:16
  • Integer division is usually treated as a constant-time operation for complexity calculations. That is, n/2 is one operation. In practice, division on modern CPUs doesn't vary much with input length, at least up to the largest machine-usable word / multi-word operation. – Kristian H Oct 17 '16 at 09:30

4 Answers

2

If we assume that divide is an elementary operation and takes very little time, then we should still include the time taken by divide

Big O doesn't actually measure how long a single operation takes. Rather, it measures how an algorithm's total running time scales as n increases.

Consider this graph:

[Graph: total computation time versus n for O(1), O(n²), and O(n³) algorithms, with the curves crossing at small n]

The horizontal axis represents n. The vertical axis represents the total computation time. Notice that, for small values of n, the O(n²) and O(n³) algorithms can actually perform better than their O(1) counterpart. That's because, in this hypothetical example, the single operation in the O(1) algorithm is more expensive than the single operations in the O(n²) and O(n³) algorithms.

I would consider "the dividing step takes constant time" to be rather obvious, but the reason it's constant time is that it is a single calculation.
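A toy model makes the crossover in the graph concrete (the constants here are invented purely for illustration): an "O(1)" algorithm whose single operation is expensive can lose to an O(n²) algorithm for small n.

```python
# Invented cost models: one expensive fixed-cost operation versus
# n^2 cheap unit-cost operations.
def cost_constant(n):
    return 1000   # one expensive operation, independent of n

def cost_quadratic(n):
    return n * n  # n^2 cheap operations

for n in (10, 31, 32, 100):
    winner = "O(1)" if cost_constant(n) < cost_quadratic(n) else "O(n^2)"
    print(n, winner)
```

With these made-up constants the quadratic algorithm is cheaper up to n = 31, and the constant-time one wins from n = 32 onward; Big O only describes what happens as n keeps growing.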

Robert Harvey
1

I'm glad you took the time to try to understand this apparent discrepancy. You're correct that we do need to account for the divisions, and that for this algorithm, we have ~n divisions. In general, if we are dividing reasonably small numbers (those that fit in a long), we count that individual division operation as effectively O(1), so we have O(n) divisions.

What you're missing is that it doesn't really matter, because we already have an O(n) operation in the mix: the merge. In this particular case, that's why adding the divisions in doesn't change the overall complexity from O(n log n). Accounting for the divisions only adds a lower-order O(n) term, giving O(n log n + n), and lower-order terms get factored out.

Karl Bielefeldt
  • I am not talking about the whole merge sort algorithm, only the split part of it. In every book, even in CLRS, it is written that the dividing step takes constant time, i.e. O(1). In my opinion it should be O(log n). – Bhaskar Sep 26 '16 at 18:21
  • 1
    A single division step is done in constant time, but in aggregate for the entire algorithm, it's clearly `O(n)`. I'd have to look at the exact wording in the book, because that distinction is easy to mistake. – Karl Bielefeldt Sep 26 '16 at 18:31
  • 1
    The exact line is this " The divide step just computes the middle of the subarray, which takes constant time. Thus, D(n) = O(1)." – Bhaskar Sep 26 '16 at 20:01
0

At this point, all the author is saying is that a single divide step costs O(1). The author is not making a claim about doing 3 divide steps, or 1,000,000 divide steps.

Rufflewind
0

The number of comparisons is what drives the time complexity of most sorting algorithms. In a divide-and-conquer algorithm like merge sort, the total number of divides is at most n - 1, which is smaller than n·log(n), so it is negligible.
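A quick numeric check (my own illustration, not from the answer) shows how the n - 1 divides compare to the n·log₂(n) comparison bound as n grows:

```python
import math

# Total divides (n - 1) versus the n*log2(n) comparison bound.
for n in [16, 1024, 1_000_000]:
    print(n, n - 1, round(n * math.log2(n)))
```

The gap widens with n, which is exactly why the divide count is dropped as a lower-order term.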