The short answer is that this is just the definition of an asymptotic bound. We can try a more intuitive construction, though:
Let's say we think our algorithm A is linear, so the order of growth is just Θ(n). We don't know the coefficient: maybe it actually requires 2.7 "resources" per additional input element on average, so R(n) ≃ 2.7n, but this exact value is hard to determine.
Now, even if we can't easily determine this value of 2.7, we can perhaps determine that it can't be less than 2 and can't be greater than 5. So long as we can demonstrate that 2n ≤ R(n) ≤ 5n always holds (at least for sufficiently large n), we can be happy that the algorithm is indeed linear, without needing to know exactly what the coefficient is.
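
To make that concrete, here's a minimal sketch in Python. The cost function R(n) = 2.7n and the bounds 2 and 5 are made-up numbers for illustration, not something you'd obtain this way in practice:

```python
def R(n):
    # Hypothetical measured cost of algorithm A on input size n.
    return 2.7 * n

def within_theta_n_bounds(n, k1=2, k2=5):
    # Check the defining inequality k1*n <= R(n) <= k2*n at this n.
    return k1 * n <= R(n) <= k2 * n

# The inequality holds at every size we try, consistent with R(n) = Θ(n).
for n in [10, 1_000, 1_000_000]:
    print(n, within_theta_n_bounds(n))   # True, True, True
```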
Conversely, if we can't find any positive lower constant k1 such that k1·n ≤ R(n) holds for sufficiently large n, our algorithm must actually be sub-linear. For example, there's no positive constant we could choose that would work if R(n) is really logarithmic: (log n)/n → 0, so any line k1·n eventually overtakes it. Similarly, if R(n) is really quadratic or exponential, no upper bound we choose will stop it shooting out of the top of our inequality.
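
A quick numerical illustration of both failure modes, again with made-up cost functions: a logarithmic R(n) eventually violates any lower line k1·n, and a quadratic R(n) eventually violates any upper line k2·n, no matter how generously we pick the constants:

```python
import math

# A hypothetical logarithmic cost: no positive k1 keeps k1*n <= R(n) forever.
def R_log(n):
    return 100 * math.log(n)   # big coefficient, still sub-linear

# A hypothetical quadratic cost: no k2 keeps R(n) <= k2*n forever.
def R_quad(n):
    return 0.001 * n * n       # tiny coefficient, still super-linear

k1, k2 = 0.01, 1000  # deliberately generous guesses at the constants

for n in [10, 10_000, 10_000_000]:
    print(n,
          k1 * n <= R_log(n),    # becomes False once n is large enough
          R_quad(n) <= k2 * n)   # becomes False once n is large enough
```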
Visually, R(n) is the usual straight line, and choosing some upper and lower constant bounds gives you two more straight lines - one steeper and one shallower. If your R(n) stays between them as n gets arbitrarily large, it is asymptotically the same "shape" as the linear function. If it's a different "shape", it will eventually fall out of the bottom or pop out of the top. The same works for a logarithmic shape, or an exponential shape.
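
If you want to actually see this, a short matplotlib sketch (all coefficients invented for illustration) draws the corridor between 2n and 5n, a linear R(n) that stays inside it, and a logarithmic and a quadratic curve that escape out of the bottom and the top respectively:

```python
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(1, 100, 200)

plt.plot(n, 2.7 * n, label="R(n) ≃ 2.7n (stays in the corridor)")
plt.plot(n, 2 * n, "--", label="2n (lower bound)")
plt.plot(n, 5 * n, "--", label="5n (upper bound)")
plt.plot(n, 20 * np.log(n), ":", label="20·log n (falls out the bottom)")
plt.plot(n, 0.2 * n**2, ":", label="0.2·n² (pops out the top)")
plt.xlabel("n")
plt.ylabel("resources")
plt.legend()
plt.show()
```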