17

Given any arbitrarily double-recursive function, how would one calculate its run time?

For Example (in pseudocode):

int a(int x){
  if (x <= 0)
    return 1010;
  else
    return b(x-1) + a(x-1);
}
int b(int y){
  if (y <= -5)
    return -2;
  else
    return b(a(y-1));
}

Or something along those lines.

What methods could or should one use to determine something like this?

  • 2
    Is this homework? – Bernard Jul 05 '11 at 19:16
  • 6
    Nope it's summer time and I like to learn. I figure get ahead instead of letting my brain turn to mush. – if_zero_equals_one Jul 05 '11 at 19:18
  • Hi if_zero_equals_one, would it be right to assume your example is meant to be pseudocode because you're trying to work through the general concept behind this type of problem? –  Jul 05 '11 at 19:31
  • @Mark Trapp yep that's it exactly actually! – if_zero_equals_one Jul 05 '11 at 19:32
  • 13
    Okay, got it. To those voting to migrate this to Stack Overflow: this is on-topic here, and off-topic on Stack Overflow. Programmers.SE is for conceptual, whiteboard-y questions; Stack Overflow is for implementation, issue-while-I'm-coding questions. –  Jul 05 '11 at 19:35
  • 4
    Thanks that's the reason I did it here in the first place. Besides it's better to know how to fish than receive a fish. – if_zero_equals_one Jul 05 '11 at 19:36
  • 1
    In this particular case it is still generally an infinite recursion because b(a(0)) invokes infinitely many other b(a(0)) terms. It would have been different if it was a math formula. Had your setup been different, it would have worked out differently. Just as in math, in cs some problems have a solution, some do not, some have an easy one, some do not. There are many mutually-recursive cases where the solution does exist. Sometimes, in order to not blow a stack one would have to use a trampoline pattern. – Job Jul 05 '11 at 20:49
  • Note that in math `f(x) = 0.5*f(x) + 50x for all x` implies that `f(x) = 100x for all x`. In programming you do not cancel out the terms like that. You are trying to compute stuff, and existential qualifiers do not apply to many languages, other than few special-purpose ones. – Job Jul 05 '11 at 20:53
  • @MarkTrapp [Issue raised on Meta](http://meta.programmers.stackexchange.com/questions/1890/why-are-computer-science-questions-sometimes-considered-on-topic) – Gilles 'SO- stop being evil' Jul 05 '11 at 21:49
  • The functions you have written as examples terminate for very specific values of the arguments and for those specific values it is very easy to see what's happening. –  Jul 06 '11 at 03:19

7 Answers

13

You keep changing your function, but you keep picking ones that will run forever without converging.

Recursion gets complicated, fast. The first step to analyzing a proposed doubly recursive function is to try to trace it through on some sample values, to see what it does. If your calculation gets into an infinite loop, the function is not well defined. If your calculation gets into a spiral that keeps on getting bigger numbers (which happens very easily), it is probably not well defined.

If tracing it through gives an answer, you then try to come up with some pattern or recurrence relation between the answers. Once you have that, you can try to figure out its runtime. Figuring it out can be very, very complicated, but we have results such as the Master theorem that let us figure out the answer in many cases.
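For instance, a minimal sketch (my own illustration, with hypothetical names): if both recursive calls in a function shaped like the question's a cost the same, the call count satisfies the recurrence T(n) = 2T(n-1) + 1 with T(0) = 1, which unrolls to T(n) = 2^(n+1) - 1, i.e. exponential time:

```python
def calls(n):
    """Total number of calls a doubly recursive f(n) makes, itself included."""
    if n <= 0:
        return 1
    # one call for this frame plus two full sub-recursions
    return 1 + calls(n - 1) + calls(n - 1)

# Check the conjectured closed form T(n) = 2**(n+1) - 1 on small inputs.
for n in range(8):
    assert calls(n) == 2 ** (n + 1) - 1
```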

Beware that even with single recursion, it is easy to come up with functions whose runtime we do not know how to calculate. For instance consider the following:

def recursive(n):
    # Collatz-style recursion: count the halvings on the way down to 1.
    if n % 2 == 0:
        return 1 + recursive(n // 2)
    elif n == 1:
        return 0
    else:
        return recursive(3*n + 1)

It is currently unknown whether this function is always well-defined, let alone what its runtime is.
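The function above can be run as ordinary Python for small inputs. This variant (my sketch, renamed steps) counts every step rather than only the halvings, which makes the "trace it on sample values" advice concrete; whether it terminates for every n ≥ 1 is exactly the open Collatz (3n + 1) conjecture:

```python
def steps(n):
    """Total number of Collatz steps taken to reach 1 from n (n >= 1)."""
    if n == 1:
        return 0
    elif n % 2 == 0:
        return 1 + steps(n // 2)
    else:
        return 1 + steps(3 * n + 1)

# Tabulate a few values; no closed form for these counts is known.
for n in range(1, 10):
    print(n, steps(n))
```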

btilly
  • 18,250
  • 1
  • 49
  • 75
6

The runtime of that particular pair of functions is infinite because neither returns without calling the other. The return value of a is always dependent on the return value of a call to b which always calls a... and that's what's known as infinite recursion.

jimreed
  • 570
  • 3
  • 7
  • Not looking for the particular functions right here. I'm looking for a general way to find the run time of recursive functions that call each other. – if_zero_equals_one Jul 05 '11 at 19:45
  • 1
    I'm not sure there is a solution in the general case. In order for Big-O to make sense, you have to know if the algorithm will ever halt. There are some recursive algorithms where you have to run the calculation before you know how long it will take (e.g. determining if a point belongs to the Mandlebrot set or not). – jimreed Jul 05 '11 at 20:55
  • Not always, `a` only calls `b` if the number passed in is >= 0. But yes, there is an infinite loop. – btilly Jul 05 '11 at 20:58
  • 1
    @btilly the example was changed after I posted my answer. – jimreed Jul 05 '11 at 20:59
  • 1
    @jimreed: And it has been changed again. I'd delete my comment if I could. – btilly Jul 05 '11 at 21:47
  • @btilly I'm thinking about deleting my answer because of the changes in the question. – jimreed Jul 06 '11 at 12:33
5

The obvious method is to run the function and measure how long it takes. This only tells you how long it takes on a particular input, though. And if you don't know beforehand that the function terminates, tough: there's no mechanical way to figure out whether the function terminates — that's the halting problem, and it's undecidable.

Finding the run time of a function is similarly undecidable, by Rice's theorem. In fact, Rice's theorem shows that even deciding whether a function runs in O(f(n)) time is undecidable.

So the best you can do in general is use your human intelligence (which, as far as we know, is not bound by the limits of Turing machines) and try to recognize a pattern, or invent one. A typical way to analyse the run time of a function is to turn the function's recursive definition into a recursive equation on its run time (or a set of equations for mutually recursive functions):

T_a(x) = if x ≤ 0 then 1 else T_b(x-1) + T_a(x-1)
T_b(x) = if x ≤ -5 then 1 else T_a(x-1) + T_b(a(x-1))

(Note that the recursive term in T_b takes the value a(x-1) as its argument, not the runtime T_a(x-1): how long b runs depends on what a returns.)

What next? You now have a math problem: you need to solve these functional equations. An approach that often works is to turn these equations on integer functions into equations on analytic functions and use calculus to solve these, interpreting the functions T_a and T_b as generating functions.

On generating functions and other discrete mathematics topics, I recommend the book Concrete Mathematics, by Ronald Graham, Donald Knuth and Oren Patashnik.

2

As others have pointed out, analyzing recursion can get very hard very fast. Here are further examples: https://rosettacode.org/wiki/Mutual_recursion and the Hofstadter Female and Male sequences (https://en.wikipedia.org/wiki/Hofstadter_sequence#Hofstadter_Female_and_Male_sequences). It is hard to compute both an answer and a running time for these, because the mutually recursive functions have a "difficult form".

Anyhow, let's look at this easy example:

http://pramode.net/clojure/2010/05/08/clojure-trampoline/

(declare funa funb)
(defn funa [n]
  (if (= n 0)
    0
    (funb (dec n))))
(defn funb [n]
  (if (= n 0)
    0
    (funa (dec n))))

Let's start by trying to compute funa(m), m > 0:

funa(m) = funb(m - 1) = funa(m - 2) = ... funa(0) or funb(0) = 0 either way.

The run-time is:

R(funa(m)) = 1 + R(funb(m - 1)) = 2 + R(funa(m - 2)) = ... m + R(funa(0)) or m + R(funb(0)) = m + 1 steps either way

Now let's pick another, slightly more complicated example:

Inspired by this page, which is a good read by itself, let's look at: "Fibonacci numbers can be interpreted via mutual recursion: F(0) = 1 and G(0) = 1, with F(n + 1) = F(n) + G(n) and G(n + 1) = F(n)."

So, what is the runtime of F? We will go the other way.
Well, R(F(0)) = 1 = F(0); R(G(0)) = 1 = G(0)
Now R(F(1)) = R(F(0)) + R(G(0)) = F(0) + G(0) = F(1)
...
It is not hard to see that R(F(m)) = F(m), i.e. the number of function calls needed to compute the Fibonacci number at index m equals the value of the Fibonacci number at index m. This assumes that adding two numbers is much faster than a function call. If that were not the case, we would instead have R(F(1)) = R(F(0)) + 1 + R(G(0)), and the analysis would be more complicated, possibly without an easy closed-form solution.

The closed form for the Fibonacci sequence is not necessarily easy to reinvent, not to mention some more complicated examples.

Glorfindel
  • 3,137
  • 6
  • 25
  • 33
Job
  • 6,459
  • 3
  • 32
  • 54
1

The first thing to do is to show that the functions you have defined terminate, and for exactly which values they do. In the example you have defined

int a(int x){
  if (x <= 0)
    return 1010;
  else
    return b(x-1) + a(x-1);
}
int b(int y){
  if (y <= -5)
    return -2;
  else
    return b(a(y-1));
}

b only terminates for y ≤ -5, because plugging in any other value produces a term of the form b(a(y-1)). Expand a little further and you'll see that such a term eventually leads to b(1010), which leads to b(a(1009)), which leads back to b(1010) again. Similarly, a only terminates for x ≤ 0: for any x > 0 it calls b(x-1) with an argument of at least 0, and we just saw that b never terminates there. Plug in anything else and you end up in an infinite loop in which the value to be computed depends on itself. So on the inputs where these functions terminate at all, they return immediately, and the example essentially has constant run time.
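A quick experiment (a sketch in Python, since the question is pseudocode) confirms the analysis: the base cases return immediately, and any x > 0 only stops when the interpreter's stack limit is hit:

```python
import sys

def a(x):
    return 1010 if x <= 0 else b(x - 1) + a(x - 1)

def b(y):
    return -2 if y <= -5 else b(a(y - 1))

assert a(0) == 1010 and a(-3) == 1010   # immediate return, constant time
assert b(-5) == -2                      # immediate return, constant time

# Any x > 0 reaches b(0) = b(a(-1)) = b(1010) = b(a(1009)) = ... forever,
# so the call only ends when Python's recursion limit is exceeded.
sys.setrecursionlimit(500)
try:
    a(1)
    assert False, "a(1) should not terminate"
except RecursionError:
    pass
```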

So the simple answer is that there is no general method to determine the run time of recursive functions because there is no general procedure that determines whether a recursively defined function terminates.

0

You create a table where you write down what calculations are done for a and b. Then you take the numbers from the table, form a hypothesis about how the number of calls depends on x and y, and prove it by induction. Your hypothesis may be that the recursion never ends.
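As a sketch with a hypothetical terminating example (my own, not from this answer): tabulate the call counts, eyeball the pattern, then prove the conjecture by induction:

```python
def f(n):
    # A hypothetical doubly recursive function that does terminate.
    if n <= 1:
        return 1
    return f(n - 1) + f(n - 2)

def count(n):
    # Number of calls f(n) makes, itself included.
    if n <= 1:
        return 1
    return 1 + count(n - 1) + count(n - 2)

# Tabulating count(n) gives 1, 1, 3, 5, 9, 15, 25, 41, ... which suggests
# the hypothesis count(n) = 2*f(n) - 1, then provable by induction.
for n in range(10):
    assert count(n) == 2 * f(n) - 1
```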

gnasher729
  • 42,090
  • 4
  • 59
  • 119
-6

Runtime as in Big-O?

That's easy: O(N) -- assuming that there is a termination condition.

Recursion is just looping, and a simple loop is O(N) no matter how many things you do in that loop (and calling another method is just another step in the loop).

Where it would get interesting is if you have a loop within one or more of the recursive methods. In that case you would end up with some sort of exponential performance (multiplying by O(N) on each pass through the method).

Anon
  • 29
  • 1
  • That's actually what I'm aiming at. I would like to know how to..."comprehend" (I guess that's the right wording) the run time for ANY given set. Say method a() had a run time of N! and each time it calls method b() which not only has a run time of N^3 but it also calls method a(). – if_zero_equals_one Jul 05 '11 at 19:52
  • 2
    You determine Big-O performance by taking the highest order of any called method and multiplying it by the order of the calling method. However, once you start talking about exponential and factorial performance, you can ignore polynomial performance. I *believe* that the same holds when comparing exponential and factorial: factorial wins. I've never had to analyze a system that was *both* exponential and factorial. – Anon Jul 05 '11 at 19:59
  • 6
    This is incorrect. The recursive forms of calculating the nth Fibonacci number and quicksort are `O(2^n)` and `O(n*log(n))`, respectively. – unpythonic Jul 05 '11 at 20:04
  • @Mark Mann - and that contradicts what I said how? – Anon Jul 05 '11 at 20:11
  • 1
    Without doing some fancy proof I would like to direct you to http://www.amazon.com/Introduction-Algorithms-Second-Thomas-Cormen/dp/0262032937 and try taking a look at this SE site http://cstheory.stackexchange.com/. – Bryan Harrington Jul 05 '11 at 20:33
  • 4
    Why did people vote up this horribly wrong answer? Calling a method takes time proportional to the time that method takes. In this case method `a` calls `b` and `b` calls `a` so you **cannot** simply assume that either method takes time `O(1)`. – btilly Jul 05 '11 at 20:34
  • 2
    @Anon - The poster was asking for an arbitrarily double-recursive function, not just the one shown above. I gave two examples of simple recursion which do not fit your explanation. It's trivial converting the old standards into a "double-recursive" form, one which was exponential (fitting your caveat) and one which is not (not covered). – unpythonic Jul 05 '11 at 20:42