Tail-recursion optimization in Oz


In the chapter about functions in the Oz tutorial, it says:

similar to lazy functional languages Oz allows certain forms of tail-recursion optimizations that are not found in certain strict functional languages including Standard ML, Scheme, and the concurrent functional language Erlang. However, standard function definitions in Oz are not lazy.

It then goes on to show the following function which is tail-recursive in Oz:

fun {Map Xs F}
   case Xs
   of nil then nil
   [] X|Xr then {F X}|{Map Xr F}
   end 
end 

What this does is map the empty list to the empty list, and a non-empty list to the result of applying the function F to its head, prepended to the result of calling Map on the tail. In other languages this would not be tail-recursive, because the last operation is the prepend (the cons), not the recursive call to Map.
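To make the problem concrete, here is the same naive shape sketched in Python (function name `map_rec` is mine, not from the source): the cons happens after the recursive call returns, so the call is not in tail position and every stack frame must stay alive until the whole list is built.

```python
def map_rec(f, xs):
    # Not tail-recursive: the list construction [f(xs[0])] + ... happens
    # AFTER the recursive call returns, so each frame must be kept alive
    # to perform its pending prepend.
    if not xs:
        return []
    return [f(xs[0])] + map_rec(f, xs[1:])
```

This is exactly the shape that naively recursive Scheme or Erlang code takes, and why it consumes stack proportional to the length of the input list.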

So my question is: If "standard function definitions in Oz are not lazy", what does Oz do that languages like Scheme or Erlang can't (or won't?) to be able to perform tail-recursion optimization for this function? And exactly when is a function tail-recursive in Oz?


Solution

  • This is called Tail Recursion Modulo Cons. Basically, prepending to the list directly after the recursive call is the same as appending to the list directly before the recursive call (and thus building the list as a "side-effect" of the purely functional "loop"). This is a generalization of tail recursion that works not just with cons lists but with any data constructor whose fields can be filled in constant time.
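    The transformation a TRMC compiler performs can be sketched in Python (a hand-written illustration, not generated compiler output; `map_trmc` and the two-slot cons cells are my own devices): allocate each cons cell before the "recursive call", leave its tail as a hole, and fill the hole on the next iteration. The loop plays the role of the tail call.

    ```python
    def map_trmc(f, xs):
        # TRMC-style: build each cons cell BEFORE recursing, leaving its
        # tail slot as a hole; the next iteration fills that hole.
        # A cell is modeled as a two-slot list: [value, next].
        head = hole = [None, None]   # sentinel cell; real list starts at head[1]
        for x in xs:                 # the loop stands in for the tail call
            cell = [f(x), None]
            hole[1] = cell           # fill the previous hole
            hole = cell              # the new cell's tail is the next hole
        # flatten the linked cells into a Python list for readability
        out, cell = [], head[1]
        while cell is not None:
            out.append(cell[0])
            cell = cell[1]
        return out
    ```

    Because the hole is filled before moving on, the function needs only constant stack space, while still producing the list in the original (non-reversed) order.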

    It was first described (but not named) as a LISP compilation technique in 1974 by Daniel P. Friedman and David S. Wise in Technical Report TR19: Unwinding Structured Recursions into Iterations and it was formally named and introduced by David H. D. Warren in 1980 in the context of writing the first-ever Prolog compiler.

    The interesting thing about Oz, though, is that TRMC is neither a language feature nor an explicit compiler optimization; it is simply a consequence of the language's execution semantics. Specifically, Oz is a declarative concurrent constraint language, which means that every variable is a dataflow variable (or "everything is a promise", including every storage location). Since everything is a promise, we can model returning from a function as first setting up the return value as a promise, and then later on fulfilling it.
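    As a rough illustration (not Oz's actual implementation; `DataflowVar` and `succ` are hypothetical names), a single-assignment dataflow variable can be sketched in Python as a one-shot cell that blocks readers until it is bound, and "returning" becomes binding a variable the caller allocated in advance:

    ```python
    from threading import Event

    class DataflowVar:
        """Minimal single-assignment (dataflow) variable: reading blocks
        until the variable is bound, loosely as in Oz."""
        def __init__(self):
            self._bound = Event()
            self._value = None

        def bind(self, value):
            assert not self._bound.is_set(), "single-assignment: already bound"
            self._value = value
            self._bound.set()

        def read(self):
            self._bound.wait()   # block until someone binds the variable
            return self._value

    def succ(n, out):
        # "Returning" in promise style: the caller allocated out,
        # and the callee fulfills it.
        out.bind(n + 1)
    ```

    With this model, the caller does not have to wait for the callee before continuing; anything that reads the variable simply blocks until the binding happens, which is what makes the technique compose with concurrency.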

    Peter Van Roy, co-author (with Seif Haridi) of the book Concepts, Techniques, and Models of Computer Programming, one of the designers of Oz, and one of its implementers, explains exactly how TRMC works in a comment thread on Lambda the Ultimate: Tail-recursive map and declarative agents:

    The above example of bad Scheme code turns into good tail-recursive Oz code when translated directly into Oz syntax. This gives:

    fun {Map F Xs}
       if Xs==nil then nil
       else {F Xs.1}|{Map F Xs.2} end
    end
    

    This is because Oz has single-assignment variables. To understand the execution, we translate this example into the Oz kernel language (I give just a partial translation for clarity):

    proc {Map F Xs Ys}
       if Xs==nil then Ys=nil
       else local Y Yr in
          Ys=Y|Yr
          {F Xs.1 Y}
          {Map F Xs.2 Yr}
       end end
    end
    

    That is, Map is tail-recursive because Yr is initially unbound. This is not just a clever trick; it is profound because it allows declarative concurrency and declarative multi-agent systems.
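    The kernel-language Map above can be transliterated into Python in destination-passing style, a sketch under the assumption that a one-slot list stands in for an unbound dataflow variable (Python itself does not eliminate tail calls, but the recursive call is now syntactically last, which is the point; `map_kernel` and `to_list` are my own names):

    ```python
    def map_kernel(f, xs, ys):
        # ys is a one-slot list standing in for the unbound variable Ys.
        if not xs:
            ys[0] = None                  # Ys = nil
        else:
            yr = [None]                   # local Yr, initially unbound
            ys[0] = (f(xs[0]), yr)        # Ys = Y|Yr, bound BEFORE recursing
            map_kernel(f, xs[1:], yr)     # tail call: nothing pending after it

    def to_list(ys):
        # Illustrative helper: flatten the (value, hole) cells into a list.
        out, node = [], ys[0]
        while node is not None:
            out.append(node[0])
            node = node[1][0]
        return out
    ```

    Note how the cons cell is fully set up before the recursive call, with Yr still unbound inside it; the recursion only fills in that hole. This is the same hole-filling pattern as TRMC, except that in Oz it falls out of single-assignment dataflow variables for free.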