Tags: scheme, lisp, multiplication, sicp, mit-scheme

SICP Ex. 1.17 - "fast-multiply" slower than "multiply"?


The function given as the introduction to this exercise illustrates multiplication defined in terms of addition. It is the simplest, "easy to write down" recursive definition:

(define (star a b)
  (if (= b 0)
      0
      (+ a (star a (- b 1)))))

The first thing I did when I saw that, following the previous exercises, was to write an iterative form that doesn't blow the stack:

(define (star a b)
  (star-iter a b 0))
(define (star-iter a counter sum)
  (if (= counter 0)
      sum
      (star-iter a (- counter 1) (+ a sum))))

Exercise 1.17 then encourages us to find an invariant so as to reduce the problem size, the idea being to get from O(n) to O(log n) steps. When that reducing step is carried out, nothing is done to update the result; all we do in that step is transform the problem definition, and that is what is meant by 'finding an invariant' (see the even? clause of fast-star below: nothing is added to the result in that step).
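
To make that concrete (the numbers are just my own example), the invariant is that a*b + sum never changes while the problem shrinks:

(a, b, sum) = (14, 5, 0)    ; 14*5  + 0 = 70
           -> ( 7, 10, 0)   ; 7*10  + 0 = 70   (a even: halve a, double b)
           -> ( 7,  9, 7)   ; 7*9   + 7 = 70   (a odd: add a to sum, decrement b)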

For the fast version, the question says we should use the functions halve and double and seems to imply these would be available as machine operations (constant time?). I've implemented the "fast" version just spoofing those functions as follows:

(define (fast-star a b)
  (cond ((or (= b 0) (= a 0)) 0)
        ((even? a)      (fast-star (/ a 2) (* 2 b)))
        (else      (+ a (fast-star a       (- b 1))))))

And the same thing in iterative form (i.e. O(1) space):

(note how the + a on line 4 above just moves into the accumulator, at the end of line 6 below, to get it into tail position)

(define (fast-star a b)
  (fast-star-iter a b 0))
(define (fast-star-iter a b sum)
  (cond ((or (= a 0) (= b 0)) sum)
        ((even? a) (fast-star-iter (/ a 2) (* 2 b) sum))
        (else      (fast-star-iter a       (- b 1) (+ a sum)))))

So this is kind of a "what's the point" question: these functions are slower than the first two given above. The first of the four functions blows the stack, so it's not useful, but the second doesn't, and that one is faster than either of the two "fast" versions in my testing.

Am I missing anything here? I'm curious whether there's a way to implement halve and double so they actually give the log(n) result being suggested here. There must be, otherwise there would have been no point to the question.

Note that the order of a and b matters a lot if they are different sizes - (star 2 100) takes 100 steps while (star 100 2) takes only 2. That would be something to add to the function later (a quick sketch follows below), but I'm curious about halve and double to start with.
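
(Something like the following is what I have in mind for that later tweak - the wrapper name is just made up - so that the smaller argument is always the one driving the loop:)

(define (star-ordered a b)
  (if (< b a)
      (star a b)    ; b is already the smaller value, so it stays the counter
      (star b a)))  ; otherwise swap, so the loop only runs a times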


Solution

  • There's a subtle bug in your code; that's why it's slow. This should fix it, for versions 3 and 4:

    (define (fast-star a b)
      (cond ((or (= b 0) (= a 0)) 0)
            ((even? b) (fast-star (* 2 a) (/ b 2.0)))
            (else (+ a (fast-star a (- b 1))))))
    
    (define (fast-star-iter a b sum)
      (cond ((or (= a 0) (= b 0)) sum)
            ((even? b) (fast-star-iter (* 2 a) (/ b 2.0) sum))
            (else (fast-star-iter a (- b 1) (+ a sum)))))
    

    The idea is to keep adding a and decreasing b at each iteration, but depending on the condition sometimes you were decreasing b and sometimes you were doubling it! Also notice that I'm dividing b by 2.0 to get rid of exact arithmetic, which is slower.

    Of course, you could do things the other way around: add b and decrease a. The important part is to be consistent about it: halve the problem on one parameter and double the other, and the one that was doubled is the one that gets added to the final result.
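
    If you do want literal halve and double helpers, a one-bit shift gives you the constant-time machine operation the exercise hints at. A minimal sketch, assuming MIT Scheme's arithmetic-shift is available (if not, (* n 2) and (quotient n 2) do the same job):

    (define (double n) (arithmetic-shift n 1))    ; shift left one bit: n * 2
    (define (halve n)  (arithmetic-shift n -1))   ; shift right one bit: floor(n / 2)

    (define (fast-star-iter a b sum)
      (cond ((or (= a 0) (= b 0)) sum)
            ((even? b) (fast-star-iter (double a) (halve b) sum))
            (else      (fast-star-iter a (- b 1) (+ a sum)))))

    With b as the counter that gets halved, this runs in O(log b) steps and constant space, which is the behaviour the exercise is after.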