Tags: z3, smt, z3py

Taylor expansion trigonometric functions in Z3-Python


I need to design a cosine (and sine) function in Z3, but this is hard and undecidable in general (e.g., see How can I use built-in trigonometric functions in Z3 Python?).

However, I am fine with approximate-precision methods. Thus, I designed the following function (for cos(a), where a is in radians) that uses a Taylor expansion:

import math
from z3 import *

def calculate_cosine(angle):
    solver = Solver()
    result = Real('result')

    # Note the Taylor expansion for cos (e.g., the next term is "+ angle**8/40320")
    # Also, note that 2! = 2, 4! = 24, 6! = 720, etc.
    solver.add(result == 1 - angle**2/2 + angle**4/24 - angle**6/720)

    if solver.check() == sat:
        model = solver.model()
        cosine = model[result]
        return cosine

    return None

angle = math.pi / 8  # Angle in radians
cosine = calculate_cosine(angle)
print(cosine)

However, this is just Python code, whereas I would like to perform queries like Exists y. 0 <= approx_cos(y) <= 1, for which I think I will have to build a Z3 function approx_cos (with the typical If... syntax), which would in this case invoke some NRA solver.

But I am not familiar with how to build such functions. Can you help me?

PS: Is there a built-in operation for factorial in Z3?


Solution

  • You can code up the Taylor-series expansion with a simple iterative loop, adding up the terms as you go. Below is an implementation that takes the number of terms you want to include as a parameter:

    import math
    from z3 import *
    
    # Approximate cosine using Taylor expansion.
    # n is the number of terms we want to include.
    # Essentially, we implement the formula
    #
    #     sum_{k=0}^{n-1} (-1)^k * x^(2k) / (2k)!
    #
    # in an iterative way, to reduce calculations.
    def taylor_cosine(x, n):
        if n < 2:
            raise Exception("taylor_cosine: n must be at least 2, got: %d" % n)
    
        s    = 1
        sign = 1
        term = 1
    
        # In each iteration:
        #   sign flips
        #   term gets multiplied by x^2 / (2k * (2k-1))
        for k in range(1, n):
            sign *= -1
            term *= x*x
            term /= 2*k*(2*k - 1)
            s    += sign * term
    
        return s
    

    Note that this function has nothing to do with z3; you can use it as is if you like:

    >>> taylor_cosine(math.pi, 5)
    -0.9760222126236076
    

    The more terms you include, the more accurate the result will be.

    But the good thing about this function is that we were careful to write it in a way that it can also be used symbolically. That is, the parameter x can be a z3 symbolic variable. (Note that n has to be a concrete constant: there's no easy/decidable way to support a symbolic n, but that's a whole other discussion.)
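
    For instance, passing a symbolic Real yields a symbolic polynomial over that variable. Here's a minimal illustration (the exact printed form of the expression depends on z3's own notation):

    x = Real('x')
    expr = taylor_cosine(x, 3)
    print(simplify(expr))   # a degree-4 polynomial in x: 1 - x^2/2 + x^4/24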

    As an example use, here's a little test program that tries to approximate the value of pi: we ask z3 for a value of x whose taylor_cosine is exactly -1, using successively more terms in the expansion:

    def test(numberOfTerms):
        s  = Solver()
        x  = Real('x')
        s.add(And(0 <= x, x <= 2*math.pi))
        cx = taylor_cosine(x, numberOfTerms)
        s.add(cx == -1)

        print("Testing with %d terms:" % numberOfTerms)

        r = s.check()
        if r == sat:
            mx = s.model()[x]
            if is_algebraic_value(mx):
                mx = mx.approx()
            vx = float(mx.as_fraction())
            print("  x                    = %s"  % vx)
            print("  taylor_cosine(x, %2d) = %s" % (numberOfTerms, taylor_cosine(vx, numberOfTerms)))
            print("  cosine(x)            = %s"  % math.cos(vx))
        else:
            print("  Solver said: %s" % r)
    

    We simply create a Real and ask the solver to make sure its cosine is -1. We're careful to handle both rationals and algebraic reals, as z3 can produce both kinds of results for queries over real values. (The former when the solution is rational, the latter when it is algebraic, i.e., given as the root of a polynomial.)
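
    If you need this conversion in more than one place, it can be factored into a small helper; here's a minimal sketch (the name model_value_to_float is just an illustrative choice; approx takes an optional precision argument, in decimal digits):

    # Sketch: turn a model value (rational or algebraic real) into a Python float.
    def model_value_to_float(v):
        if is_algebraic_value(v):
            # Algebraic reals must first be approximated by a rational.
            v = v.approx(20)
        return float(v.as_fraction())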

    Let's see this in action:

    for i in range(2, 10):
        test(i)
        print("====")
    

    This prints:

    Testing with 2 terms:
      x                    = 2.0
      taylor_cosine(x,  2) = -1.0
      cosine(x)            = -0.4161468365471424
    ====
    Testing with 3 terms:
      Solver said: unsat
    ====
    Testing with 4 terms:
      x                    = 2.751711543190595
      taylor_cosine(x,  4) = -1.000000000000076
      cosine(x)            = -0.9249542537694367
    ====
    Testing with 5 terms:
      Solver said: unsat
    ====
    Testing with 6 terms:
      x                    = 3.087083093614183
      taylor_cosine(x,  6) = -1.0000000000000144
      cosine(x)            = -0.9985147217565723
    ====
    Testing with 7 terms:
      Solver said: unsat
    ====
    Testing with 8 terms:
      x                    = 3.138726428355767
      taylor_cosine(x,  8) = -0.9999999999999999
      cosine(x)            = -0.999995892379266
    ====
    Testing with 9 terms:
      Solver said: unsat
    ====
    

    Observations:

    1. The solver can say unsat (the runs with 3, 5, 7, and 9 terms): for the number of terms included in a given run, z3 may be able to deduce that the expression produced by taylor_cosine can never be exactly -1. If you don't want this, assert your constraints with some wiggle-room, i.e., within a user-chosen epsilon; see the sketch at the end of this answer.

    2. The result will, in general, be more and more accurate as you increase the number of terms. We can observe this in a striking fashion: with 2 terms, the value we get back (2) is way off. With 8 terms, we get 3.139, which is not too bad. And we can see our approximation of pi getting better and better as the number of terms increases: 2, 2.75, 3.08, 3.139, etc.

    3. Needless to say, the more terms you include, the more complicated the problem will become for z3. If you only stick to real-valued constraints, then you'll always get an answer from z3 since it does have a decision procedure for non-linear real arithmetic. (Of course, this doesn't mean that it'll be "quick." It's just that it'll be able to decide given enough time/memory etc.) If you mix integer constraints in there, then all bets are off since you'll be in the semi-decidable fragment. My guess would be you'll get unknown more often than you'd like, but it all depends on what other constraints are around and what your actual application is.

    Anyhow, hope this gets you off the ground. The crucial thing to remember is that more terms buy you accuracy, at the cost of computational complexity. And since you're dealing with approximations, make sure your constraints also allow for approximate results; i.e., instead of asking for "exactly this value," ask for "around this value, with an epsilon of your choosing." Otherwise, you'd get unsat as demonstrated above.
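
    To make that concrete, here's a minimal sketch of such an epsilon-relaxed query (the name test_with_epsilon and the slack value 1/10 are illustrative choices, not taken from the code above):

    def test_with_epsilon(numberOfTerms, eps):
        s  = Solver()
        x  = Real('x')
        s.add(And(0 <= x, x <= 2*math.pi))
        cx = taylor_cosine(x, numberOfTerms)
        # Instead of requiring cx == -1 exactly, allow a band of width 2*eps around -1:
        s.add(And(cx >= -1 - eps, cx <= -1 + eps))
        if s.check() == sat:
            return s.model()[x]
        return None

    # With 5 terms the exact query was unsat above; a slack of 1/10 gives the
    # solver room to find a witness:
    print(test_with_epsilon(5, Q(1, 10)))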