Tags: python, function, math, taylor-series

Error in estimating pi by Taylor expansion


I am trying to calculate the value of pi, but there is a semantic error in my logic that I am not able to figure out.

import math

def taylor(precision):
    iter = 1
    sum = 0
    fx = 100
    sign = 1

    while (abs(fx) > precision):

        if not iter % 2 == 0:
            print(sign)
            sum += ((1 / (iter)) * sign)

        my_pi = 4 * (sum)
        fx = math.pi - my_pi
        iter += 1
        sign *= -1

    return my_pi

This results in an infinite loop.

I am supposed to use this series and find my_pi to a particular precision:

π/4 = (1/1) - (1/3) + (1/5) - (1/7) + (1/9) - ...

Pretty new to programming, any help would be amazing!


Solution

  • This part here

        if not iter % 2 == 0:

    means you only add a term when the iteration count is odd, i.e., 1, 3, 5, .... However, you flip the sign on every iteration, including the even ones, so between two consecutive odd iterations the sign flips twice and is back to +1 by the time the next term is added.

    As a result, you compute 1/1 + 1/3 + 1/5 + ..., which diverges, so abs(fx) grows instead of shrinking and the loop never terminates.
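    To see it, here is a minimal standalone trace of the sign bookkeeping (same variable names as the question):

        sign = 1
        for iter in range(1, 8):
            if not iter % 2 == 0:
                print(iter, sign)  # prints sign == 1 for every odd iter
            sign *= -1  # flips on even iterations too, cancelling itself out

    Every odd iteration sees sign == 1, so no term is ever subtracted.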

    Instead, try

        if not iter % 2 == 0:
            print(sign)
            sum += ((1 / (iter)) * sign)
            sign *= -1  # move the sign assignment here

    (and delete the sign *= -1 at the bottom of the while loop, so the sign only flips when a term is actually added)
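
    For reference, a sketch of the full function with that one change applied (debug print removed; the exact return value depends on the precision you pass):

        import math

        def taylor(precision):
            iter = 1
            sum = 0
            fx = 100
            sign = 1

            while abs(fx) > precision:
                if not iter % 2 == 0:
                    sum += (1 / iter) * sign
                    sign *= -1  # flip only after a term is added

                my_pi = 4 * sum
                fx = math.pi - my_pi
                iter += 1

            return my_pi

        print(taylor(0.001))  # within 0.001 of math.pi

    As a style note, iter and sum shadow Python built-ins; they are kept here only to match the question.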