Note: I am not that experienced in Python, so my code may not be as good as it could/should be.
I am attempting to create a tool to facilitate calculating the algebraic factors of a certain form of number (see https://en.wikipedia.org/wiki/Aurifeuillean_factorization). This is mostly a test/learning experience; however, I have run into a problem when attempting to calculate the parameter "c", which is defined as 2^(2k+1)+1. The addition step does not work for me: I simply get 2^129 as the returned value, instead of 2^129+1 as I expect. Is this an issue with Python itself, or am I making some sort of mistake?
Code:
import math

def make_aurifeuille_factors(base, exponent):
    if base == 2 and exponent % 4 == 2:
        k = (exponent - 2) / 4
        c = int(1 + 2 ** (2*k + 1))
        d = int(2 ** (k + 1))
        L = c + d
        M = c - d
        return int(k), int(c), int(d), int(L), int(M)

def gcd(a, b):
    return int(math.gcd(a, b))

print(make_aurifeuille_factors(2, 258))
k = (exponent - 2) / 4 makes k a float, which means 2 ** (2*k + 1) is evaluated in floating-point arithmetic. A double-precision float only has 53 bits of precision, so 1 + 2.0**129 rounds back down to 2.0**129, and the later int() conversion just preserves that already-rounded value. Use integer division to stay in the int world from the start:
def make_aurifeuille_factors(base, exponent):
    if base == 2 and exponent % 4 == 2:
        k = (exponent - 2) // 4
        c = 1 + 2 ** (2*k + 1)
        d = 2 ** (k + 1)
        L = c + d
        M = c - d
        return k, c, d, L, M
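
To see concretely where the +1 disappears, here is a small check using the exponent 258 from the question (just a sketch of the two divisions side by side):

# With float division, 2 ** (2*k + 1) is a float and cannot hold 2**129 + 1 exactly.
k_float = (258 - 2) / 4                      # 64.0 (float)
c_float = int(1 + 2 ** (2 * k_float + 1))    # the +1 is rounded away before int() runs
print(c_float == 2 ** 129)                   # True

# With integer division, everything stays an exact Python int.
k_int = (258 - 2) // 4                       # 64 (int)
c_int = 1 + 2 ** (2 * k_int + 1)             # arbitrary-precision integer arithmetic
print(c_int == 2 ** 129 + 1)                 # True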