I keep running into problems when trying to write code to multiply two complex numbers. I first created a simple class called Complex:
    class Complex:
        def __init__(self, real, imag):
            self.real = real
            self.imag = imag

        def multiply(self, d):
            self.real = (self.real * d.real) - (self.imag * d.imag)
            self.imag = (self.imag * d.real) + (self.real * d.imag)

        def __str__(self):
            if self.imag > 0:
                return str(self.real) + " + " + str(self.imag) + "i"
            elif self.imag == 0:
                return str(self.real)
            else:
                return str(self.real) + " - " + str(-self.imag) + "i"
Where self.real is the real part and self.imag is the imaginary part. When I run
    z = Complex(3, 4)
    x = Complex(2, 3)
    z.multiply(x)
    print(z, x)
The result should be -6 + 17i, but instead it outputs -6 - 10i. What's happening? Is it something to do with the self.imag part in the multiply function?
I tried working through my code by hand, and I got the correct answer. Is there something obvious that I'm missing?
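For what it's worth, a quick sanity check with Python's built-in complex type (not my class) gives the product I expect:

    # Built-in complex literals use j for the imaginary unit
    print((3 + 4j) * (2 + 3j))  # (-6+17j)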
You modify self.real in the first line of your multiply method, then use that already-modified value in the second line, so the imaginary part is computed with the new real part instead of the original one. Do both assignments in a single line, i.e.:
self.real, self.imag = (self.real * d.real) - (self.imag * d.imag), (self.imag * d.real) + (self.real * d.imag)
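Equivalently, you can compute both results into temporary variables before assigning them. A minimal sketch of the corrected method, reusing the values from your example (the rest of the class stays as you wrote it):

    class Complex:
        # ... __init__ and __str__ exactly as in the question ...

        def multiply(self, d):
            # Compute both parts from the original values before overwriting either one
            new_real = (self.real * d.real) - (self.imag * d.imag)
            new_imag = (self.imag * d.real) + (self.real * d.imag)
            self.real, self.imag = new_real, new_imag

    z = Complex(3, 4)
    x = Complex(2, 3)
    z.multiply(x)
    print(z)  # -6 + 17i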