Consider the following code:
class Foo:
    def __mul__(self,other):
        return other/0

x = Foo()
x.__mul__ = lambda other:other*0.5
print(x.__mul__(5))
print(x*5)
In Python 2 (with from __future__ import print_function), this outputs
2.5
2.5
In Python 3, this outputs
2.5
---------------------------------------------------------------------------
ZeroDivisionError Traceback (most recent call last)
<ipython-input-1-36322c94fe3a> in <module>()
5 x.__mul__ = lambda other:other*0.5
6 print(x.__mul__(5))
----> 7 print(x*5)
<ipython-input-1-36322c94fe3a> in __mul__(self, other)
1 class Foo:
2 def __mul__(self,other):
----> 3 return other/0
4 x = Foo()
5 x.__mul__ = lambda other:other*0.5
ZeroDivisionError: division by zero
I ran into this situation while implementing a type that supports a subset of algebraic operations. For one particular instance, I needed to replace the multiplication method to add laziness: some computation had to be deferred until that instance was multiplied by another operand. The monkey patch worked in Python 2, but I noticed it failed in Python 3.
Why does this happen? Is there any way to get more flexible operator overloading in Python 3?
That is not a monkeypatch.
This would have been a monkeypatch:
class Foo:
    def __mul__(self, other):
        return other / 0

Foo.__mul__ = lambda self, other: other * 0.5

x = Foo()
print(x * 9)  # 4.5
What x.__mul__ = lambda other:other*0.5 did was create a __mul__ attribute on the x instance. The expectation was that x*5 would then call x.__mul__(5), and it did, in Python 2. In Python 3, x*5 calls Foo.__mul__(x, 5) instead: the implicit lookup for special methods goes through the type, so the instance attribute is never consulted. Python 2 would have done the same as Python 3, but it did not here because Foo was created as an old-style class.
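To see the Python 3 lookup rule in isolation, here is a minimal sketch (the class name Doubler is purely illustrative, not from the original code):

class Doubler:
    def __mul__(self, other):
        return other * 2

d = Doubler()
d.__mul__ = lambda other: other * 0.5   # per-instance attribute

print(d.__mul__(5))           # 2.5 -- explicit attribute access sees the instance attribute
print(d * 5)                  # 10  -- the operator uses type(d).__mul__
print(type(d).__mul__(d, 5))  # 10  -- effectively what d * 5 does in Python 3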
This code would be equivalent for Python 2 and Python 3:
class Foo(object):
    def __mul__(self,other):
        return other/0

x = Foo()
x.__mul__ = lambda other:other*0.5
print(x.__mul__(5))
print(x*5)
That will raise ZeroDivisionError in both versions. Note the (object).
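As for getting more flexible operator overloading in Python 3: if per-instance behavior is really needed (as in the lazy-multiplication use case), one common workaround is to keep __mul__ on the class and have it delegate to a per-instance callable when one is present. A minimal sketch, with the attribute name _mul_impl chosen only for illustration:

class Foo(object):
    def __mul__(self, other):
        # Use a per-instance override if one has been installed.
        impl = getattr(self, '_mul_impl', None)
        if impl is not None:
            return impl(other)
        return other / 0  # default behavior from the original example

x = Foo()
x._mul_impl = lambda other: other * 0.5  # per-instance "patch"
print(x * 5)  # 2.5

Because the operator still dispatches to Foo.__mul__ on the type, this works the same way in Python 2 and Python 3.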