
How can I force subtraction to be signed in Python?


You can skip to the bottom line if you don't care about the background:

I have the following code in Python:

ratio = (point.threshold - self.points[0].value) / (self.points[1].value - self.points[0].value)

Which is giving me wrong values. For instance, for:

threshold:  25.0
self.points[0].value:  46
self.points[1].value:  21

I got:

ratio:  -0.000320556853048

Which is wrong.

Looking into it, I realized that `self.points[0].value` and `self.points[1].value` are of the type `numpy.uint16`, so I got:

21 - 46 = 65511

I never explicitly defined a type for point.threshold; I just assigned it a value, so I imagine it got a plain vanilla Python type.
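The wraparound above can be reproduced in isolation. This is a minimal sketch (the values 21 and 46 come from the question; the variable names are placeholders): subtracting two `numpy.uint16` values stays unsigned, so a negative result wraps modulo 2**16, while casting to plain `int` first gives the expected signed result.

```python
import numpy as np

a = np.uint16(21)
b = np.uint16(46)

# Unsigned subtraction wraps around modulo 2**16:
wrapped = a - b            # 65511, because 65536 - 25 = 65511

# Casting to Python int first makes the subtraction signed:
signed = int(a) - int(b)   # -25

print(int(wrapped), signed)
```

Note that NumPy may also emit an overflow `RuntimeWarning` for the wrapped scalar subtraction, which is a useful hint that this is happening.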

The Bottom Line

How can I force the subtraction of two uints to be signed?


Solution

  • Almost anything but uints will work here, so just cast the values to a signed type (e.g. `int` or `float`) before you do the subtraction.

    Since threshold = 25.0 (note the decimal point), it's a float, so the subtraction and division will all work as long as you're not using uints.
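Applied to the expression from the question, a sketch of the fix might look like this (here `v0` and `v1` stand in for `self.points[0].value` and `self.points[1].value`; the numbers are the ones from the question):

```python
import numpy as np

threshold = 25.0        # a float, because of the decimal point
v0 = np.uint16(46)      # self.points[0].value
v1 = np.uint16(21)      # self.points[1].value

# Cast the uint16 values to int before subtracting,
# so the differences can go negative:
ratio = (threshold - int(v0)) / (int(v1) - int(v0))

print(ratio)  # 0.84, i.e. -21 / -25
```

Without the casts, the denominator would be 65511 instead of -25, which is exactly what produced the tiny wrong ratio in the question.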