I am building a Python package that I wish to be interoperable between Python 2 and 3. In the source code directory I only use code that runs under both 2.7 and 3.x. At the package-definition level (such as in `setup.py`), however, I don't really care about interoperability and use more modern modules that are not necessarily backwards compatible with versions < 2.7, such as `setuptools` and `pathlib`.
As far as I know, `setup.py` gets executed once when `pip install`s a module. However, after locally installing my package and trying to use something from `setup.py` in the source code, I observed that the functionality-relevant code has no access to information that is only available in its parent directory, where the package definition is located, which makes me believe these are abstracted away.
If I use `wheel` to build the package into a Python 2.7 distribution, can I expect anyone who downloads it to run it under version 2.7 without any further hassle? If not, what should my strategy be?
I guess this would break if one tries to install your project under Python 2.7 from a source distribution (`sdist`), but it would be fine when installing from a wheel, because installing from an sdist does trigger the execution of the setup script (`setup.py`), while installing from a wheel does not (wheel distribution archives do not even contain the setup script). This is the case because your build back-end seems to be setuptools; with other build back-ends the behavior might differ slightly.
In any case, the general recommendation in the Python packaging ecosystem is to always distribute at least the sdist.
And so I would think that you really should make your `setup.py` compatible with all the Python interpreter versions you target (so, in your case, with Python 2.7).