Suppose I compile some Python files (.py to .pyc / .pyo) containing code that uses modules like NumPy, SciPy, and Matplotlib. If I execute them on another configuration (i.e. the client's), do the module versions have to be the same? Or do they only have to be within a range of compatible versions?
.pyc and .pyo files are just cached bytecode. Python's import machinery is built entirely around strings, which leaves the code that executes an import decoupled from whatever library it imports. So those files are no more tied to the versions of the libraries they import than the source code itself. If the source code works with a wide range of versions of a library, so will the compiled bytecode.
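To make that concrete, here is a minimal sketch showing that what a .pyc file stores is essentially a marshalled code object, and that the code object records only the bare module *name*, never a version (the snippet is hypothetical; `compile` does not import anything, so NumPy need not even be installed to run it):

```python
import marshal

# Compile a snippet that imports NumPy; nothing is imported at compile time.
code = compile("import numpy as np\nprint(np.__version__)", "<demo>", "exec")

# A .pyc file essentially stores this code object in marshalled form.
blob = marshal.dumps(code)
restored = marshal.loads(blob)

# Only the bare *names* are recorded in the bytecode, never a version.
print(restored.co_names)
```

Round-tripping through `marshal` mirrors what the import machinery does when it writes and later reads a .pyc file.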
You can always take a look at the bytecode Python generates with the dis module. A straight-up import statement becomes:
>>> import dis
>>> dis.dis(compile('import numpy as np', '', 'single'))
1 0 LOAD_CONST 0 (0)
2 LOAD_CONST 1 (None)
4 IMPORT_NAME 0 (numpy)
6 STORE_NAME 1 (np)
8 LOAD_CONST 1 (None)
10 RETURN_VALUE
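The import itself only happens when that bytecode is run. A quick sketch, using the stdlib json module so nothing third-party is needed, shows that executing the compiled code object is what actually triggers the import:

```python
import sys

# Compiling records the name 'json' but does not import anything yet.
code = compile("import json as j", "<demo>", "exec")
sys.modules.pop("json", None)  # ensure json is not already cached
assert "json" not in sys.modules

# Executing the bytecode runs IMPORT_NAME, which loads the module now.
ns = {}
exec(code, ns)
print(ns["j"].__name__)  # json
```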
The IMPORT_NAME opcode takes the name from the co_names structure that is attached to the code object (and stored in the cache too):
>>> compile('import numpy as np', '', 'single').co_names
('numpy', 'np')
It doesn't matter here that the numpy module consists in large part of dynamically loaded libraries; if you replaced the name numpy with something else, that would be imported instead. Modules are loaded at runtime, not at compile time, after all.
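You can demonstrate that name-based lookup directly by planting a stand-in module in sys.modules and importing it (the name `fakelib` is made up for illustration; any unused name would do):

```python
import sys
import types

# Build a stand-in module at runtime and register it under a made-up name.
fake = types.ModuleType("fakelib")
fake.answer = 42
sys.modules["fakelib"] = fake

# The import statement just looks the name up at runtime, so it finds the stub.
import fakelib
print(fakelib.answer)  # 42
```

The import statement consults the sys.modules cache first, so whatever object is registered under that name is what the bytecode binds, regardless of what is installed on disk.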