Tags: python, deployment, pip, backwards-compatibility

Is it a good practice to upgrade all Python packages in production to their latest versions?


I have been running a fairly complex Django application for about a year now. It has about 50 packages in requirements.txt

Whenever I need a new package, I install it with pip and then manually add it to the requirements.txt file with a fixed version:

SomeNewModule==1.2.3

That means that most of my packages are now outdated after a year. I have updated a couple of them manually when I specifically needed a new feature.
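
For reference, pip itself can report how far behind these pins are (a generic check; nothing here is specific to my project):

pip list --outdated   # lists installed packages that have newer releases available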

I am beginning to think that there might be security patches I am missing out on, but I am reluctant to update them all blindly for fear of backward incompatibilities.

Is there a standard best practice for this?


Solution

  • The common versioning pattern for Python modules (and much other software) is major.minor.patch: after the initial release, patch versions don't change the API, minor releases may extend the API in a backward-compatible way, and major releases usually aren't backward compatible.

    So if you have module==x.y.z, a relatively safe requirement specification would be:

    module>=x.y.z,<x.(y+1).0

    i.e. anything from x.y.z up to, but not including, the next minor release.
    

    Note that while this will usually be OK, it relies on convention rather than any guarantee; it is more likely to hold for well-maintained, "organized" modules.
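
    As a concrete sketch, assuming requirements.txt pins Django at 3.2.5 and requests at 2.26.0 (both packages and versions are purely illustrative), the range above looks like this; pip's compatible-release operator ~= (PEP 440) expresses the same kind of constraint more compactly:

    # allow patch releases only: >=3.2.5 but below the next minor release
    Django>=3.2.5,<3.3.0

    # the same idea via the compatible-release operator (PEP 440):
    # requests~=2.26.0 is equivalent to requests>=2.26.0,==2.26.*
    requests~=2.26.0

    With either form, a periodic pip install --upgrade -r requirements.txt will pull in security patch releases without crossing a minor-version boundary.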