Tags: python, visual-studio-code, pytest, pylint

Using the VS Code Python extension - How to install Python modules to a permanent location outside of a virtual environment?


I'm working with Python 3.7 and using VS Code as my IDE. I have the Microsoft Python extension installed, and have enabled pylint for linting and pytest for unit tests, installing both into the system Python37\Scripts folder, with no virtual environment present or active.

At this point, everything seems to be working. My Python37\Scripts path is on the Windows system PATH, and the pytest and pylint executables are present there. VS Code can find and run my tests, and linting works fine.

Now I create a virtual environment named .venv for the package I'm working on, and change the Python interpreter in VS Code to point to it.

After doing that, I install all my package dependencies and everything seems to be working, but VS Code pops up windows that tell me pylint and pytest are not installed.

If I click Yes to install, it appears to install them with the current default Python executable (in this case, the Python 3.7 from my .venv virtual environment), but uses the --user option, which puts them in the per-user Python\Python37\Scripts folder (under \users\user\appdata\roaming), even though they already exist in my system Python37\Scripts folder, which is on the PATH (and was installed with no virtual environment activated).

But even after having the extension install them the way it wants, VS Code still can't discover my tests automatically, and when I try to configure tests, it attempts to install to the user directory yet again.

--

So, all this to ask, I guess - is this a defect in VS Code?

Is there any way to install standard Python tools I want to use in a permanent location that VS Code will always find, or do I have to reinstall every tool (for example, pylint and pytest) into every virtual environment I set up?

If I manually install these packages to the virtual environment, it looks like everything works. Is the only option to install them (and all other tools I plan to use globally) into every virtual environment?
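For reference, the manual install that makes everything work is roughly the following (a sketch assuming the virtual environment was created with python -m venv .venv on Windows):

    .venv\Scripts\activate
    python -m pip install pylint pytest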


Solution

  • A Solution (but reconsider doing it this way - see below)

    I did find a solution for pylint (though it is a bit kludgy).

    Within the global VS Code settings, search for pylintPath, find the Pylint Path item, and fill it in with the path to your user or system install of pylint. On Windows, the path must use double backslashes for each separator, as in the settings.json sketch below:

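    A minimal settings.json sketch (the setting names below are the ones the Microsoft Python extension uses for this purpose; the example path assumes a system-wide Python 3.7 install and should be replaced with your own):

        {
            // Point the extension at a pylint installed outside any virtual environment.
            "python.linting.pylintPath": "C:\\Python37\\Scripts\\pylint.exe",

            // The analogous setting for pytest (see the note below).
            "python.testing.pytestPath": "C:\\Python37\\Scripts\\pytest.exe"
        }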

    Note that you can also enter an absolute path for pylint just for a given Workspace or Folder by selecting the corresponding tab in Settings.

    There is a similar setting available for pytest (also shown in the sketch above) that appears to work the same way. I get some glitches with pytest discovery and running (it seems to run each test twice), but it does seem to work.


    Comments from Brett - rethink this strategy

    Based on the comments from Brett Cannon, I will switch to just installing my dev packages in every virtual environment. This appears to be a best practice among long-term developers.

    This makes sense when I consider that the tools I prefer may change from package to package - for example, public projects that use different unit test frameworks or different linters - because all developers on a project need to use the same tools to avoid tool collisions (think of simple differences like how spacing or indentation is handled, where different linters complain in different ways).

    Solution: install dev tools in every virtual environment

    One solution is to create a requirements-dev.txt file listing all the dev tool packages you use, which can be installed using python -m pip install -r requirements-dev.txt.

    This gives you a separate, distributable list of the development tools someone needs in order to work on the project, kept apart from the standard user install requirements.

    He also suggests that you can daisy-chain requirements files. For example, the first line of requirements-dev.txt can be -r requirements.txt, which installs everything from the normal requirements file and then all the remaining development requirements listed after that first line. This installs both the normal and dev dependencies with a single command.
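    As a sketch, a requirements-dev.txt along these lines would do it (the particular tools and the unpinned versions are only illustrative):

        -r requirements.txt
        pylint
        pytest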

    Note / Update (2022)

    Since writing this, I have started developing Python projects with Poetry, a dependency/project management tool for Python.

    Poetry has a separate section in the pyproject.toml file for specifying development dependencies, distinct from normal run-time or build dependencies, and installs everything into a virtual environment automatically (with poetry install). There is a learning curve, but it is not too bad, and Poetry makes managing Python projects very easy once you get used to it.
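    For example, a minimal pyproject.toml fragment along these lines (Poetry 1.2+ uses a dev dependency group as shown; older versions use a [tool.poetry.dev-dependencies] table instead, and the version constraints here are only illustrative):

        [tool.poetry.dependencies]
        python = "^3.7"

        [tool.poetry.group.dev.dependencies]
        pylint = "*"
        pytest = "*"

    Running poetry install then sets up the virtual environment with both the run-time and dev dependencies.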