Tags: python, python-import

Best practices for Python imports in a production project: handling relative/absolute imports across different directories and test cases


I have the following folder structure in Python:

my_project/
├── Dockerfile
├── Makefile
├── run.py
├── data/
│   ├── raw/
│   └── processed/
└── src/
    ├── __init__.py
    ├── config.py 
    ├── settings.env
    └── response/ 
        ├── __init__.py
        ├── llm.py
        ├── instances.py
        └── get_response.py

Context

I started with a simple project to get structured responses from an LLM using Python. As the project grew, I decided to make it more production-ready by adding proper structure and organization. However, I'm unsure if my current structure is optimal, particularly regarding import handling.

Current Issues

Initially, when all code was in one folder, imports were straightforward:

# When everything was in one folder
from llm import get_completion
from instances import MyClass

After restructuring and using run.py as the main entry point, I had to modify imports to work from the parent directory:

# In run.py
from src.response.llm import get_completion
from src.response.instances import MyClass

Specific Questions

Is this the correct way to structure a production-ready Python project? How should I handle imports when I want to:

• Run tests from a separate test directory?
• Execute files directly within their folders (e.g., for development/debugging)?
• Use the if __name__ == '__main__': block with test code in individual files?

Do I need to modify import statements every time I run files from different locations? Is adding the project root to the Python path when running a file directly the best option?
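
For context, the workaround I mean looks something like this (a sketch placed in src/response/llm.py; the parents[2] count assumes that file's depth in the tree above):

# Sketch of the path hack in src/response/llm.py: prepend the project
# root so absolute imports resolve when this file is run directly.
import sys
from pathlib import Path

# parents[2] == my_project/ when this file lives at src/response/llm.py
sys.path.insert(0, str(Path(__file__).resolve().parents[2]))

from src.response.instances import MyClass  # noqa: E402

if __name__ == '__main__':
    # ad-hoc debug code while developing this module
    print(MyClass)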

Technical Details

Python version: 3.11

Current behavior: Files only run correctly when executed from the parent directory

Desired behavior: Ability to run files, tests, and debug code from any location without constantly modifying imports (if possible)

Currently I cannot run my scripts from their own locations; I have to run them from the project root. How can I make them work from any location?


Solution

  • Here's how I would design it:

    my_project/
    ├── Dockerfile
    ├── Makefile
    ├── pyproject.toml  // standard packaging/metadata file (PEP 621); example below
    ├── tests/
    │   └── test_stuff.py
    ├── data/
    │   ├── raw/
    │   └── processed/
    └── my_project/  // package shares the project name for clean namespacing
        ├── __init__.py
        ├── __main__.py  // lets you run the package via python -m my_project (sketch below)
        ├── config.py 
        ├── settings.env
        └── response/ 
            ├── __init__.py
            ├── llm.py
            ├── instances.py
            └── get_response.py
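
    A minimal __main__.py might look like the following sketch; the main() entry point imported from get_response is hypothetical, since the question doesn't show that module's contents:

    # my_project/__main__.py -- a sketch; main() is a hypothetical entry
    # point, as get_response.py's contents aren't shown in the question.
    from my_project.response.get_response import main

    if __name__ == "__main__":
        main()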
    

    You can then install the project with pip install -e . (an editable install): the package remains importable as if it were installed, while your source edits take effect immediately, which lets the tests import it comfortably.
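
    A minimal pyproject.toml for this layout might look like this sketch, assuming setuptools as the build backend (any modern backend works; the name and version are placeholders):

    [build-system]
    requires = ["setuptools>=64"]  # 64+ supports editable installs (PEP 660)
    build-backend = "setuptools.build_meta"

    [project]
    name = "my_project"
    version = "0.1.0"
    requires-python = ">=3.11"

    [tool.setuptools.packages.find]
    include = ["my_project*"]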

    Tests then run with python -m unittest discover or simply pytest, depending on your framework of choice. They just import my_project; individual files need no __main__ block, though you can keep one if you wish (I rarely run specific files, and when I do, I pass -k to unittest to select tests by name).
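
    For example, a test might look like this sketch; it only checks that the installed package imports cleanly, since get_completion's behavior isn't shown:

    # tests/test_stuff.py -- runs via `python -m unittest discover` or `pytest`
    import unittest

    # Absolute import works from anywhere once `pip install -e .` has run.
    from my_project.response.llm import get_completion


    class TestImports(unittest.TestCase):
        def test_get_completion_is_callable(self):
            self.assertTrue(callable(get_completion))


    if __name__ == "__main__":
        unittest.main()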

    There is no single right answer, but this is a simple, viable setup that I've used in production serving millions of clients.