How to Set up a Perfect Python Project

Starting a new Python project doesn't have to be a challenge, because the basic needs are always the same, even for different types of projects. This article presents how to create a perfect initial base that can be used for any Python project.

Definition of the Perfect Initial Base

The foundation of a perfect Python project should:

  1. Have a suitable file and directory structure to organize the entire project, separating application code, testing, documentation, and project configuration.
  2. Use virtual environments to develop the project in isolation, with no outside interference.
  3. Use linting tools for static code analysis, to identify defects, formatting problems, optimization opportunities, security issues, etc. at an early stage of development.
  4. Use automated tests with reports on test coverage.
  5. Use continuous integration to check code quality as it arrives on the server.
  6. Use properly configured version control that ignores files that should not be versioned.

Except for the file and directory structure, all the other items depend on tool choices, and there are many options. Virtual environment management, for example, can be done with venv, pipenv, poetry or conda. There are dozens of linting tools, such as flake8, pylint and mypy, that are equivalent or complementary. In the end, the combination you choose depends on technical and personal decisions.

Virtual Environment Management

The management of Python versions, virtual environments and dependencies will be done through the combination of pyenv + poetry (read the previous article).
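
As a quick recap, and assuming pyenv and poetry are already installed as described there, pinning the interpreter for a project usually comes down to something like the sketch below (3.10.5 is just an example version, and the last two commands are run inside the project directory created in the next section):

$ pyenv install 3.10.5        # install the desired Python version (example)
$ pyenv local 3.10.5          # pin it for this directory (.python-version)
$ poetry env use python3.10   # make poetry build its virtualenv with that version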

Initial Directory Structure

To create the initial structure of your project, use poetry new <project_name>:

$ poetry new project_x

The previous command creates the following directory structure:

project_x
├── project_x
│   └── __init__.py
├── pyproject.toml
├── README.rst
└── tests
    ├── __init__.py
    └── test_project_x.py

This is an excellent minimum file and directory structure. It separates the project-specific code (in the project_x subdirectory) from the test code (in the tests directory) and from the project's configuration and documentation files (pyproject.toml and README.rst). However, some adjustments are needed:

  1. README.rst is created empty, and you need to complete it. Creating this type of file is beyond the scope of this article, but you can find good tips and more information in [1] and [2].
  2. Edit pyproject.toml and change the settings created automatically for name, version, description, and authors.
  3. Check the Python version specified in the [tool.poetry.dependencies] section of pyproject.toml. poetry new uses the Python version of the current environment, but you can install and select other Python versions via pyenv (a possible result of these adjustments is sketched below).
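
After these adjustments, the relevant parts of pyproject.toml could look roughly like the sketch below (the name, version, description, authors and Python constraint are placeholders for illustration only):

[tool.poetry]
name = "project_x"
version = "0.1.0"
description = "A short description of the project"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.10"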

Linting and Testing Tools

The recommended minimum set of testing tools is:

  • pytest: testing tool for Python.
  • pytest-cov: pytest plugin to measure code coverage.

For linting, I recommend using:

  • blue: code formatter based on black.
  • flake8: Python static code analysis tool.
  • flake8-debugger: flake8 plugin to check for forgotten debug commands.
  • flake8-pytest-style: flake8 plugin to check common style issues or inconsistencies with pytest-based tests.
  • isort: tool to sort Python imports.
  • mypy: static type analysis tool.
  • pep8-naming: flake8 plugin that checks against PEP-8 naming conventions.
  • pyupgrade: tool to automatically upgrade syntax for newer versions of the Python language.

And some additional security-specific tools:

  • bandit: tool for finding common security holes in Python code.
  • pip-audit: tool for scanning Python environments for packages with known vulnerabilities.

Installation and Configuration

All libraries and tools related to testing and linting are necessary for the development of the project, but not for running it in production. They should be installed in a separate section of pyproject.toml, so they don't get mixed up with the essential runtime dependencies. To install them, use poetry add --dev:

$ poetry add --dev pytest=="*" pytest-cov=="*" \
                   blue=="*" flake8=="*" flake8-debugger=="*" \
                   flake8-pytest-style=="*" isort=="*" mypy=="*" \
                   pep8-naming=="*" pyupgrade=="*" \
                   bandit=="*" pip-audit=="*"
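
With the "*" constraints used above, poetry records these packages in a development-only section of pyproject.toml, separate from the runtime dependencies. The result looks roughly like this (shortened to a few entries; depending on your poetry version, the section may instead be called [tool.poetry.group.dev.dependencies]):

[tool.poetry.dev-dependencies]
pytest = "*"
pytest-cov = "*"
blue = "*"
flake8 = "*"
mypy = "*"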

Settings

We can keep most tool configurations in pyproject.toml, in sections named following the pattern [tool.<tool-name>]:

[tool.isort]
profile = "black"
line_length = 100

[tool.blue]
line-length = 100

[tool.pytest.ini_options]
filterwarnings = ["ignore::DeprecationWarning"]

[tool.mypy]
ignore_missing_imports = true
disallow_untyped_defs = true

Some observations:

  1. PEP 8 establishes a style guide for Python code but leaves room for some formatting variations. Setting profile = "black" on line 2 adjusts isort's formatting to be more compatible with the output of black/blue.
  2. Lines 3 and 6 change the maximum line length from the default value of 79 to 100.
  3. mypy has several configuration options. ignore_missing_imports suppresses error messages about imports that cannot be resolved (line 12). disallow_untyped_defs forbids defining functions without type annotations or with incomplete type annotations (line 13); see the example below.
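
To illustrate the last point, consider the small hypothetical module below: with disallow_untyped_defs enabled, mypy flags the first function for missing type annotations, while the fully annotated version is accepted.

# example.py (hypothetical), checked with the mypy settings above

def add(a, b):  # flagged: function is missing type annotations
    return a + b


def add_typed(a: int, b: int) -> int:  # accepted: fully annotated
    return a + b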

Unlike the other tools adopted, flake8 cannot be configured in pyproject.toml. We should use a file named .flake8 instead:

[flake8]
max_line_length = 100
exclude = .venv,.mypy_cache,.pytest_cache
ignore = PT013,PT018,W503

Automation

Testing and linting should be easy to run, without having to remember each command and its arguments. For this, I recommend using a Makefile with the necessary tasks:

test:
    pytest --cov-report term-missing --cov-report html --cov-branch \
           --cov project_x/

lint:
    @echo
    isort --diff -c .
    @echo
    blue --check --diff --color .
    @echo
    flake8 .
    @echo
    mypy .
    @echo
    bandit -r project_x/
    @echo
    pip-audit

format:
    isort .
    blue .
    pyupgrade --py310-plus **/*.py

And then, just use make <task>:

  • make test runs the tests and generates test coverage reports.
  • make lint runs several linting tools in sequence.
  • make format formats Python code according to the patterns used by isort, blue and pyupgrade.

We can use these same commands in version control hooks and in the continuous integration configuration.

Continuous Integration System Configuration

Most modern continuous integration systems keep their configuration in the source code. GitHub Actions, for example, keeps its configuration in YAML files inside the .github/workflows directory. For our project, we are going to use .github/workflows/continuous_integration.yml:

name: Continuous Integration
on: [push]
jobs:
  lint_and_test:
    runs-on: ubuntu-latest
    steps:

        - name: Set up python
          uses: actions/setup-python@v3
          with:
              python-version: '3.10'

        - name: Check out repository
          uses: actions/checkout@v2

        - name: Install Poetry
          uses: snok/install-poetry@v1
          with:
              virtualenvs-in-project: true

        - name: Load cached venv
          id: cached-poetry-dependencies
          uses: actions/cache@v2
          with:
              path: .venv
              key: venv-${{ hashFiles('**/poetry.lock') }}

        - name: Install dependencies
          if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
          run: poetry install --no-interaction
        - name: Lint
          run: poetry run make lint
        - name: Run tests
          run: poetry run make test

This configuration works as follows:

  • This workflow will be executed every time the repository receives a push (line 2).
  • The job will run on an Ubuntu operating system in the latest available version (line 5).
  • Python version 3.10 is used (line 11).
  • Next, poetry is installed (line 16) and configured to create the virtual environment inside the project, in the .venv directory (line 19).
  • A cache policy is created for the .venv directory (line 25). The key that identifies the cache is the concatenation of the word venv with the hash of poetry.lock (line 26).
  • Dependencies are installed only if the cache is not found (lines 28 to 30).
  • The lint task is run (line 33).
  • The test task is run (line 35).

pre-commit and pre-push Events

It is good practice to run the code quality checks locally, even if continuous integration repeats the same process on the server. It saves time because the result is immediate, and corrections can be made outside a continuous integration cycle.

This local check will be automated via version control hooks, which trigger actions on certain events. We will need two events:

  1. pre-commit will run make lint
  2. pre-push will run make test

In both cases, the version control operation is cancelled if there are any failures, because the hook exits with a non-zero status when make fails.

To make the developer's life easier, let's add an install_hooks task to the Makefile, which calls the scripts/install_hooks.sh script to create the hooks:

install_hooks:
    scripts/install_hooks.sh

And this is install_hooks.sh:

#!/usr/bin/env bash

GIT_PRE_COMMIT='#!/bin/bash
cd $(git rev-parse --show-toplevel)
poetry run make lint
'

GIT_PRE_PUSH='#!/bin/bash
cd $(git rev-parse --show-toplevel)
poetry run make test
'

HG_HOOKS='[hooks]
precommit.lint = (cd `hg root`; poetry run make lint)
pre-push.test = (cd `hg root`; poetry run make test)
'

if [ -d '.git' ]; then
    echo "$GIT_PRE_COMMIT" > .git/hooks/pre-commit
    echo "$GIT_PRE_PUSH" > .git/hooks/pre-push
    chmod +x .git/hooks/pre-*
elif ! grep -s -q 'precommit.lint' '.hg/hgrc'; then
    echo "$HG_HOOKS" >> .hg/hgrc
fi

Some explanations:

  1. On Git, hooks are executable files named according to the desired event, located in .git/hooks.
  2. On Mercurial, hooks are defined in the [hooks] section of the .hg/hgrc configuration file, where each hook can be a shell command or a Python function.
  3. Both the bash scripts (lines 3-6 and 8-11) and the commands used for Mercurial (lines 13-16) do the same thing: change the current directory to the root of the project, where the Makefile is located, and run poetry run make <task>. Remember that poetry run <command> runs the command within the context of the project's virtual environment.
  4. If a .git directory exists, the Git hooks are created in the .git/hooks directory (lines 18-21). Otherwise, the Mercurial hooks are appended to the .hg/hgrc file (lines 22-23).
  5. The code snippet presented serves both those who use Mercurial (my case) and those who use Git. You can remove parts of it if you only need one or the other.

Preparing Version Control

To prevent unwanted files from being mistakenly added to version control, you need to create a filter list in a specific file located at the root of the project, at the same level as the .hg or .git directory, depending on which tool you use. If you use Mercurial, this file must be called .hgignore and should contain:

syntax: glob

.venv
.env
*~
*.py[cod]
*.orig

# Unit test / coverage reports
.coverage
htmlcov/

# cache
__pycache__
.mypy_cache
.pytest_cache

If you use Git, the file must be called .gitignore and contain the same lines as above, except for the first line (syntax: glob), which should be removed.

With the filters defined, we can start version control. For Mercurial, the commands are:

$ hg init .
$ poetry run make install_hooks
$ hg commit -Am 'Initial project structure'

If you use Git, execute:

$ git init .
$ poetry run make install_hooks
$ git add -A .
$ git commit -m 'Initial project structure'

And everything is ready to upload to the official project repository on GitHub.
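
If you use Git and the remote repository already exists on GitHub, publishing the initial commit is just a matter of pointing to it and pushing (the URL and branch name below are placeholders):

$ git remote add origin git@github.com:<user>/project_x.git
$ git branch -M main
$ git push -u origin main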

Ready-to-Use Template on GitHub

It is important to know all the steps to create the perfect Python project. But instead of executing these same steps for each new project, you can just use the template that I made available on GitHub. Instructions for use are in its README.rst.

Final Considerations

The project base presented in this article works very well and can easily be adapted to other tools if you want to try a different combination. The most important thing is to keep the project structure and the linting and testing activities automated.

References

[1] Make a README
[2] READMEs on READMEs (and other README-related resources)
[3] How to set up a perfect Python project
