Lukáš Vinclav · 08.05.2021 · 15 minutes
When developing a Django application, there are several ways to manage project dependencies. This article explains which approaches can be used (pip, Pipenv, Poetry) to take care of them. Of course, it does not aim to cover every option, nor to provide detailed instructions for each one.
The first approach, which is the most common and simplest, is to install all packages into the current virtual environment. This virtual environment needs to be created manually: run python -m venv venv and then source venv/bin/activate to activate, or switch into, the new environment. There are other ways to create Python virtual environments, but describing all of them is out of the scope of this article. By running pip freeze > requirements.txt, all dependencies installed in the virtual environment are saved into the main requirements file. The next time the project is set up, this requirements.txt can be used to install those dependencies with pip install -r requirements.txt.
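The steps above can be sketched as a short shell session. This is a minimal example; the package django and the directory name venv are only illustrative:

```shell
# Create a virtual environment in the project directory and activate it
python3 -m venv venv
source venv/bin/activate    # on Windows: venv\Scripts\activate

# Install packages into the active environment
pip install django

# Snapshot every installed package with its exact version
pip freeze > requirements.txt

# Later, on a fresh checkout, recreate the same environment
pip install -r requirements.txt
```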
The main issue is that absolutely all packages are saved with exact versions, so updating dependencies must be done by hand. When the requirements file is first created, this is not a problem at all, but as time goes on it becomes necessary to keep up with recent package releases for security patches or new features, and that is a tedious task to maintain manually.
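To see how this manual workflow looks in practice, pip itself can report which pinned packages have fallen behind; bumping each version in requirements.txt then remains a hand edit. A sketch, assuming pip is available on the path:

```shell
# List installed packages that have newer releases on PyPI
pip list --outdated

# After editing the version pins in requirements.txt by hand,
# reinstall to pick up the new versions
pip install --upgrade -r requirements.txt
```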
Pipenv is a Python dependency management tool. It comes with features that solve the issues with plain requirements.txt described above. Pipenv creates the virtual environment automatically, so there is no need to take care of it anymore. Dependencies are installed with the command pipenv install some_package. Once the command is run, the dependency is saved into the Pipfile configuration file, which holds all package dependencies for the project. When a package version changes, all dependency versions are recalculated. At the same time, a Pipfile.lock file is created with exact package hashes, adding a layer of security to the installation process. So Pipenv solves plenty of problems, from handling the virtual environment to dependency management, and adds some security protections on top. Compared to a plain requirements.txt, Pipenv is the better solution.
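For illustration, a minimal Pipfile produced by such pipenv install calls might look like the following; the package names and version constraints are only examples:

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
django = "~=3.2"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.9"
```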
Still, there are some doubts when it comes to Pipenv. In the last couple of years, development of the tool stagnated, and for a long period there were no updates. In 2020 some updates were released, but it is still a sign that regular development could stop or the project could be abandoned entirely. As a developer, the aim should be to choose tools that will be maintained for years. There is nothing worse than opening a project in a few years and finding out that its tools do not support recent features or do not work on a new operating system.
Python has officially introduced a project configuration file called pyproject.toml as the primary source of truth for dependency management and project configuration. Unfortunately, this file is not supported by Pipenv (as mentioned, it uses its own Pipfile). If the project needs configuration for tools like Black or pytest, pyproject.toml can be used, but with Pipenv a second config file, the Pipfile, is still required, so in the end two files have to be maintained where everything should be in one.
Poetry is another dependency management tool, and it combines all of these features in one place. A single tool handles virtual environment configuration, dependency management, and best practices for plugin configuration. In essence, it merges the best features of the methods described above.
One of the neatest features of Poetry is its mandatory use of the pyproject.toml config file. Almost everything related to dependencies and project properties can be specified in one file, without splitting configuration across several files.
After installing Poetry with pip install poetry, a new configuration file can be created by running poetry init in the project's directory. The command asks a few basic questions about the project.
$ poetry init

This command will guide you through creating your pyproject.toml config.

Package name [project]: project_name
Version [0.1.0]: 1.0.0
Description []: sample project
Author [None, n to skip]: John Smith <firstname.lastname@example.org>
License []: MIT
Compatible Python versions [^3.9]:

Would you like to define your main dependencies interactively? (yes/no) [yes] no
Would you like to define your development dependencies interactively? (yes/no) [yes] no
Generated file
Below is an example of the pyproject.toml configuration file created by poetry init. Running the command is not mandatory, however; a custom bootstrap template with more advanced settings can be created and simply copied between projects.
[tool.poetry]
name = "project_name"
version = "1.0.0"
description = "sample project"
authors = ["John Smith <email@example.com>"]
license = "MIT"

[tool.poetry.dependencies]
python = "^3.9"

[tool.poetry.dev-dependencies]

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
Adding a new package to the dependencies is as easy as running poetry add package-name. The command updates pyproject.toml and resolves all necessary dependencies, which are saved into the poetry.lock file. The purpose of poetry.lock is similar to the Pipfile.lock described in the Pipenv section.
$ poetry add django
Using version ^3.2.3 for Django

Updating dependencies
Resolving dependencies... (0.6s)

Writing lock file

Package operations: 4 installs, 0 updates, 0 removals

  • Installing asgiref (3.3.4)
  • Installing pytz (2021.1)
  • Installing sqlparse (0.4.1)
  • Installing django (3.2.3)
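After the command above, the dependency section of pyproject.toml reflects the chosen caret constraint; a specific range can also be requested, for example with poetry add "django>=3.2,<4.0". The resulting fragment looks roughly like this:

```toml
[tool.poetry.dependencies]
python = "^3.9"
django = "^3.2.3"    # "^" allows compatible upgrades up to, but not including, 4.0
```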
With Poetry, it is also possible to upload custom Python packages to the PyPI repository. One configuration file handles both the project dependencies and the package properties published to PyPI.
Once the package is ready to be uploaded to PyPI, just run poetry build to create the distribution files.
$ poetry build
Building project_name (1.0.0)
 - Building sdist
 - Built project_name-1.0.0.tar.gz
 - Building wheel
 - Built project_name-1.0.0-py3-none-any.whl
After a successful build, only one step remains: running poetry publish uploads everything to the PyPI package repository.
$ poetry publish
Publishing project_name (1.0.0) to PyPI
 - Uploading project_name-1.0.0.tar.gz
 - Uploading project_name-1.0.0-py3-none-any.whl
With this approach, setup.py as the main package configuration file becomes obsolete and can be removed from the project, which means fewer configuration files to maintain.
Once all dependencies are installed, some of them may require their own configuration file, for example pytest, isort, or black. When all these files are created, the project directory can become cluttered with .cfg and .ini files. But with Poetry, which enforces the use of pyproject.toml, all configuration can be stored in one central file.
[tool.isort]
default_section = "THIRDPARTY"
known_first_party = "project"
known_django = "django"
sections = "FUTURE,STDLIB,DJANGO,THIRDPARTY,FIRSTPARTY,LOCALFOLDER"
line_length = 120
multi_line_output = 5
include_trailing_comma = true
ensure_newline_before_comments = true
use_parentheses = true

[tool.pytest.ini_options]
DJANGO_SETTINGS_MODULE = "project.settings"
python_files = ["tests.py", "test_*.py", "*_tests.py"]
Some tools, like flake8, cannot store their configuration in pyproject.toml, but there are workarounds: the flakehell package wraps flake8 and allows it to be configured via pyproject.toml.
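A flakehell section in pyproject.toml might look like the sketch below. This is an assumption based on flakehell's documented table names, and the rule selections shown are only examples:

```toml
[tool.flakehell]
max_line_length = 120

[tool.flakehell.plugins]
# "+*" enables every check of the given plugin, "-CODE" disables one rule
pycodestyle = ["+*", "-W503"]
pyflakes = ["+*"]
```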
One neat perk of using Poetry and the pyproject.toml configuration file is native support in the Serverless Framework. When a Django application is deployed to the AWS cloud as a serverless function, the main plugin for creating Python functions, serverless-python-requirements, automatically recognizes the pyproject.toml file and installs the defined dependencies into the final zip package uploaded to AWS. The package is cleaned of development dependencies, so they are not uploaded and do not increase the final package size.
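The relevant part of a serverless.yml for this setup could look as follows. The service and handler names are hypothetical, and the usePoetry flag is an assumption based on the plugin's configuration options (it is also its default behavior when a Poetry-style pyproject.toml is present):

```yaml
service: django-app            # hypothetical service name

provider:
  name: aws
  runtime: python3.9

functions:
  wsgi:
    handler: wsgi_handler.handler   # hypothetical handler module

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    usePoetry: true   # read dependencies from [tool.poetry.dependencies]
```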