Problem: Different applications may require different versions of the same library, so which version should be installed system-wide?
Problem: Some tools installed with the operating system are written in Python and require specific versions of some libraries; the versions provided by the operating system packages are there to satisfy those requirements. What about my application's requirements, which call for different versions of those libraries?
Question: When a library is required, should it be installed from the operating system's package manager, or from PyPI?
Answer: Use virtual environments and install the exact version that the app requires from PyPI.
Each app can use its own virtual environment, where only the desired versions of all required libraries are available. Setting up a virtual environment means using a project-specific path (instead of the system-wide shared path) to install libraries; when Python runs, it modifies sys.path to include this path when searching for modules to import.
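The effect on module lookup can be sketched directly: prepending a directory to sys.path makes Python search it before the system-wide paths. The directory name below is hypothetical, just to illustrate the mechanism.

```python
import sys

# Hypothetical project-specific site-packages directory; activating a
# virtual environment effectively puts a path like this ahead of the
# system-wide ones.
project_libs = "/home/user/.virtualenv/myapp/lib/python2.7/site-packages"

sys.path.insert(0, project_libs)

# Python now searches the project path first, so a library installed
# there shadows the system-wide copy of the same library.
print(sys.path[0])
```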
And with the right tool, it's even easier in practice than it sounds.
Install virtualenv, make a directory, and create a virtual environment in it. Then, using the shell helper files provided by virtualenv, the shell's path is modified so that when python is called, it's the interpreter configured to search the virtual environment's path for modules.
$ sudo pip install virtualenv
$ mkdir -p ~/.virtualenv
$ virtualenv ~/.virtualenv/myapp
$ ls -l ~/.virtualenv/myapp/bin/
There will be files named activate for different shells (Bash, Csh, Fish). Bash is the most common, so:
$ source ~/.virtualenv/myapp/bin/activate
For example, I use fish, so:
~> source ~/.virtualenv/myapp/bin/activate.fish
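Under the hood, the activate scripts mainly export VIRTUAL_ENV and prepend the environment's bin directory to the shell's PATH. A simplified sketch of what activation does:

```shell
# Simplified sketch of what `activate` does (the real script also
# saves the old values so `deactivate` can restore them).
export VIRTUAL_ENV="$HOME/.virtualenv/myapp"
export PATH="$VIRTUAL_ENV/bin:$PATH"
# `python` and `pip` now resolve to the copies inside the environment.
```

When you're done, running `deactivate` (a shell function defined by the activate script) restores the original PATH.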
Now check sys.path:
~> python -c 'import sys; print sys.path'
These examples used the default Python (the version that runs when you type python in the shell). However, it's possible to choose a specific version of Python for the virtual environment (that version must already be installed). For example, for the same app, I'd like to set up a Python 3.4 virtual environment:
~> mkdir ~/.virtualenv/myapp-py3.4
~> virtualenv --python=python3.4 ~/.virtualenv/myapp-py3.4
~> source ~/.virtualenv/myapp-py3.4/bin/activate.fish
~> python -c 'import sys; print(sys.path)'
By default, virtualenv sets up pip as well, so calling pip runs the one from the virtual environment, and it installs all packages under the virtual environment's path.
~> pip install -r requirements.txt # installs all packages under ~/.virtualenv/myapp-py3.4
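A requirements.txt simply lists packages, usually with pinned versions; `pip freeze > requirements.txt` generates one from the packages currently installed in the environment. The package names and versions below are only illustrative:

```
requests==2.7.0
SQLAlchemy==1.0.8
```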
Virtual environments are so useful that, starting with Python 3.3, the standard library provides a tool to work with them: venv.
Using venv is about the same as using virtualenv, except that instead of the standalone virtualenv command, the venv module is run as a script:
~> python3.4 -m venv ~/.venv/myapp-py3.4
~> source ~/.venv/myapp-py3.4/bin/activate.fish
~> python -c 'import sys; print(sys.path)'
Python uses the ensurepip module to provide pip in virtual environments created by venv. Unfortunately, the Python 3.4 distributed with Ubuntu Trusty Tahr does not include this module, so creating an environment without pip works fine, but pip is quite useful most of the time. Installing the python3.4-venv package resolves this issue.
Virtual environments are useful for deploying projects in production as well: multiple projects can be deployed on the same system, each using its own versions of libraries.
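For deployment it's worth knowing that activation is only a shell convenience: the environment's interpreter can be invoked directly by path, for example from an init script or cron job, and it still uses the environment's libraries. Here app.py is a hypothetical entry point:

```shell
~> ~/.virtualenv/myapp/bin/python app.py
```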