Managing Python and keeping pip clean
Managing Python (my go-to for scripting) has always been a disorganized process for me. I install it once, never update it, and forget where or how I installed it. And then there are the pip dependency conflicts… Yeah, not great.
Then about a year ago I abandoned Windows and picked up an M1 MacBook Pro. Since I had the opportunity to start with a clean slate, I wanted to make sure things stayed in order - and so far, it’s worked perfectly. Here’s how I did it.
Installing and Managing Python
Having tools like Homebrew available now that I’ve jumped to Mac has been awesome. As it turns out, though, it’s not great for installing Python itself - Justin Mayer wrote a great article that explains why. Instead, I opted for pyenv, a Python version manager (which I did install with Homebrew).
After following the post-installation instructions for macOS, I installed the latest version of Python (3.10.0 at the time) and set it as the global version, which means it will be used by default unless I specify otherwise.
pyenv install 3.10.0
pyenv global 3.10.0
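To sanity-check the result - the exact output depends on your setup, but these two commands show which interpreter is active:
pyenv versions    # the asterisk marks the currently active version
python --version  # should report the pyenv-managed 3.10.0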
That’s about as far as I’ve gone with pyenv. I haven’t had a need for different versions of Python, but it’s still nice to have a framework that manages that for me.
Virtual Environments
Conflicting dependency versions are a pain. That’s where virtual environments (venvs) come in. The venv module has shipped with Python since version 3.3, and it gives each project its own independent set of dependencies.
To create a new venv:
python3 -m venv /path/to/virtual/environment
To activate the venv:
source /path/to/virtual/environment/bin/activate
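When you’re done, the standard deactivate command (defined by the activate script) drops you back to the global interpreter, and which python is a quick way to see where you currently are:
which python   # points inside the venv while it's active
deactivate     # back to the global, pyenv-managed Python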
That’s a bit wordy, so I wrote a function that automates the process:
function venv(){
    workingdir=${PWD##*/}
    if [[ ! -d "./.venv_$workingdir" ]]
    then
        python -m venv "./.venv_$workingdir"
    fi
    source "./.venv_$workingdir/bin/activate"
}
I like to keep venvs at the root of the project in a directory with the same name, but with .venv_ prepended. For example: /code/my-project/.venv_my-project
This function checks the current directory for a venv using that naming convention, and creates it if needed. Then it runs the source command to activate the venv. Now if I want a new venv, or I want to load the existing one, it’s as simple as running venv in the project root.
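In practice it looks something like this (the project path is just an example):
cd ~/code/my-project    # hypothetical project directory
venv                    # creates ./.venv_my-project on first run, then activates it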
Forcing pip to require an active venv
As easy as that is, I still tend to forget about creating a venv until after I’ve run several pip install commands. Since I want to keep my global instance of Python clean - no dependencies at all - I need a way to protect myself from myself. It turns out that’s built into pip too.
By setting the environment variable PIP_REQUIRE_VIRTUALENV=true, you can force pip to require an active venv before it will let you install anything:
> pip install requests
ERROR: Could not find an activated virtualenv (required).
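Once a venv is active, the same command goes through as normal - for example, using the venv naming convention from earlier:
source ./.venv_my-project/bin/activate
pip install requests    # works now that a venv is active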
In the event I ever want to explicitly run a pip command globally (outside the context of a venv), I wrote a function to handle that too:
function gpip(){
    PIP_REQUIRE_VIRTUALENV="0" pip "$@"
}
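For example, if I ever want to upgrade the global copy of pip itself:
gpip install --upgrade pip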
The script
All in, I ended up with this shell script that I import into my .zprofile:
#################### Python/VENV/PIP #########################
# Require an activated venv by default
export PIP_REQUIRE_VIRTUALENV=true
# Manually override venv requirement (use when NOT in a venv and want to perform global action)
function gpip(){
    PIP_REQUIRE_VIRTUALENV="0" pip "$@"
}
# Create a new venv in the current directory (if it doesn't already exist) and activate it
function venv(){
    workingdir=${PWD##*/}
    if [[ ! -d "./.venv_$workingdir" ]]
    then
        python -m venv "./.venv_$workingdir"
    fi
    source "./.venv_$workingdir/bin/activate"
}
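If you keep the snippet in its own file rather than pasting it directly into .zprofile, sourcing it is all it takes - the path below is just a placeholder:
# In ~/.zprofile
source ~/.zsh/python.sh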