Missing Dependencies When Installing Python Packages With Pip

Pip is the best way to install Python packages. It handles most of the pain of tracking down dependencies and installing everything, and it usually does this quite well. But some packages require system libraries to be installed and available at build time. When these libraries are missing, the installation fails.

The pip error messages can be overwhelming (there are a lot of them), but the information you need to fix the problem is there if you know where to look for it. As an example, let's look at the errors I received when trying to install Scrapy into a new virtualenv. I'm on a recent-ish version of Debian Linux, and I'm using Python 2.7, since at the time of this writing Scrapy's Python 3 support is still incomplete (about 95% of its tests pass).

Working in my new virtualenv, I started by trying to install Scrapy.

pip install scrapy

Things start out well with the usual download, unpack, and install.

Downloading/unpacking scrapy
Downloading Scrapy-1.0.5-py2-none-any.whl (291kB): 291kB downloaded
Downloading/unpacking cssselect>=0.9 (from scrapy)
Downloading cssselect-0.9.1.tar.gz
Running setup.py (path:/tmp/pip-build-pETEoF/cssselect/setup.py) egg_info for package cssselect

And then we hit a snag: building one of the dependencies fails. This cascades and we end up with a slew of errors, which is where it gets a bit overwhelming. But we can zero in on the first error to see exactly what we're missing.

Building wheels for collected packages: cryptography, cffi
Running setup.py bdist_wheel for cryptography ... error
Complete output from command /home/joshua/.virtualenvs/scrapycars/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-85Mdmv/cryptography/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" bdist_wheel -d /tmp/tmp6bFspgpip-wheel- --python-tag cp27:
Package libffi was not found in the pkg-config search path.
Perhaps you should add the directory containing `libffi.pc'
to the PKG_CONFIG_PATH environment variable
No package 'libffi' found

This error is repeated multiple times (I don't know why, but I see it five times in a row) and is then followed by a lot of other output.
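Rather than reading all of that output, you can capture it and grep for the pkg-config complaint. A minimal sketch — the two printf lines stand in for a real captured log, and the file name pip.log is illustrative:

```shell
# Stand-in for a captured build log; in practice you would capture it with
# something like: pip install scrapy 2>&1 | tee pip.log
printf "%s\n" \
  "Package libffi was not found in the pkg-config search path." \
  "No package 'libffi' found" > pip.log

# Zero in on the first missing-library complaint
grep -m1 "No package" pip.log
# prints: No package 'libffi' found
```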

The key here is that we are missing the libffi library, which 'allows code written in one language to call code written in another language'. A search for 'debian libffi' showed me that installing the libffi-dev package would supply what the build needs.

sudo apt-get install libffi-dev
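Before re-running pip, you can confirm that pkg-config now sees the library. A quick check — this assumes pkg-config itself is installed (on Debian it comes in the pkg-config package):

```shell
# Ask pkg-config whether libffi's .pc file is now on its search path
if pkg-config --exists libffi; then
  echo "libffi found"
else
  echo "libffi still missing"
fi
```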

With this package installed, Scrapy then installed without errors.

The Fix

  1. Don't panic.

  2. Start at the top-most error (later errors are often cascades from the first one).

  3. Search for and install your missing package.

  4. Pip install again.

  5. Repeat as necessary.
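The steps above can be sketched as a small helper. This is my own sketch, not anything pip provides: the function name missing_lib is made up, it assumes the failure follows pkg-config's "No package 'X' found" form, and the X-dev naming is a Debian convention, not a guarantee:

```shell
# Extract the missing library name from a captured build log,
# assuming pkg-config's "No package 'X' found" message format.
missing_lib() {
  grep -m1 -o "No package '[^']*' found" "$1" |
    sed "s/No package '\(.*\)' found/\1/"
}

# Example run against a stand-in log file
echo "No package 'libffi' found" > build.log
lib=$(missing_lib build.log)
echo "Try: sudo apt-get install ${lib}-dev"
# prints: Try: sudo apt-get install libffi-dev
```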

But It Works Outside My VirtualEnv

You may find yourself with dependency errors in a virtualenv even though you got no errors when you pip installed the same package with your system Python. How can that be? Most likely you already had the package installed: pip doesn't rebuild a package that is already present, so the build step that needs the missing system library never runs.

The package may have come with your distribution, in which case its build step never ran on your machine. Or you may have had the missing system library installed at one time and removed it in the interim.
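A quick way to check whether a given import already works in a particular interpreter — the module name cffi is just an example, and you would swap python3 for the interpreter you want to test (your system Python or the one in your virtualenv):

```shell
# The exit status of the import tells you whether the package is
# visible to this interpreter
python3 -c "import cffi" 2>/dev/null \
  && echo "cffi: installed" \
  || echo "cffi: not installed"
```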

