With Travis, usually you'll use pip or setuptools to get your project's Python dependencies installed before running the test script. Here's the minimal .travis.yml example from the Travis docs for Python, one that will install dependencies using pip before it runs nosetests:
```yaml
language: python
python:
  - "2.6"
  - "2.7"
  - "3.2"
# command to install dependencies
install: "pip install -r requirements.txt --use-mirrors"
# command to run tests
script: nosetests
```
There are, however, some dependencies for which this is problematic. Two of these are numpy and scipy, which contain a lot of C code that, with the method just discussed, has to be compiled every time the Travis tests run.
Travis allows you to install system packages through apt-get, which is quite cool. And Ubuntu already ships binary python-numpy and python-scipy packages, so why not use them?
The problem is that simply installing them via apt-get does not work, for the same reason it doesn't work when you do this locally: the default virtualenv that Travis sets up for running the tests is isolated from the system packages, so it won't see the globally installed numpy and scipy.
The solution is to use virtualenv with the --system-site-packages option, which also gives the virtualenv access to packages in the global site-packages directory.
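To see the effect of the flag locally, here is a small sketch. It uses the stdlib venv module (the modern equivalent of the `virtualenv` tool the post describes; the flag name is the same), and the `demo_env` directory name is just an example:

```shell
# Create a virtualenv that can also see globally installed packages.
# --without-pip keeps the sketch self-contained (no ensurepip needed);
# on Travis you would use `virtualenv --system-site-packages` instead.
python3 -m venv --without-pip --system-site-packages demo_env

# venv records the setting in pyvenv.cfg:
grep "include-system-site-packages" demo_env/pyvenv.cfg
# → include-system-site-packages = true
```

With the flag omitted, the same line reads `false`, and imports fall through only to the virtualenv's own site-packages.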
How it works
Add these lines to your Travis configuration to use a virtualenv with --system-site-packages:
```yaml
virtualenv:
  system_site_packages: true
```
You can thus install Python packages via apt-get in the before_install section, and use them in your virtualenv:
```yaml
before_install:
  - sudo apt-get install -qq python-numpy python-scipy
```
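Putting the pieces together, a complete .travis.yml using this approach might look like the following sketch (a combination of the snippets above; imports of numpy and scipy in your test suite then resolve to the apt-installed packages):

```yaml
language: python
python:
  - "2.7"
virtualenv:
  system_site_packages: true
before_install:
  - sudo apt-get install -qq python-numpy python-scipy
install: "pip install -r requirements.txt --use-mirrors"
script: nosetests
```

Note that numpy and scipy should then be left out of requirements.txt, or pip will build them from source anyway.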
A real-world use of this approach can be found in nolearn.