batect: Build and Test Environments as Code — With a Python Sample Project.
How long does it take you to onboard a new colleague? In my experience, it takes roughly two weeks to get someone outfitted with all dev, build, and test environments. The concept of Build and Test Environments as Code tries to take that trouble away.
And the tool batect makes that possible within Docker.
In the words of the “go script concept”: “You know you’re on a mature dev team when your instructions as a new team member are: check out the repo, run ./batect --list-tasks, ./batect setup, and you’re done.”
I’ll show you how to use batect in a Python context. We will:
- set up batect.
- use batect to put Python dependency management into code, without writing down a convention that places virtual environments in some magic location.
- use batect to cache dependencies in Docker build steps, and to have them available locally.
- use batect to shell into Docker containers right after the setup phase.
- use batect to run an integration test against a Postgres instance on top of this setup.
And all of that will be in code, runnable on any laptop with nothing but Docker installed; no Python, Pipenv, … needed. Just clone the repository, change into one of the example directories, run ./batect dep, and you’re good to go.
I love simple examples, so I’ll walk you through three of them. They are located in this batect python sample repository.
First Steps with Batect
Do you know that moment when you modify a Dockerfile, run docker build, wait minutes only for it to break with a random error message, then modify the Dockerfile again? Until you’re fed up with it, comment out the failing steps, build the image, and open a shell in it?
Maybe your workflow is much leaner than that, but that’s close to mine. So I need to open a shell in a Docker container more often than not.
So let’s try that with batect. Trust me, it’s dead simple! If you want to see the finished example, go to the example0 folder:
- batect is made to be placed in every repository. It installs itself, in the sense that you only commit two files: “batect” (a shell script) and “batect.cmd” (its Windows counterpart). Once anyone runs ./batect, it downloads the appropriate version by itself. That’s handy, considering we want no operating-system dependency.
- The next step is to create a “batect.yml” that lists the Docker container (the build environment we want to work in) and the task (opening a shell in it). Here’s the code for that.
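A minimal sketch of what that batect.yml contains; the python:3.7 image tag is my assumption, while the task description and command match the terminal output shown in this post:

```yaml
containers:
  build-env:
    image: python:3.7  # assumed base image

tasks:
  shell:
    description: Install deps; Start a shell in the development environment.
    group: Utility tasks
    run:
      container: build-env
      command: sh -c 'pip install --user pipenv; /bin/sh'
```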
Nothing too specific about the YAML format; the only thing that might stand out is the “group” key, which simply groups the tasks in the task listing, as shown above.
You can now check the available tasks and use the only one we have, ./batect shell:
```shell
$ ./batect --list-tasks
Utility tasks:
- shell: Install deps; Start a shell in the development environment.

$ ./batect shell
Version 0.35.1 of batect is now available (you have 0.35).
To upgrade to the latest version, run './batect --upgrade'.
For more information, visit https://github.com/charleskorn/batect/releases/tag/0.35.1.
Running shell…
build-env: running sh -c 'pip install --user pipenv; /bin/sh'
Collecting pipenv
Downloading…
Successfully installed certifi-2019.6.16 pipenv-2018.11.26 virtualenv-16.7.4 virtualenv-clone-0.5.3
$ # shell in container now open
```
I’ve included the pipenv installation because that’s literally the step I wanted to shell into when things broke. Afterwards, I’d continue installing dependencies via pipenv to see why some fail, fix the problem in the container, and finally put the fix back into the Dockerfile.
Installing and Caching Dependencies
Let us continue with our Python build. To start development you need the environment, which includes:
- a Python version
- the dependencies like pandas.
Pipenv is a fabulous tool for that, so we’ll use it. Change into the directory example1. Since the build environment lives in a Docker container, we’ll have to mount our code into it:
- local ./ -> container /src; for now, that just contains the Pipfile & Pipfile.lock.
- local ./pip-cache -> container /src/.pip-cache; that’s where the dependencies are kept.
- local ./pipenv-cache -> container /src/.pipenv-cache; that’s for pipenv itself, which I don’t want to reinstall again and again. I’ll just install it once at setup.
Now at first, I thought I could get the virtual environment out of the container, but for some obvious reasons, that is not going to work. First, virtual environments simply don’t work that way; second, the Python binaries are of course operating-system dependent; and third, even when installing only pure-Python packages, I ran into trouble. So if you want to use the build & test environment for development as well, you’ll either have to connect your IDE to the Docker interpreter, which for instance Atom with a remote kernel or PyCharm Pro can do, or simply use the Pipfile.lock to create your local environment.
Local caching is also great because it speeds up Docker development. Where you’d usually have to rebuild every layer, you now at least have a cache for the modules in between.
The code for the mounts and our two tasks (first installing and caching the dependencies, then opening a shell) now looks like this:
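A sketch of the resulting batect.yml; the image tag and the cache-related environment variables (pointing pip’s user installs and pipenv’s virtualenvs at the mounted caches) are my assumptions, while the mounts, commands, and task descriptions follow the description and output in this post:

```yaml
containers:
  build-env:
    image: python:3.7  # assumed base image
    working_directory: /src
    environment:
      # Assumption: redirect pip's --user installs and pipenv's
      # virtualenvs into the mounted caches so they survive restarts.
      PYTHONUSERBASE: /src/.pip-cache
      WORKON_HOME: /src/.pipenv-cache
    volumes:
      - local: .
        container: /src
        options: cached
      - local: ./pip-cache
        container: /src/.pip-cache
        options: cached
      - local: ./pipenv-cache
        container: /src/.pipenv-cache
        options: cached

tasks:
  dep:
    description: Install the dependencies.
    group: Utility tasks
    run:
      container: build-env
      command: sh -c 'pip install --user pipenv; python -m pipenv install'
  shell:
    description: Install deps; Start a shell in the development environment.
    group: Utility tasks
    prerequisites:
      - dep
    run:
      container: build-env
      command: /bin/sh
```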
The “options: cached” part only matters for macOS users, but to keep things operating-system independent, simply always use it. The same goes for Windows-style paths, which would work on Windows but break on any UNIX-based system, so just stick to the UNIX style. Finally, you can see that the shell task now has a prerequisite, which means the dep task has to be executed first. If you now run ./batect shell, it looks like this:
```shell
$ ./batect shell
Version 0.35.1 of batect is now available (you have 0.35).
To upgrade to the latest version, run './batect --upgrade'.
For more information, visit https://github.com/charleskorn/batect/releases/tag/0.35.1.
Running dep…
build-env: running sh -c 'pip install --user pipenv; python -m pipenv install'
Requirement already satisfied: pipenv in ./.pip-cache/lib/python3.7/site-packages (2018.11.26)
Requirement already satisfied: …
Requirement already satisfied: virtualenv in ./.pip-cache/lib/python3.7/site-packages (from pipenv) (16.7.2)
…
dep finished with exit code 0 in 13.1s.
Running shell…
build-env: running /bin/sh
$ # shell now open
```
It depends on your preferences whether you want to include this prerequisite. If you don’t, make sure that everyone understands to run ./batect dep or ./batect setup first.
Adding Integration Tests
If you want to run integration tests against external dependencies, you’ll need to configure them as well, just like your dev & build environments. There’s a lot to gain from wrapping them in Docker containers. We’ll use a simple example, a Postgres instance, but imagine wrapping something harder to set up, like a properly configured localstack container with an exposed dashboard.
Here is the batect.yml we will work with.
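A sketch of that file; the postgres image tag and credentials are my assumptions, while the task wiring matches the output later in this post:

```yaml
containers:
  build-env:
    image: python:3.7  # unchanged from the previous example (abbreviated here)
  db:
    image: postgres:11  # assumed tag
    environment:
      POSTGRES_USER: postgres      # assumed credentials
      POSTGRES_PASSWORD: postgres

tasks:
  integration_test_fail:
    description: Run the failing integration test.
    prerequisites:
      - dep
    dependencies:
      - db
    run:
      container: build-env
      command: python -m pipenv run pytest test_sql.py
```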
In this case, we define two containers: our build-env, which did not change, and a new container named “db” that uses a postgres image. We use the block
```yaml
dependencies:
  - db
```
to indicate that we want the database up and running as well. Let’s look at the test I’ve written; the code is this.
That’s pretty simple connect-to-the-database code. I included a failing test simply to make sure batect actually fails when the pytest run fails.
You can run ./batect integration_test_fail to see that the two containers start and the test runs, then fails because of network issues.
```shell
$ ./batect integration_test_fail
…
Running dep…
build-env: running sh -c 'pip install --user pipenv; python -m pipenv install'
…
dep finished with exit code 0 in 24.0s.
Running integration_test_fail…
db: running
build-env: running python -m pipenv run pytest test_sql.py
…
test_sql.py F [100%]
…
integration_test_fail finished with exit code 1 in 9.1s.
```
Luckily, this is easily fixed. The test fails because our test container can’t reach the db, which is of course not located at 127.0.0.1 from inside the build-env container. The simple fix is the usual docker-compose-friendly approach: use the container name “db” as the host reference and pass it as an environment variable into the container.
The configuration block for this is:
```yaml
integration_test_pass:
  description: Run the passing integration test (and take a look what changed)
  prerequisites:
    - dep
  dependencies:
    - db
  run:
    container: build-env
    environment:
      DB_HOST: db
    command: python -m pipenv run pytest test_sql.py
```
That’s exactly the same thing I’d do in test & prod environments. So now run ./batect integration_test_pass and it works!
I’ve only provided some small examples of what is possible, but a bunch more come to mind right away:
- We’ve been using pre-commit to make sure linting is done the same way on all developer machines. pre-commit has a Docker image, so we could skip the setup and installation of pre-commit that would otherwise be necessary.
- I’ve been using a continuous test watcher, which could be used in batect as well. Then everyone gets to enjoy that capability without configuring pywatch or PyCharm’s auto-watcher.
- We’ve been using Google Container Tools to test our images; that could be included here as well. The benefit would be cross-platform compatibility, since to the best of my knowledge the Container Structure Tests only work on unix & darwin.
- There’s a great post about build and test environments as code and why you should use that by Charles Korn.
- There’s also a bunch of great sample projects located in the batect documentation.
- Here’s the link to batect itself, and the documentation.
- Finally, check out the ThoughtWorks technology radar which is where I stumbled upon this tool.