In your job you focus on development. You do not give much thought to how things are deployed in staging and production.
And yet you might have seen people running things locally on a VM using tools such as Vagrant and Docker. The arguments below use “Docker”, but each case applies to any VM/containerisation approach used during local development.
Why should you adopt something like Docker in your local dev? Do you need a VM to run things locally?
No to Docker for local development
I can see why you’d be reluctant to run things on a VM:
- Less software installed on your local machine.
- Able to navigate to the project root and run the script/process that starts your software. You can group this chain of commands into one shell alias.
- Easier to reason about. If that process doesn’t run, you see it immediately. E.g. when running a Django project using ./manage.py runserver in your local terminal, it’s easy to check it’s running as expected and to see server responses and any errors occurring.
- You focus on one thing at a time. Especially when learning a new tech, it’s better to run it and tinker with it in your local environment.
- Since it’s less work overall, you have more time on your hands.
This all sounds good if you’re working on your own. Things might, and usually do, get hairy when working with others.
Yes to Docker for local development
Docker can eliminate the “But it works on my machine” problem: it replicates the environment a process runs in, consistently across different “dev” boxes, because it’s independent of the host OS.
This makes it especially useful for teams. For example, here are three “real life” scenarios I experienced that made the Docker approach worthwhile:
- Another team is in charge of a “data tracking” microservice project. Your code needs to “HTTP POST” data to this service. With Docker you set that project up on your local machine and can test your code’s interaction with it without installing any software natively. Other than Docker, of course.
- You are on the team in charge of the “data tracking” microservice just described. Other teams need to test what they’re doing against it. Again, Docker simplifies distributing your project to those teams for testing purposes.
- You’re a backend developer. Front-end developers see your project only as a REST API; they’re not interested in setting up the backend components, such as the database, web server process or task queue. Better to have them run something like
docker-compose up. No need for them to install any other software.
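That last scenario could be served by a compose file along these lines. This is only a sketch: the service names, image versions and ports are assumptions, not taken from any particular project.

```yaml
# Hypothetical docker-compose.yml for a Django backend.
# Front-end colleagues only run `docker-compose up`.
version: "3"
services:
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: dev-only-password   # never reuse in production
  web:
    build: .                                 # expects a Dockerfile in the repo root
    command: ./manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    depends_on:
      - db
```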
This list is by no means exhaustive.
Still, Docker is also useful when working as a “one man” team.
In some cases a Docker image is one of the building blocks of your deployment chain. In that case it’s better to start “Docker-ising” early: the time taken will pay off when the time to deploy arrives.
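Starting early can be as small as adding a Dockerfile to the repo. A minimal sketch, assuming a standard Django layout with a requirements.txt; the base image version and paths are illustrative:

```dockerfile
# Hypothetical Dockerfile for a Django project.
FROM python:3.11-slim
WORKDIR /app
# Install dependencies first so this layer is cached between code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["./manage.py", "runserver", "0.0.0.0:8000"]
```

The same image can later be rebuilt with a production-grade command (e.g. a WSGI server) while keeping the environment identical.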
Reduce surprises when going to production
With Docker you “test” that what you’re building works on the target production environment. Say you’re writing a scraper that uses Selenium to spawn a browser session, save a screenshot and, finally, run some image analysis on it. You know this “runs on my machine” without problems. But your machine is usually a Mac/Windows dev box, whereas production runs Ubuntu/Linux. How do you lower the chances of surprises due to the different environments? Docker.
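One way to lower those chances is to run the scraper locally in a container built on the same Linux family as production. A sketch, assuming a Debian-based Python image; the package names (chromium, chromium-driver) and the scraper.py entry point are assumptions for illustration:

```dockerfile
# Hypothetical Dockerfile mirroring a Linux production box.
FROM python:3.11-slim
# Install a browser and driver from the distro's repositories,
# the same way the production host would.
RUN apt-get update && apt-get install -y --no-install-recommends \
        chromium chromium-driver \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt   # e.g. selenium, Pillow
COPY . .
CMD ["python", "scraper.py"]
```

If the screenshot-and-analysis step works inside this container, it is far more likely to work on the Linux box it deploys to.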
Local dev box “cleanliness”
Working on a bunch of microservices? Or are you a freelancer juggling projects? Both cases are commonplace. And both imply many projects.
In this case Docker can keep your local dev box “clean”. How? By creating environments inside containers. And then loading the projects in their respective container.
Your dev box ends up with far fewer packages and virtual environments, namely the ones otherwise needed to run everything “natively” on it.
Especially when juggling several projects, or when you expect to be “done” with a project within a few days, it’s cleaner to install all dependencies inside a Docker container. To clean up, you remove that container after switching it off.
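That clean-up is a couple of commands. The container and image names below are hypothetical; requires a running Docker daemon:

```shell
# Stop and remove a project's container once you're done with it.
docker stop myproject
docker rm myproject
# Optionally drop its image too, to reclaim disk space.
docker rmi myproject:latest
```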
Working with specialists on a contract basis
Your focus is backend. You have set up your Django project to use Docker on your local machine.
You have crossed paths with a FrontEnd engineer who will help you on your project. She brings skills that you do not have.
You want her to run the application to cover all execution paths. Rather than give her static URIs and get static markup (HTML+CSS) back, you want her to edit the templates directly.
One option would be to set up the project on her machine, be it Mac, Windows or Linux. The better option would be to:
- Create fixtures with the application’s data. Data that covers all logical paths currently supported by the application.
- Give her access to the repo. Have her run the Docker container, load the test data into it, edit the templates in her copy of the repo and commit to a branch. Done! Thank you Docker!
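On the command line, that workflow could look something like this. It’s a sketch: the web service name and the fixture path are assumptions, and the docker-compose commands need a running Docker daemon.

```shell
# You, once: export fixtures covering the application's logical paths.
./manage.py dumpdata --indent 2 > fixtures/demo_data.json

# Her, after cloning the repo: start the stack and load the test data.
docker-compose up -d
docker-compose exec web ./manage.py loaddata fixtures/demo_data.json
```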
This article used Docker, but the same pros & cons apply to similar tech such as Vagrant. To compare the two, I’ll quote the Vagrant folks in the comparison they made here:
Vagrant is a tool focused on providing a consistent development environment workflow across multiple operating systems. Docker is a container management tool that can consistently run software as long as a containerization system exists.
Another, more detailed, comparison of the two can be found here. It includes a walkthrough on running a basic instance with both.
Let me know if you have any other arguments for or against running Docker even when doing solo development. I know it feels cumbersome sometimes, but more often than not the initial effort pays off.
Want to get your hands dirty now? This article goes into:
- Setting up Docker locally on your dev box
- Running Django in a Docker container on the same dev box
- Debugging code running on your Docker container!