Things I Learnt From Dockerising My Development
I really like Docker. I’ve been using it in anger for the past couple of months and it’s made deploying to production much easier. Where I really struggled was getting it working for local development.
Now that I’ve got it working with a Node stack, here’s what I learnt.
1) Compose Yourself
Most applications that make sense to run in Docker have many components - the language container for the application, maybe a couple of databases and perhaps a web server to run things through. You CAN write these as one behemoth of a container, but you shouldn’t as you’re getting away from the whole point of Docker. Instead, use Docker Compose.
Firstly, it means you can use the official MySQL or Mongo containers as-is. Docker Compose also has, at least in my experimentation, better volume support. For a start, it knows where it is in the file structure, meaning you can use “.:/path/to/container” rather than the “$PWD:/path/to/container” you’d have to use on the command line.
The other advantage of Compose is that you can have override files which extend the main Compose file per environment.
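To make this concrete, here’s a minimal sketch of what such a setup might look like. The service names, image versions, ports and paths here are my own illustrative choices, not taken from the original project:

```yaml
# docker-compose.yml — a sketch of a Node app with its databases
version: "2"
services:
  app:
    build: .
    volumes:
      - .:/usr/src/app   # relative paths work here, unlike plain `docker run`
    ports:
      - "3000:3000"
    depends_on:
      - mysql
      - mongo
  mysql:
    image: mysql:5.7     # official image — no need to roll your own
    environment:
      MYSQL_ROOT_PASSWORD: example
  mongo:
    image: mongo:3.4     # official image
```

An override file then layers per-environment changes on top. `docker-compose.override.yml` is picked up automatically, so development-only settings can live there:

```yaml
# docker-compose.override.yml — development-only tweaks
services:
  app:
    command: npm run dev
```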
2) Turn Up The Volume
Volumes are great. If you’ve worked in a Vagrant box before, think of these as your shared folders: make a change locally and it’s replicated in the Docker container, and vice versa.
The problem is that volumes make things much slower. Which leads on to the next two points…
3) NPM FTW
Historically, I’ve used Grunt as my task runner, largely because it came out first and I could move my Gruntfile between projects fairly easily. Then I discovered this post by Keith Cirkel a while back and had begun the transition to using npm scripts as my build tool.
The project I’m currently working on initially started out using Grunt as the task runner, but I found that running the tasks in a local Docker container took ages (locally the dev run task would take ~5 seconds to build and run; in Docker it was well over 60 seconds). After much investigation, it turned out that Grunt (v0.4.5) was the thing running slowly. As soon as I switched to an npm script, it was much faster and back within my ~5 second target.
I then discovered that nodemon wouldn’t restart inside Docker. However, using the `--legacy-watch` flag (which makes nodemon poll for changes rather than rely on filesystem events, which don’t always propagate through a volume) solves the issue completely.
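For reference, a sketch of what the relevant npm scripts might look like — the entry-point file name here is a placeholder, not from the original project:

```json
{
  "scripts": {
    "start": "node index.js",
    "dev": "nodemon --legacy-watch index.js"
  }
}
```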
4) Careful What You Depend On
If you’ve used Node for more than 5 minutes, you’ll know how brilliant npm is. But there are a couple of specific problems with it in a Dockerised development environment.
1) Installing node modules will be slow in a volume
When I’ve logged into the Docker container, I want to install my dependencies. But because you’ve got your volume configured, the `npm install` task is slooooooow. And then it’ll fail for no reason. The way around this is to run `npm install` during the `RUN` phase of the Dockerfile.
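A sketch of a Dockerfile that does the install at build time — the base image and paths are illustrative assumptions:

```dockerfile
FROM node:6

WORKDIR /usr/src/app

# Copy only the manifest first so Docker can cache the install layer
COPY package.json .

# Install during the build, not inside the running container
RUN npm install

# Then copy the rest of the source
COPY . .

CMD ["npm", "start"]
```

Copying `package.json` before the rest of the source means the `npm install` layer is only rebuilt when the dependencies actually change.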
Which presents the next problem…
2) Node modules will be destroyed in a volume
I want my node modules installed when the Dockerfile is built, which puts them in the `node_modules` folder. But as soon as I SSH into my Compose box, the `node_modules` folder has disappeared. This is because everything in your `WORKDIR` is replaced by the local files when the volume is mounted. The way I got around this is to move `node_modules` one level up once it’s been installed.
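This works because Node’s module resolution walks up the directory tree: if `node_modules` isn’t found in the working directory, Node looks in its parent. A sketch, with illustrative paths:

```dockerfile
FROM node:6

# Install one level above the app directory...
WORKDIR /usr/src
COPY package.json .
RUN npm install

# ...then make the app directory the workdir. The volume mount
# replaces /usr/src/app but leaves /usr/src/node_modules intact,
# and require() still finds the modules by walking up the tree.
WORKDIR /usr/src/app
COPY . .

CMD ["npm", "start"]
```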
3) You’ll lose node binary files
Let’s say you’ve written an npm script to run your app in development using nodemon. That command references binaries that npm installed locally into `node_modules/.bin`, such as nodemon and bunyan. Because we’ve moved our node_modules one level up, we can’t access these as before. The simple way of solving this is to install them as global packages.
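A sketch of what that looks like in the Dockerfile — globally installed binaries end up on the `PATH`, so it no longer matters where `node_modules` lives:

```dockerfile
FROM node:6

# Install the CLI tools globally so their binaries are on the PATH
RUN npm install -g nodemon bunyan

WORKDIR /usr/src/app
```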
NB. The DBA container is an application that configures the MySQL and Mongo databases using East