Misadventures with Docker

By sujal on December 18, 2015 — 4 mins read

The first project to talk about, appropriately, is how I set up the hosting for this blog. It runs WordPress and is deployed at Digital Ocean1 using Docker.

This was my first time using Docker for a production deployment. In fact, trying Docker ‘for real’ was the reason I decided to create this blog.

Why Docker?

I’m not going to do a complete Docker 101 here. Their site does a good job of that, and there are a lot of articles out there, both from “old school” companies like Dell or Microsoft and the new heavyweights like Netflix or Spotify.

For me, I’m interested in this for two reasons:

  1. Development environment consistency: eliminating “it works on my box” and speeding up developer on-boarding
  2. Production repeatability: I like the notion of immutable infrastructure or phoenix servers

Our build engineering and systems engineering teams at work2 are also looking into these same things, so that’s a nice bonus too.

By packaging everything up in a container and running Linux locally for development, it should be much easier to bring on new developers, and to trust that code that runs in development will test consistently and deploy predictably in production.

I admit, there are a zillion ways to run WordPress more cheaply or more simply than what I’ve done here. Of course, I wouldn’t have learned anything, but I’d have been done with the install in minutes.

That said, WordPress presented a couple of interesting challenges. A typical WordPress install consists of two sets of code that are mixed: 1) the base WordPress install and 2) your themes and plugins. On top of that, there are data directories that can’t be deleted or moved (e.g. file uploads). To be honest, I still don’t have a clean solution for this.

The Solution (so far)

So, here’s how I’m managing this. I’m using the standard Docker Toolbox locally, and Docker Compose to manage the environment in both development and production.

First up is my development compose file:
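The original file isn’t reproduced here, but a minimal sketch of such a development compose file might look like this (Compose v1 format; image tags, service names, ports, and the theme path are all placeholders):

```yaml
# docker-compose.yml -- development (names and paths are illustrative)
wordpress:
  image: wordpress            # stock image from Docker Hub
  links:
    - db:mysql
  ports:
    - "8080:80"
  volumes:
    # map only what is actively being developed; mapping more broadly
    # led to the permission issues described under Gotchas below
    - ./wp-content/themes/my-theme:/var/www/html/wp-content/themes/my-theme

db:
  image: mariadb
  env_file: .env              # DB credentials kept out of the compose file
```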

That uses a stock WordPress image and adds my local environment as the WordPress installation.

To build for production, I have a different compose file (specified with the -f argument):
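Again as a hedged sketch rather than the original file: the production variant builds a local image instead of mounting code from the host, and pulls its persistent state from a data-only container (names illustrative):

```yaml
# docker-compose-production.yml (illustrative)
wordpress:
  build: .                    # build the Dockerfile below instead of mounting code
  links:
    - db:mysql
  restart: always
  volumes_from:
    - data                    # uploads, database files, and certs survive redeploys
  env_file: .env
```

Invoked with the `-f` flag, e.g. `docker-compose -f docker-compose-production.yml up -d`.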

There are a few elements common to both environments, which I’m including here for completeness:
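A sketch of what those shared pieces might look like: a busybox-based data-only container holding the writable state, plus the database service (the volume paths and names are assumptions):

```yaml
# common.yml -- pieces shared by both environments (illustrative)
data:
  image: busybox              # data-only container; never runs, just owns volumes
  volumes:
    - /var/www/html/wp-content/uploads   # user uploads -- must not be clobbered
    - /var/lib/mysql                     # database files
    - /etc/nginx/certs                   # SSL certs

db:
  image: mariadb
  env_file: .env
  volumes_from:
    - data
```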

You can see how the data-only container is set up, for example, including the SSL certs.

As you can see, for production I build an image rather than using and modifying the stock image on deploy. Here is the Dockerfile that specifies how to build the image that will get pushed:
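A sketch of such a Dockerfile, assuming the stock wordpress base image; both theme directory names are placeholders:

```dockerfile
# Dockerfile (theme names are placeholders)
FROM wordpress

# Bake the "code" into the image: the custom child theme and its
# commercial parent theme. Everything else (plugins, uploads, settings)
# is managed through the WordPress dashboard on the running site.
COPY wp-content/themes/parent-theme /var/www/html/wp-content/themes/parent-theme
COPY wp-content/themes/my-theme     /var/www/html/wp-content/themes/my-theme
```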

As you see, my local development setup overrides the theme with my working copy. The line I’ve drawn is simple: my custom theme and the parent theme are my “code,” and everything else is maintained using WordPress’s dashboard.

I currently use this script to try to avoid mistakes pushing to production.
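The script itself isn’t shown here, but the guard-rail idea can be sketched like this (the machine name `production` and the compose file name are assumptions):

```shell
#!/bin/sh
# deploy.sh -- guard rails for pushing to production (sketch; the machine
# name and compose file name are placeholders)
set -e

# docker-machine env exports DOCKER_MACHINE_NAME; refuse to deploy unless
# the shell is pointed at the production host rather than the local VM
if [ "$DOCKER_MACHINE_NAME" != "production" ]; then
  echo "Not pointed at production (run: eval \$(docker-machine env production))" >&2
  exit 1
fi

docker-compose -f docker-compose-production.yml build
docker-compose -f docker-compose-production.yml up -d
```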

It isn’t perfect. I’m now considering being very precise about date/version checks and only uploading my theme files if they’re newer (especially important for the parent theme, which is commercial and will get updated via WordPress’s internal updater).
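That date-check idea can be sketched in plain shell: copy a theme file only when the destination copy is missing or older, so a parent theme that was updated on the server isn’t clobbered. The directories below are temporary stand-ins for the real theme paths:

```shell
#!/bin/sh
# Sketch of "only upload if newer": skip any file whose copy at the
# destination is newer, so a parent theme updated on the server via
# WordPress's internal updater won't be overwritten. Paths are stand-ins.

src=$(mktemp -d)   # pretend: my local theme directory
dst=$(mktemp -d)   # pretend: the theme directory on the server

echo "local edit"    > "$src/style.css"       # new local file, should be copied
echo "old local"     > "$src/functions.php"   # stale local copy
echo "server update" > "$dst/functions.php"
touch -t 203001010000 "$dst/functions.php"    # server copy is newer

for f in "$src"/*; do
  name=$(basename "$f")
  # copy only if the destination file is missing or older than ours
  if [ ! -e "$dst/$name" ] || [ "$f" -nt "$dst/$name" ]; then
    cp "$f" "$dst/$name"
  fi
done

cat "$dst/functions.php"   # still "server update": not overwritten
cat "$dst/style.css"       # "local edit": copied over
```

The same semantics are available out of the box as `rsync --update`, which skips any file that is newer on the receiving side.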

Gotchas

There were a few issues with this setup that will need further exploration.

First, WordPress mixes system files with user files in some interesting ways. For example, to configure the theme I need to upload assets, so I have to make sure I don’t overwrite or damage the uploads directory in the production install. Relatedly, I’d rather only update the theme files, but then I need to figure out how to install the plugins first.

Second, uploading files is a little tricky in local development. This is why I’ve only mapped the things I’m actively developing to the local file system. When I mapped things more broadly3, I ran into problems: any time I mapped a folder that WordPress writes to in production, I ended up with weird permission issues.

Some folks suggest moving to a different VM instead of the boot2docker VM that Docker Toolbox ships with. I still haven’t done that.

Third, I used Docker Machine to work locally AND to deploy to production, but I work on two different laptops. There’s currently no built-in way to move the Machine setup from one computer to the other short of manually copying files from ~/.docker/machine. Since that data includes certs, there’s no good way to recreate it, and the Digital Ocean driver can only set up new Docker hosts, not attach to an existing one.

Other notes

I added SSL support to this blog, and tweaked a few other things. You can see the pattern I chose, which was based on a few Stack Overflow answers like this one.
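The general shape of that pattern is an nginx container sitting in front of WordPress to terminate SSL, with the certs mounted from the data-only container. A sketch, with all names illustrative:

```yaml
# nginx service added in front of WordPress to terminate SSL (illustrative)
nginx:
  image: nginx
  ports:
    - "80:80"
    - "443:443"
  links:
    - wordpress
  volumes_from:
    - data                    # certs live in the data-only container
  volumes:
    - ./nginx.conf:/etc/nginx/conf.d/default.conf
```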

I really love the local development experience, file system permission issues notwithstanding. I’m not 100% sold on production. There are a number of challenges that Docker still needs to solve with that. Still, I’m glad I tried this out. I’ll continue to refine this work as I learn more.


  1. Referral Link

  2. Come work for me in Bangalore or join our teams around the US and the world!

  3. For example, I initially thought I could use the local image to validate new WordPress versions by using my own local copy of WordPress in the development image. That ended up being problematic and unlike how I would deploy, so I decided against that.

Posted in: Devops