What is Sitecore Docker? Why should you care about it?
In software development you’re pretty much guaranteed to hit a tweet or blog post mentioning “Docker” or “Kubernetes” and the magic they provide. It’s not always clear what these things are, or why they’re relevant to Sitecore developers. A talk at SUGCON in London recently discussed the basics of Docker and Sitecore. The fundamental question of “what is Docker” is still an issue for a lot of people.
Unless you’re one of those crazy eccentrics who prints out blog posts, you’re reading this on a computer. It’s made up of some physical hardware like a processor, memory, disk drives and a screen. Those all come from different manufacturers, so the operating system you’re running provides a handy abstraction of them so that the programs you run don’t need to know whether you’re running the latest whizz-bang graphics card or not. The operating system manages the memory and gives access to the hardware devices – which lets you alt-tab between Facebook and your real work easily.
For many years that model was fine. But then people started to come up with reasons to run more than one operating system on a single computer. Sometimes it’s because we want a way to run a Windows application on a Mac. Sometimes it’s because we have one big computer, but want it to run a number of smaller programs for us – like providing “infrastructure as a service” in a data centre. And sometimes it’s for security – so that dodgy app you downloaded off the Internet can’t actually affect your “real” computer. All these scenarios can be lumped together as “virtualisation”: some special software provides an abstraction that allows multiple operating systems to run at the same time, giving those guest operating systems “virtual” hardware to run on – hence the name.
And this model has been really helpful for years too. But it has one significant disadvantage: it’s very resource hungry. If you want to run a second copy of Windows 10 on your laptop using virtualisation, you need enough CPU power, disk space and RAM for complete copies of both operating systems. Both copies of Windows are in memory at the same time, as well as your applications – and both copies want to own a big chunk of memory and must share your CPU. Plus the disk files that store the virtualised hard drive data for the guest operating system tend to be very big too – which makes them harder to move around your network.
Now disks and RAM are surprisingly cheap these days, but for many scenarios all that duplication quickly mounts up, slows things down, and gets in the way.
The scenario where you want to run two entirely different operating systems is difficult to optimise – but someone clever realised that the “I just want to run applications inside a virtual box” scenario can be made much better. You can use similar technology to let your operating system share its resources with programs inside a virtual box, without duplicating the whole operating system. And that’s where Docker comes in. It started off on Linux as a way to run programs in isolated boxes, which are referred to as containers. (Hence the name – think of dock workers unloading one of those giant modern cargo ships, with hundreds of big metal containers on the deck.)
The programs in the containers are kept separate from each other, so they can be started and stopped on demand, and a malicious program in a container is trapped there by the security model. The container image holding all the bits of the virtualised application is stored as disk files – so it’s easy to move about between machines. But containers are also more lightweight, because they include just the code they need to run, not an entire operating system. That makes it much easier to build new containers, copy them about your network, and run a number of them on one machine – because they’re small compared to old-style virtual machines.
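To make that “just the code it needs” idea concrete, a container image is usually described by a short “Dockerfile”. This is a hypothetical sketch – the base image tag, folder and file names here are illustrative, not from the original post – showing how an image layers only your application on top of a shared runtime base:

```dockerfile
# Hypothetical example: package a published ASP.NET site into an image.
# The base image supplies the runtime; only your app's files get added on top.
FROM mcr.microsoft.com/dotnet/aspnet:8.0
WORKDIR /app
COPY ./publish/ .
ENTRYPOINT ["dotnet", "MySite.dll"]
```

Because the base runtime layer is shared between every image built from it, each site’s image only really “costs” the size of its own files – which is why containers stay small compared to full virtual machine disks.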
And the ease of starting them up and copying them about makes them ideal for cloud services. If you can put your website into a Docker container, and it starts getting heavy load it’s easy to copy the container to a new server and spin up more instances of your site’s server(s) to help cope with the load. And you can take them down again afterwards easily too – it can behave a bit like the auto-scaling in Platform-as-a-Service deployments.
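As a rough illustration of that scaling, Docker Compose lets you declare how many copies of a service to run – the service and image names below are placeholders, not anything from the original post:

```yaml
# Hypothetical compose file - "web" and the image name are placeholders.
services:
  web:
    image: mysite:latest
    deploy:
      replicas: 3   # ask for three copies of the site's container
# Or scale an already-defined service when starting it:
#   docker compose up --scale web=3
```

Dropping back down afterwards is just the same command with a smaller number – which is what gives you that Platform-as-a-Service-style elasticity.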
And as the IT world started to see the benefits of using Docker, Microsoft and other big software companies have started to build their own services. Microsoft support Docker on Windows now, and you can run these containers locally or in the Azure cloud. And other services built on a Docker style approach have grown up – such as Kubernetes, which provides tools to manage many containers at a time across your servers.
So why should you care? Well, there are probably two key reasons: deployment and developers.
Hopefully the description above of Docker and its related services makes you think that hosting websites under Docker can have benefits. Being able to scale your site by running more containers, while using fewer server resources, is good on its own. And the idea that you can push containers up to Azure (or other cloud services) to have the same thing hosted externally could be useful too.
But the thing that excites me most about Docker is a benefit for developers. If you can parcel up a website inside a container, it makes it really clean and easy to switch your development environment between clients.
When you work in agency or freelance roles, it’s not uncommon to end up with many instances of Sitecore installed on your computer. And when you have multiple instances of Sitecore, you probably end up with multiple versions of Solr and xConnect installed too. Managing those, to ensure the right stuff is running for whatever project you need to work on, can be a bit of a pain – and getting set up initially involves a fair bit of work too.
But with Docker, you just need containers for each project. When you start a project you can create the right Solr, xConnect and Sitecore containers for the project’s requirements. And then when you need to work on it, you just get Docker to start the resources for that site. That’s much tidier for people with lots of projects, and it means when a new developer joins your team, getting set up can be as simple as “pull down the right containers and start them”.
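As a sketch of what that per-project setup might look like, a Compose file could declare the whole set of services together. All the image names below are placeholders – Sitecore didn’t ship official images, so in practice you’d build your own or use community ones:

```yaml
# Hypothetical per-project compose file. Every image name is a
# placeholder - build these yourself or use community images.
services:
  solr:
    image: my-registry/project-solr:6.6
    ports:
      - "8983:8983"
  xconnect:
    image: my-registry/project-xconnect:9.0
    depends_on:
      - solr
  sitecore:
    image: my-registry/project-sitecore:9.0
    ports:
      - "80:80"
    depends_on:
      - xconnect
```

With a file like this checked in alongside each project, `docker compose up -d` brings up the whole environment for that client, and `docker compose down` tears it back down when you switch to another one.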
Sadly we’re not quite there yet though. While the community has done some sterling work in getting Sitecore to run in Docker for Windows, it’s not officially supported by Sitecore yet. But based on what’s been said at conferences and in the community, work is being done on this – so hopefully it’s just a matter of time before this model becomes common.