Docker is all the rage of late, and it is not easy to comprehend all the different areas where you could use it. It took me a while to understand how it applies to my everyday role as a developer and, to be frank, I am still learning.
This blog post is a list of use cases that I have found myself applying Docker to. My hope is that it will help you understand Docker better and trigger your own thoughts on where to apply it appropriately. Keep in mind that this is not a solution in search of a problem.
Before we get into the use cases, repeat this statement in your head a few times: “Docker is a shipping container system for applications”. The emphasis is intentional. It is important to understand, at least to some extent, not just what shipping containers are but the efficiency they brought to the transportation of goods.
When it comes to your application (and remember, it can be any application), the shipping container system abstracts away the differences and provides a standard container in which to run it.
Can I run a database server? Yes, you can.
Can I run my web application written in Node.js? Yes, you can.
Can I provide an API stub server while I am still busy writing the real implementation? Yes, you can.
It’s not about your app, and to a large extent it’s not about what is inside your app. It’s about packaging, shipping and delivering your app in a standard way.
These are use cases that I apply Docker to regularly, and I would like to hear yours too. So here we go (in no particular order of importance):
Trying out new software
As a developer, you are always trying out some piece of software or another; that’s what we live for. But it is not always a pleasant experience to set things up after downloading it. Time is of the essence, and sometimes all we want is to fire off a few commands and be done. Docker offers a super simple model for running software: behind the scenes it takes care of pulling the image and running it for you.
It’s not just about new software, either. As an example, say you want to spin up a database server (MySQL) quickly on your laptop, or set up a Redis server for your team. Docker makes this dead simple. Want to run a MySQL server? All you need to do is: docker run -d -p 3306:3306 tutum/mysql
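As a sketch, assuming Docker is installed and the daemon is running (image names and required environment variables vary by image, so check each image’s page on Docker Hub):

```shell
# Start a MySQL server in the background, mapping the standard port.
# The official 'mysql' image expects a root password via an env variable.
docker run -d -p 3306:3306 -e MYSQL_ROOT_PASSWORD=secret mysql

# A Redis server for the team is just as quick (official 'redis' image).
docker run -d -p 6379:6379 redis

# See what is running, and tear it down when you are done.
docker ps
docker stop <container-id> && docker rm <container-id>
```

Within seconds you have a database listening on localhost:3306, with nothing installed on the host itself.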
You could save hours of your time.
Great for Demos
I often find myself giving a demo or two on weekends to some group or another, and the software stack for these demos varies a lot. I am increasingly finding Docker images to be an ideal way to package and demo these applications. This way I stick to one consistent method for packaging and demoing my software. It is also a great way for participants to tweak things a bit and then package the images for others to use.
Avoiding “It works on my machine” syndrome
If you have been developing software for any length of time, you know this syndrome. Actually, I should put it another way: we have all said this to our testing team or fellow developers from time to time. But that’s not the point. The point is: can we have a more reliable, repeatable and standard process, so that setting up and running our software is straightforward? Docker gives you the container format and runtime to make this happen. Give your testing team an image and a few straightforward Docker commands to run your application, and relieve them of complex setup instructions that involve going into some file and tweaking a property or two.
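For instance, a minimal Dockerfile for the Node.js web application mentioned earlier might look like this (file names, port and base-image tag are assumptions; adjust them to your app):

```dockerfile
# Build on the official Node.js base image (pick the tag your app needs).
FROM node:14
WORKDIR /app

# Install dependencies first so Docker can cache this layer.
COPY package*.json ./
RUN npm install

# Copy the application source and declare the port it listens on.
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

Your testing team then needs exactly two commands, `docker build -t myapp .` and `docker run -d -p 3000:3000 myapp`, instead of a page of setup instructions.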
Another area where I find this useful is in a training class. If your intent is to demonstrate an application or two, avoid software setup nightmares on all your participants’ machines by going the Docker way. You will save hours. Granted, this could be done via a VM, but Docker makes it simpler and more lightweight, and hey … you can tell the training participants that they are also going to learn about the hottest topic in the software world. A win-win for everyone!
Learning a bit of Linux/Scripting
This might look like an odd reason, but it represents a great opportunity for folks not familiar with the Linux OS and its scripting tools to get another shot at picking them up. Come to think of it, Linux is important, and I am not going to get into the specifics here. If you come from a Windows background, take my advice and get yourself a Linux VM with any of the cloud providers; my choice of OS there is CoreOS. Working with Docker will really force you to pick up some Linux basics, get comfortable with the command line and, over time, begin to appreciate the OS.
Better use of resources
Compared to VMs, I have found Docker containers lightweight to a large degree, and just that granularity is a win for me. I often run several containers side by side on my laptop to demonstrate the simplicity, granularity and small footprint.
Made for Microservices
If you have been following any tech news of late, you are sure to have heard about microservices, and Docker and microservices play well together. Conceptually, a microservice is a logical piece of your application that can run independently. Once you have figured out those pieces, Docker helps you not only package each one as an image, but also makes it easier for your development team, testing team and possibly even your deployment team to take the images and run your microservices-based app.
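To make this concrete, here is a hypothetical Docker Compose file wiring up two microservices and a database; the service and image names are invented purely for illustration:

```yaml
# docker-compose.yml: one container per microservice, declared in one place.
version: "2"
services:
  orders:                      # hypothetical order-management microservice
    image: mycompany/orders
    ports:
      - "8081:8080"
  inventory:                   # hypothetical inventory microservice
    image: mycompany/inventory
    ports:
      - "8082:8080"
  db:                          # shared MySQL backing store
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
```

A single `docker-compose up` then brings the whole application up, and each team can rebuild and redeploy its own service image independently.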
Porting across Cloud Providers
Most cloud providers of note have voiced their full support for Docker. What does this mean for developers? A good likelihood that you will be able to move your workloads across different cloud providers easily. It also makes the job of moving your application into the cloud easier: why have one way of setting up the software locally and another way of setting it up and running it in the cloud? Docker here, Docker there. It should make the last leg of your workflow, i.e. deployment, simpler and more standardized.
Providing an API Stub Server
APIs are the glue between apps, and you have definitely either used a REST API or, better still, implemented one. The point here is that before work starts on the implementation of a REST API, you would like to define the API contract and publish that document, so that the client side of the equation can start coding against the interface. To take this a step further, the server-side folks typically implement a dummy API stub layer so that fake or sample responses are returned from the server.
While some of you might say that Docker is overkill here and that you could make do with sample.json files, think about a Docker container that stubs out your API layer and that the client team can simply run and hit.
To explain my point better, I will point you to JSON Server, a Docker image that provides REST API mocking based on plain JSON. You get the picture, right?
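As a sketch, the mock data for JSON Server is just a plain JSON file; the resource names below are invented, and the exact `docker run` flags depend on the particular image you pick, so check its documentation:

```json
{
  "users": [
    { "id": 1, "name": "Alice" },
    { "id": 2, "name": "Bob" }
  ],
  "orders": [
    { "id": 1, "userId": 1, "status": "shipped" }
  ]
}
```

Mount this file into the container (something along the lines of `docker run -d -p 80:80 -v $PWD/db.json:/data/db.json <json-server-image>`) and the client team immediately gets working GET/POST endpoints such as /users and /orders, with no server-side code written yet.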
Ripe for Tools/Innovation
I don’t think this is a use case as such, but I thought I would push it in here anyway. Docker is evolving fast, and tools are arriving fast too. Have we seen the last of them? Definitely not. The more you use Docker, and the more you apply it to medium and complex applications, the more likely you are to chance upon a particular void in the ecosystem. Since these are early days, any tools you build to fill that void are likely to get wide traction in the community. They say it is important to be early in the game, and the current timeline in this whole containerization world is exactly that: a great opportunity for everyone.
Your Use Case
This one is not my use case but your own. I would love to learn what other use cases you have applied Docker to. Do share them in the comments.
Some other points
I would like to cover two other points that have greatly helped me in my ongoing journey of learning Docker.
First up is the Docker Hub Registry. This is a public repository of Docker images that you can use today. You should check out the official Docker images, but more than that, look at the top community contributors and the images they are publishing. You are sure to find some gems in there that could save you and your team hours.
If you feel that there is a particular image you need but are not sure whether someone has created one, hop on to the Docker Hub Registry first. As the saying goes: if you think you have an idea (a new Docker image), there is a great likelihood that someone has already implemented it (provided the Docker image).
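You can even do this check without leaving the command line; `docker search` queries the Docker Hub Registry for you (assuming Docker is installed and you have network access):

```shell
# Search Docker Hub for existing Redis images, ranked by stars.
docker search redis

# Once you spot a promising image, pulling it is one command.
docker pull redis
```

Only after coming up empty here is it worth building and publishing your own image.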
The other point is to look at industry news around the IaaS vendors and see how they have been embracing Docker and the ecosystem springing up around it. While I am not privy to what actually goes on inside their meeting rooms, it is a safe bet that containerization technology is what they have been running all along, and it is equally safe to bet on this technology as a way of delivering your applications not just locally, and not just to your cloud vendor of choice, but across cloud vendors in a portable way.
Things are moving towards that … and that can only be a good thing.