Docker and the future of deployment automation

From analysts to application developers, if you work in software & web application development, chances are you’ve heard of Docker. If not, you should get familiar with it, because it has gained a great deal of momentum as of late. So much so that it has compelled giants like Google, Amazon & VMware to build support for it into their managed services products. So what is Docker? Fundamentally, it is a tool for running applications inside “containers”, and it provides the ability to package up these containers so that sharing and shipping an application becomes more streamlined and efficient.

Difference between Containers & Virtual Machines

When people are given the definition above, they often compare Docker to Virtual Machines, and rightfully so. Broadly speaking, there are definite similarities. However, there are some fundamental differences which account for the true value that Docker brings to the table. A Virtual Machine is essentially a simulation of a fully functioning computer, operating system and all. Containers, on the other hand, share the operating system kernel of the host machine on which they are deployed. Their package consists mainly of the libraries and application code they need to run. Everything extraneous is left out, and only what is essential to the software the container runs is included. This makes containers small in comparison to VMs, which can often be quite bloated and full of redundancies.

Why Docker?

So why has Docker emerged as the prevalent option for containerizing applications? There are several reasons for this, some of them admittedly subjective. Here are a few that we feel reflect the common sentiment in the open source community as of late.

1. Easy to use

Developers sometimes do not have the level of knowledge or familiarity that system administrators or devops engineers have. This leads to them shying away from dealing with deployments, or anything to do with servers in general. Docker has started to change that line of thinking. Whether you are an application developer or an administrator, creating a Docker container and packaging it to be used or deployed elsewhere is as simple as using npm, pip, composer or any other package manager people are already familiar with. Docker firmly follows the “build once, run anywhere” philosophy, and it shows. All you need to do is create a Dockerfile (which can be thought of as a manifest file, similar to composer.json) consisting of a set of instructions. These let Docker know which packages and libraries need to be available for the container to function (e.g. php, apache, mysql, etc.), along with any required configuration. Docker does the rest. A minimal example is shown below.
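To illustrate, here is a minimal sketch of a Dockerfile for a simple PHP/Apache application. The base image, extension and paths are only examples and would change depending on your stack.

    # Start from the official PHP + Apache image published on Docker Hub
    FROM php:8.2-apache
    # Install the MySQL extension the application needs
    RUN docker-php-ext-install mysqli
    # Copy the application code into Apache's web root
    COPY . /var/www/html/
    # Apache listens on port 80 inside the container
    EXPOSE 80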

2. Fast & Efficient

As we discussed earlier, Docker is extremely lightweight when compared to a Virtual Machine. This makes it easy to share containers and packages across different teams, servers and environments. A Docker container can be packaged and deployed in seconds, compared to the hours it might take to spin up and configure a Virtual Machine, not to mention the re-work that goes into creating new instances of that Virtual Machine later on. With Docker, you can automate the entire process and repeat it on demand.
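For example, building and running the image defined above takes just two commands. The image name “myapp” and the port mapping are illustrative only.

    # Build an image named "myapp" from the Dockerfile in the current directory
    docker build -t myapp .
    # Run it in the background, mapping port 8080 on the host to port 80 in the container
    docker run -d -p 8080:80 myapp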

3. Scalable

One of the biggest benefits of Docker is its ability to scale applications with great ease. The fundamental concept behind Docker is that each container should focus on one process. This means you can define several different containers (for example with Docker Compose), specify what code runs in each of them, and list the libraries, packages or dependencies each container needs. You can also integrate with your hosting provider’s API in order to dynamically spin containers up and down as the need presents itself, ensuring that as demand increases the infrastructure scales up automatically rather than getting bogged down. Similarly, as demand decreases, unnecessary containers can be spun down to avoid extra server costs. A simple multi-container definition is sketched below.
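As a rough sketch, a docker-compose.yml like the one below defines a web container and a database container that can be started, and scaled, together. The service names, image choice and credentials are placeholders.

    # docker-compose.yml
    services:
      web:
        build: .            # build the application image from the Dockerfile
        ports:
          - "80"            # publish port 80 on an ephemeral host port so replicas don't clash
        depends_on:
          - db
      db:
        image: mysql:8.0    # official MySQL image from Docker Hub
        environment:
          MYSQL_ROOT_PASSWORD: example   # placeholder credential for illustration only

Running "docker compose up -d" starts both containers, and "docker compose up -d --scale web=3" would run three copies of the web container from the same definition.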

4. Docker Hub

Docker Hub can be thought of as an “App Store” for Docker in some ways. It is an organized & curated registry of images (libraries, packages and dependencies) which you can reference in your Dockerfiles to define what your container requires. Ratings, statistics & stars help you determine which images are most popular amongst the open source community, and clear markers showing which ones are official versus community supported are a really nice feature to have. All in all, this kind of organization helps a great deal when it comes to on-boarding new people onto Docker.
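For instance, finding and pulling an official image from Docker Hub is a one-liner; “nginx” here is just one example of the many official images available.

    # Search Docker Hub for official and community images
    docker search nginx
    # Pull the official nginx image to the local machine
    docker pull nginx:latest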

Conclusion

While this barely scratches the surface of Docker, its strengths, and why we at Cygnis Media are fully embracing it, we hope it provides a good starting point. There is much more to cover, such as Docker’s pairing with Kubernetes, which in itself opens up a whole new world of possibilities in the realm of deployment automation and scalability. We hope to cover those topics at a future date. In the meantime, we highly suggest trying out Docker and keeping an eye on its developments, as they are coming thick & fast.
