Modern businesses run on software. An organization relies on software applications for most of its activities, from accounting and HR to product development, services, research and analytics. Software is deployed in various ways, and web applications are among the most common. As organizations grow, computing power increases, the Internet becomes ubiquitous and bandwidth ceases to be a constraint, software applications become more and more complex. What was earlier done by just a couple of people now needs an entire team dedicated to developing and supporting an application. To add to the complexity, team members often do not even operate from the same geographical location: the 'work from home' culture in software, accelerated in part by the Covid pandemic, allows people to develop code from different places.

The way software is developed adds a further complication. To expedite delivery, most programmers rely on a framework that accelerates their work. All of these developments have made software deployment a challenging task. The IT department cannot always be sure which version of a particular OS the language or framework will support, and what specifications will be needed to make the entire application run smoothly once it is integrated. This uncertainty invariably meant erring on the higher side as far as server configuration was concerned, resulting in unnecessary overheads. The concept of a container was introduced to mitigate these configuration and compatibility issues. Since Docker is a platform for containers, we first need to understand what a container is.
The Concept of a Container
So, what exactly is a container in software parlance? Well, it is exactly what it sounds like! Think about a container that people use to transport goods from one place to another. The entire environment needed to make the transit safe is incorporated in the package itself: if the contents are fragile, we bubble-wrap them; if they are perishable, we add preservatives so they do not oxidize in transit; and so on. Applying the same analogy to software, containers are nothing but standardized packages of code. Instead of shipping a full operating system (OS), developers simply pack their code and its dependencies into an image that can then run anywhere. Essentially, then, a container is a packaged bundle comprising an entire runtime environment: the application and all its dependencies, libraries and other binaries, and the configuration files needed to run it. Since it is a self-contained, self-sufficient unit, differences in OS distributions and underlying infrastructure are abstracted away (just as functions in coding abstract away implementation details). And because containers are usually quite small, you can pack many of them onto a single computer. By prepping software code in ready-made containers, the code can quickly be moved around to run on servers, or even distributed as an app in the cloud.
Docker is an open platform for developing, shipping, and running applications. The name itself is another term for a longshoreman, a person employed by a port to load and unload ships.
As the name implies, Docker is software that runs on Linux and Windows. It is a tool designed to make it easier to create, deploy, and run applications using containers. Docker started its life at a platform-as-a-service (PaaS) provider called dotCloud. Docker was originally built on Linux containers, introduced in 2008, which allow containerized applications to share a common OS kernel, eliminating the need for each instance to run its own separate system. Designed with developers in mind, Docker allows them to focus on developing on their platform of choice, without having to worry about the OS the application will eventually run on. It permits developers to run an end-to-end workflow without having to get into services they don't understand. In other words, it helps them stay focused on their coding task. Additionally, running Docker containers is easy on computer memory, so multiple containers running multiple services create very low overhead. With Docker, it is possible to deploy an application in a jiffy. Docker was released as an open source project in 2013 by dotCloud, Inc., a San Francisco-based technology startup.
The Docker platform provides tools to develop web applications and their supporting components using containers. The container becomes the unit for distributing and testing the app, which can be deployed as a standalone container or as an orchestrated service. This works the same whether your production environment is a local data center, a cloud provider, or a hybrid of the two. Docker can be used for a wide range of workloads, especially in the developing, testing and building stages. Because Docker makes workflow easier for a developer, it is an ideal container technology to use in cloud computing.
Advantages and Disadvantages of Using Containers
- Portability: The first and foremost advantage of using Docker containers is that they can run anywhere, as long as the container engine supports the underlying operating system, be it Linux, Windows, macOS, or anything else. They can be deployed on a single machine, or in the cloud.
- Resource efficiency: Docker containers offer both efficiency of resources and flexibility of usage, as they carry all the files they need with them.
- Easy on resources: Containers do not have to contend for resources such as port numbers, because each container has its own network interfaces. They do not need separate kernels, and require less processing power.
- Performance: Containers possess the ability to deliver a consistent, standardized and abstracted storage environment
- Testing and bug tracking: Both are less complicated with containers, because there is no difference between running your application on a test server and in production.
- Security: This is a concern, as containers share the kernel and other components of the host OS. This means that containers are less isolated from each other.
- Maintenance: This can be cumbersome: deploying containers in a sufficiently isolated way while maintaining an adequate network connection can be tricky for the uninitiated.
Docker Platform: Docker provides tools to develop your application and its supporting components using containers. The container becomes the unit for distributing and testing the app, which can be deployed as a container or an orchestrated service in your production environment
Image: A Docker image is an immutable bundle of all the dependencies and configurations that an application needs to run successfully. The image is the package that runs inside a container.
Container: A Docker container is a lightweight instance of a Docker image. It is a running process that has been isolated using namespaces and uses the image for its root file system.
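The image/container relationship can be seen directly from the Docker CLI. Below is a minimal sketch of a container's lifecycle, assuming Docker is installed; the public `nginx` image and the name `web` are purely illustrative choices:

```shell
# Create and start a container from an image, detached,
# publishing container port 80 on host port 8080
docker run -d --name web -p 8080:80 nginx

# List running containers: each one is an isolated process
docker ps

# Stop and delete the container; the image itself is untouched
docker stop web
docker rm web
```

Running `docker run` again afterwards starts a fresh container from the same immutable image, which is what makes container deployments so repeatable.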
Dockerfile: A Dockerfile is a text file that contains the instructions needed to build a Docker image.
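To make this concrete, here is a hedged example of a small Dockerfile for a Python web service. The file names, base image and port are illustrative assumptions, not prescriptions:

```dockerfile
# Start from an official base image (here, a slim Python runtime)
FROM python:3.11-slim

# Set the working directory inside the image
WORKDIR /app

# Copy and install dependencies first, so this layer is
# cached between builds when only application code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code into the image
COPY . .

# Document the port the application listens on
EXPOSE 8000

# Command to run when a container starts from this image
CMD ["python", "app.py"]
```

From the directory containing this file, `docker build -t myapp .` would turn it into an image, and `docker run -p 8000:8000 myapp` would start a container from that image.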
Docker Engine: When most technologists talk about Docker, they are referring to the Docker Engine. The Docker Engine is the infrastructure plumbing software that runs and orchestrates containers. It is the core container runtime that runs containers, and into which all other Docker and third-party products plug in.
Docker is used by many major companies for the benefits it provides. It comes in two flavours: a Community Edition and an Enterprise Edition.
Docker works well with Kubernetes [link to the ‘Kubernetes’ page], which we have covered separately.