What is Docker?
If you’re a software developer of old, you will relate to this problem: the application built in your environment works perfectly fine, but the minute it changes hands and goes to a different one, say to a tester, the app suddenly malfunctions. Debugging it was tricky, as environments come in many varieties, and solving this problem for each of them was a lot of work.

Another problem that software developers of old would nod their heads at was the poor utilisation of server capacity while building an application. A sizeable chunk of the server’s power and storage sat idle, stuck behind the rigid structure of app development.

Veteran developers will already appreciate what Docker offers, and it helps newer developers to understand its roots.

Docker, simply put, is a tool designed to create, deploy and run applications using containers. Containers are lightweight alternatives to virtual machines for developing micro-services. They also provide a smoother and more efficient path through the Software Development Life Cycle (which involves development, testing, staging and production).

Understanding the problems that Docker solves will help you develop a strong sense of Docker and its functions.

Developer’s Problem

What’s the problem?

Software developers build applications. For beginners: an application is a piece of software (lines of code) written to make the computer perform a certain task. As the complexity of the tasks increases, building a suitable application becomes a complex activity too, and it may require various sub-tasks. When these sub-tasks are built together, with each activity intertwined with the others, a malfunction in any one sub-task slows down the entire application, and the process of developing it. Flexibility in the development stage is poor.

Virtual Machine Solution

To tackle this problem, developers started dividing the entire task of building an application into smaller sub-tasks, also called micro-services, and then started building these micro-services separately from one another. For example, to build an application to run an e-commerce website, the work might be divided into an account service, a product catalogue, a cart service and an order service, with each service built independently. By doing so, the development of one micro-service is not affected by the others, which in turn maximises the speed and efficiency of developing the overall application. This style of implementation is called virtualisation.

They achieved this through the use of virtual machines, where the overall server capabilities (disk space, RAM, etc.) are split into multiple segments (each segment is a virtual machine, or VM) and each VM runs a single micro-service (a sub-task).

Burdens of the VM

VMs brought two problems.

One, the division of server capabilities was not flexible. Some micro-services did not require all of the disk space or RAM allotted to their VMs, and this capacity went to waste: VMs are distinct environments and cannot share resources.

Two, VMs by design require an OS installed in each one separately. The machines got heavier, and latencies such as OS boot time crept into the system. This slowed down the development, testing and production phases by a fair bit.

Docker as a solution

Imagine a solution where multiple micro-services run in parallel, flexibly use the resources of the server as needed, and free them up when not needed. This form of implementation is called containerisation, and Docker is one such containerisation solution.

Docker is a software platform that enables running multiple micro-services on a single machine. The micro-services are loaded and run from docker containers, with each container running a single micro-service. All of these containers share a single OS and its capabilities, as they belong to a single host.

So this lets the apps use the host’s resources as needed, with no wastage of server power and space. Furthermore, compared with the one-OS-per-micro-service model of VMs, Docker requires only a single OS to be running. This makes Docker extremely lightweight to use and a dream for anyone who seeks speed and efficiency.
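The resource sharing described above can be sketched with a couple of docker CLI commands. The image names here are hypothetical stand-ins for your own micro-services; the `--memory` and `--cpus` limits are optional and can be tuned (or omitted) per container:

```shell
# Hypothetical example: two micro-services sharing one host OS.
docker run -d --name cart   --memory=256m --cpus="0.5" mycompany/cart-service
docker run -d --name orders --memory=512m --cpus="1.0" mycompany/order-service

# Both containers share the host kernel; inspect their live footprint:
docker stats --no-stream
```

Unlike a VM, a container with no limits set simply draws on the host’s free resources and releases them when it stops, which is where the efficiency gain comes from.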

Software Development Life Cycle Problem

What’s the problem?

As mentioned earlier, when the developers developed their app and sent it out for testing, staging or production, the app used to fail often. One of the primary reasons for these occurrences is the difference in the environments these apps were run in. For example, an application developer would have built the app with a certain dependency which requires the latest version of the software. Now, imagine the tester doesn’t have the updated software on the system and hence the application crashes. Now, that is a bummer.

What this means is that the app developer has to go through all the environments and ensure the necessary software is available, or build an application capable of running on older versions too (a bad idea, as it can hurt the overall performance of the application). Enter Docker.

Docker environment as a solution

Docker brings a consistent structure to the whole Software Development Life Cycle (SDLC), improving flexibility in numerous areas. This is how Docker works:

1. As mentioned previously, a single host OS can run multiple micro-services through docker containers. Imagine a container as a bucket that holds the source code of the micro-service/application along with the dependencies and libraries needed to run it.

2. The developer lists the application’s code, dependencies and requirements in an easy-to-write Dockerfile.

3. Building this file produces a docker image, which contains all the information about the app and its dependencies. By extension, a docker container is a runtime instance of a docker image (that is, what you get when you run the application).

4. These docker images can be stored in a registry service such as Docker Hub. This lets any other parties involved in the application-building process — testing, staging and production — directly download the docker image they want to work on and run it in their own environments (in the form of a docker container).
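The steps above can be sketched concretely. Below is a minimal, hypothetical Dockerfile for a small Python micro-service (the file names and image tag are assumptions, not from the original article); pinning the base image is what makes every environment run the app identically:

```dockerfile
# Hypothetical Dockerfile for a small Python micro-service.
FROM python:3.11-slim          # pinned runtime: same in dev, test and prod
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependencies baked into the image
COPY . .
CMD ["python", "app.py"]       # command the container runs on start
```

The Dockerfile-to-image-to-container flow then maps onto three commands:

```shell
docker build -t myorg/myapp:1.0 .   # Dockerfile -> image
docker push myorg/myapp:1.0         # image -> registry (e.g. Docker Hub)
docker run myorg/myapp:1.0          # image -> running container
```

Anyone in testing, staging or production who pulls `myorg/myapp:1.0` gets the same dependencies the developer built against.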

Because all the dependency information is baked into the image, the application becomes environment-agnostic: any environment that runs the docker image knows exactly what the application needs. The comparatively lightweight images make using the application extremely smooth and allow developers to maintain version control while they build their apps.

As the problem grows, so does the complexity of the application. There may be numerous micro-services (in some cases, 50 or more) to be developed and assembled into the application. In such cases the micro-services are heavily built, and their dependencies and requirements grow with the complexity. Building an app at this scale produces numerous versions, and each of them is added to the git repository.

Now, to ensure that all the environments are aligned and equipped with the necessary dependencies, continuous integration servers, such as Jenkins, keep tabs on the dependencies of the latest version of the app in the git repository. The server then continuously updates all the environments connected to the development of the app (testing, staging, production) and ensures the dependencies are up to date.
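A Jenkins pipeline that keeps environments aligned this way is typically written as a Jenkinsfile checked into the same git repository. The sketch below is a hypothetical minimal pipeline (the image name and stages are assumptions): on every push it rebuilds the image, so testing, staging and production always pull one that carries the latest dependencies.

```groovy
// Hypothetical Jenkinsfile: rebuild and publish the image on every push.
pipeline {
    agent any
    stages {
        stage('Build image') {
            steps { sh 'docker build -t myorg/myapp:${BUILD_NUMBER} .' }
        }
        stage('Push image') {
            steps { sh 'docker push myorg/myapp:${BUILD_NUMBER}' }
        }
    }
}
```

Because the image itself carries the dependencies, the CI server does not need to configure each environment separately; publishing a fresh image is enough.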

Conclusion

To summarise, Docker is a container platform with multiple powerful uses: efficient usage of server power and space; decreased latency and a lightweight build; and a standardised way of keeping all environments on the same footing, which speeds up app development with no loss in effectiveness.
