January 24, 2019

What is Docker? Why should I care about Docker?

An introduction to Docker as a container system and an alternative to classic virtualization.

What is Docker?

Docker represents a revolution in the field of virtualization. If you're already a virtualization expert, you might want to skip some of the next sections. However, if you are new to this concept, a basic understanding of virtualization will be essential to help you grasp the value and potential of Docker.

Docker is a powerful open-source tool for containerization, an innovation that's revolutionizing the way we build, deploy, and run applications. With Docker, your applications are enclosed in isolated environments called "containers", allowing them to operate independently of the underlying operating system and ensuring consistent behavior, regardless of the environment in which they run.

Docker containers use advanced operating system-level virtualization technologies, including Linux cgroups and namespaces. These tools isolate critical system resources such as memory and processes, enabling the creation of lightweight and portable execution environments. Unlike traditional virtual machines, Docker containers don't require the installation of a full operating system or hypervisor, making them more efficient and streamlined.
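You can see this isolation for yourself. As a minimal sketch (assuming Docker is installed on a Linux host), compare the process list inside a container with the one on the host:

# On the host, ps shows every process running on the machine
ps aux
# Inside a container, the PID namespace hides all of them: this prints
# a process table containing little more than ps itself, running as PID 1
docker run --rm alpine ps

The container isn't running a separate kernel; the host kernel is simply showing it a restricted view of the system.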

One of the main advantages of Docker lies in its ability to overcome compatibility problems between different environments, making it easier to deploy applications on any operating system that supports Docker. Docker also makes it easy to set up development and test environments that closely mirror production, since developers can work with the same containers used in production. This strategic use of containers can help reduce infrastructure costs, since many containers can run on a single machine, whether physical or virtual, eliminating the need for a dedicated machine for each application.

What is virtualization?

Let's begin to unravel the concept of virtualization with a simple metaphor: imagine you own a house and you have a friend who needs a place to stay. You have several possibilities to help your friend:

  1. You could invite your friend to share your bedroom, but that option could get uncomfortable really fast.
  2. You could build a new home for your friend on your property, but that might be too expensive.
  3. You could offer your friend the spare room, keeping your lives separate while sharing some common resources like the kitchen and living room.

The third option represents the essence of virtualization. In computing, virtualization refers to the process of creating a virtual (or simulated) version of a hardware resource, such as a server, storage device, or network.

Now, let's say you want to run a web server on your computer, but you want to keep it separate from your existing operating system and applications. The solution? Virtualization. You can create a virtual machine (VM) on your system, which will host the web server. The VM works like a separate computer, with its own operating system and applications, but uses your computer's hardware resources, such as processor and RAM.

When you start the VM, you will see a completely new OS pop up in a window inside your current OS. This is the equivalent of inviting your friend to stay in your spare room: you share resources (in this case, your computer hardware resources), but maintain separation and independence. This powerful technology makes it possible to make the most of available hardware resources, reducing costs and improving the efficiency and flexibility of computer systems.

What's different about Docker? How is it different from traditional virtualization?

Docker represents an innovative and different approach to virtualization. While a traditional virtual machine encapsulates an entire operating system along with the running application, Docker takes a sharing approach, maximizing the resources that virtualized systems have in common. This strategy lets Docker consume fewer resources at runtime and makes Docker containers easier to deploy, both for developers and in production.

Traditional virtualization, offered by hypervisors such as VMware or Hyper-V, generates a separate execution environment known as a "virtual machine" (VM). In a VM, a complete operating system runs, allowing several operating systems to run concurrently on a single physical machine. However, this approach results in a significant overhead of system resources, as each VM needs its own memory, CPU, and disk space.

Docker, by contrast, employs a technology called containerization to create isolated execution environments, called "containers". These containers share the operating system kernel of the host machine. This means that, unlike VMs, Docker containers don't need an entire operating system to work, but use the shared resources of the host operating system. This allows Docker to create execution environments that are lighter and more portable than VMs. In fact, containers can be moved seamlessly between different physical or virtual machines, without the need for modifications.
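A quick way to verify the shared kernel (again assuming Docker on a Linux host): the kernel version reported from inside a container is the host's own.

# Kernel version on the host
uname -r
# The same command inside an Alpine container prints the same version,
# because the container uses the host's kernel rather than shipping its own
docker run --rm alpine uname -r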

In summary, while traditional virtualization leverages hypervisors to create completely separate virtual machines, Docker leverages containerization technology to generate isolated execution environments that share the kernel of the host operating system. This unique approach to virtualization makes Docker an efficient and flexible solution for developing, testing and deploying applications.

Docker plays a vital role for web developers, providing powerful tools and facilitating many day-to-day operations.

One of the main advantages of Docker is the ease of sharing development environments. If you and I were collaborating on a Node app, for example, we'd want to make sure we both had Node installed and were using the same version, to keep our environments consistent. Version inconsistencies can cause hard-to-find problems, since libraries and our own code may behave differently across Node versions.

A possible solution is for both of us to install the same version of Node, but if we already have other projects on our systems that require different versions, we'd need to install nvm, a tool that lets us switch versions easily. We could then add an .nvmrc file to the project, specifying the version we intend to use. This process, however, is fairly labor-intensive, and even after all these steps we still can't guarantee that every developer's environment is identical.
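For reference, here is a sketch of that nvm workflow (the version number is just an example):

# Pin the project's Node version in an .nvmrc file
echo "11.6.0" > .nvmrc
# Every developer must then install nvm and run, inside the project:
nvm install   # installs the version listed in .nvmrc
nvm use       # switches the current shell to that version

And this only covers Node itself, not databases, system libraries, or anything else the app depends on.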

Docker offers us a solution to these challenges by allowing us to provide the same development environment for all developers. With Docker, the process becomes:

  1. Install Docker.
  2. Write a Dockerfile.
  3. Run docker build -t <image-name> . (the trailing dot is the build context).
  4. Run docker run -p 3000:3000 <image-name>.

This process may not seem much simpler than configuring Node/NVM, but it offers one major advantage: installing Docker is a one-time operation, regardless of the technology stack you intend to use. With Docker, instead of having to install specific software for each stack, you simply write a different Dockerfile (or Docker Compose file, depending on the complexity of your app).

A Dockerfile is a simple text file, with no extension, which defines the configuration of a Docker environment. For example, here's what a Dockerfile for a Node app might look like:

# This Docker image will be based on the Node 11.6 image
FROM node:11.6.0
# Work inside /app in the image
WORKDIR /app
# Install dependencies
COPY package*.json ./
RUN npm install
# Copy the rest of the Node app from the host into the image
COPY . .
# Expose port 3000 and start the app
EXPOSE 3000
CMD npm start

This Dockerfile is for a Node app that listens on port 3000 and is started with the command npm start. By placing it in your project's repository, onboarding new developers becomes simple and 100% consistent: every developer always gets the same environment. In essence, Docker is a powerful tool that makes developers' lives easier, increases efficiency, and fosters consistency across development environments.
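With that Dockerfile at the root of the project, the two commands from the list above become, for a hypothetical image name like my-node-app:

# Build the image from the Dockerfile in the current directory (the trailing dot)
docker build -t my-node-app .
# Run it, mapping port 3000 in the container to port 3000 on the host
docker run -p 3000:3000 my-node-app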

Develop in the same environment as production

Once the app runs in a Docker development environment, you can ship the entire container straight to production. If you think inconsistencies between two developers' machines are a problem, just wait until you write code that works on your machine, only to discover that it doesn't work in production. It's extremely frustrating.

You have tons of options for deploying Docker containers to production.

I like Heroku's approach because it's the only one that lets you simply push your project with a Dockerfile and have it run. Others require extra steps, such as pushing the Docker image to a registry first. The extra steps aren't the end of the world, but they're not strictly necessary.
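As a sketch of how that Dockerfile-based flow works on Heroku (using Heroku's container stack; details may vary with your setup), you commit a small heroku.yml manifest next to the Dockerfile:

# heroku.yml - tells Heroku to build the web process from your Dockerfile
build:
  docker:
    web: Dockerfile

After that, switching the app to the container stack with heroku stack:set container and pushing with git deploys the container.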

What about more complex apps?

Because of Docker's philosophy (one process per container), most apps will require multiple containers. For example, a WordPress site would consist of a container for the web server running PHP and a container for the MySQL database. That means you need a way for the containers to talk to each other. This is called container orchestration.

If you can run all of your containers on a single host, Docker Compose will probably meet your orchestration needs. It's included when you install Docker and it's easy to learn. It lets you launch multiple containers at the same time and networks them together so they can talk to each other. This is the fastest and easiest way to orchestrate multiple containers.
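As a sketch of the WordPress example from above (image tags and credentials are illustrative placeholders), a docker-compose.yml for the two containers might look like this:

# docker-compose.yml - one container for WordPress/PHP, one for MySQL
version: "3"
services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example-root-password
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: example-password
  wordpress:
    image: wordpress
    ports:
      - "8080:80"
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: example-password
      WORDPRESS_DB_NAME: wordpress
    depends_on:
      - db

Running docker-compose up starts both containers on a shared network, where the WordPress container can reach the database simply at the hostname db.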

If you have to orchestrate containers spread across multiple hosts, Kubernetes is the prevailing solution. Many hosts that support Docker deployments offer Kubernetes for orchestration.
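To give a flavor of what that looks like, here is a minimal sketch of a Kubernetes Deployment (reusing the hypothetical my-node-app image from earlier) that runs three replicas of a container:

# deployment.yaml - declares the desired state; Kubernetes keeps 3 replicas running
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-node-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-node-app
  template:
    metadata:
      labels:
        app: my-node-app
    spec:
      containers:
        - name: my-node-app
          image: my-node-app:latest
          ports:
            - containerPort: 3000

You apply it with kubectl apply -f deployment.yaml, and Kubernetes schedules the containers across the hosts in the cluster.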

Quick benefits of understanding Docker

It might not seem relevant right now, but keep this in mind for the first time you run into a problem caused by differences between development environments; you won't want it to happen again. By learning to use Docker, you will be able to ensure a consistent environment for your application, regardless of where it runs or who is managing it. That means consistent, reliable results that you, your clients, and your employers can count on.

Understanding Docker offers a number of immediate benefits or, as we call them in the business world, “quick wins”. Let's look at some of the more significant ones:

  1. Consistent development environment: Docker allows you to easily create and deploy uniform development environments across different machines and different platforms. This significantly reduces problems related to differences in development environments, a common problem in many development teams.
  2. Application portability: With Docker, your applications can be easily moved from one environment to another without compatibility issues. This means your applications can be developed locally, tested on a staging environment, then moved to production without any changes to the environment.
  3. Application isolation: Docker allows you to run applications in isolated containers, ensuring they don't interfere with each other. This can be especially useful when working with applications that require different versions of the same dependencies.
  4. Resource efficiency: Docker containers are remarkably resource-efficient, using only the resources needed to run the application they contain. This can lead to greater overall efficiency, particularly when working with resource-constrained machines.
  5. Replicability: With Docker, application building and deployment processes are fully automated and replicable. This means that each team member can run the application in exactly the same way, eliminating the issues of differences in local configurations.

In short, understanding and using Docker can lead to a number of immediate benefits, making it a valuable tool for any developer.
