Docker has become a vital tool for streamlining the software development process, offering a seamless way to package applications and their dependencies. But things get interesting with Docker in Docker. It may sound a bit technical, but it can be a game-changer in certain scenarios. In this blog, we will dive into why and how you can run Docker in Docker, and walk through three tried and tested methods.
Why Run Docker in Docker
Let’s start with what Docker is. It is an open-source tool that packages your application and its dependencies into a container, ensuring smooth deployment of the application.
Running Docker in Docker can be a huge game changer. Here are the main reasons to do it:
CI/CD Pipelines: Running Docker in Docker is beneficial with CI/CD systems like GitLab CI, Jenkins, or GitHub Actions, where you often need to build and push Docker images as part of your pipelines. Docker in Docker lets you do that without installing Docker on every CI/CD agent or using separate virtual machines (see the sketch after this list).
Portability: A Docker in Docker setup is highly portable and easy to share, since the whole environment, including the Docker engine itself, ships as a container, making the Dockerized app self-contained.
Sandboxed Environments: A Docker in Docker container works well as a sandbox when you want to experiment with Docker commands and run tests without changing the host system. This way, you separate your experiments from your actual environment and avoid errors and conflicts.
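As an illustration of the CI/CD use case, here is a minimal sketch of a GitLab CI job that builds and pushes an image using the docker:dind service, following GitLab’s documented Docker-in-Docker setup. The job name, image tags, and registry URL (registry.example.com/myapp) are illustrative, and the details depend on your runner configuration:

build-image:
  image: docker:24.0
  services:
    - docker:24.0-dind
  variables:
    # point the Docker CLI at the dind service and enable TLS
    DOCKER_HOST: tcp://docker:2376
    DOCKER_TLS_CERTDIR: "/certs"
    DOCKER_TLS_VERIFY: 1
    DOCKER_CERT_PATH: "$DOCKER_TLS_CERTDIR/client"
  script:
    - docker build -t registry.example.com/myapp:latest .
    - docker push registry.example.com/myapp:latest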
How to run Docker in Docker
According to Stack Overflow’s developer survey, Docker has become a vital tool for developers, with 69% of professional developers using it. There are essentially three methods to run Docker in Docker:
Mount the Host’s Docker Socket
In this method, we mount the host’s Docker socket into the container so that the Docker CLI inside the container can communicate with the host’s Docker daemon and execute commands against it. The first step is to run a container with the “-v /var/run/docker.sock:/var/run/docker.sock” option. The command is as follows:
docker run -v /var/run/docker.sock:/var/run/docker.sock -ti docker:18.06
This command runs a container from the official Docker image on Docker Hub, which comes with the Docker CLI preinstalled. Any other image with a Docker binary will also work; otherwise, you will have to install it yourself. You can get a shell inside the container with the following command:
docker exec -it <CONTAINER_ID> /bin/sh
Now that you are inside the container, you can use any Docker command. For example, you can start an Nginx container with the following command:
docker container run -d --name=sock-nginx nginx
To verify if the container has been created, you can use the following command:
docker ps
After running the above command, you should see both the outer container and sock-nginx in the list, since the inner CLI talks to the same daemon as the host.
To check the version of the Docker binary installed, use the following command:
docker --version
You will see a version string matching the image tag used above (for example, Docker version 18.06.x-ce).
This method is easy to use, as it needs no additional privileges or configuration. However, it gives the container full access to the host’s Docker daemon, which can be a significant security risk. It can also cause confusion, since containers created by the inner Docker CLI actually run on the host, side by side with the outer container, making them prone to naming and port conflicts.
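You can see this sharing for yourself. Running the following on the host (outside the container) should list the sock-nginx container you started from inside, because both CLIs talk to the same daemon:

docker ps --filter name=sock-nginx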
Docker in Docker with DinD
In the second method, we use the dind image, which stands for “Docker in Docker”. This image ships with its own Docker daemon, which runs independently of the host’s daemon, so the output of docker ps inside the container can differ from that of docker ps on the host. For Docker in Docker with dind, you have to run the container with the --privileged flag.
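A typical invocation, adapted from the official docker:dind usage notes (the container name dind-test and the certificate volume names are illustrative), looks like this:

docker run --privileged -d --name dind-test \
  -e DOCKER_TLS_CERTDIR=/certs \
  -v dind-certs-ca:/certs/ca \
  -v dind-certs-client:/certs/client \
  docker:dind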
This command runs a container from the dind image and sets up the environment variables and volumes used for TLS authentication. You can get a shell inside the container with the following command:
docker exec -it dind-test /bin/sh
Once logged in, you can run any Docker command you wish. This method is beneficial because it keeps the inner and outer Docker daemons and their containers separate, avoiding the confusion and conflicts that the socket-mounting method can cause. However, like that method, it gives the container extensive access to the host system via the --privileged flag. It can also run into issues with Linux Security Modules such as AppArmor and SELinux, and with nested storage drivers.
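To see the isolation in action, start a container from inside dind-test and compare the two daemons’ views (a quick sketch; the container name is illustrative):

# inside the dind-test container
docker container run -d --name=dind-nginx nginx
docker ps   # lists dind-nginx

# on the host, the inner container is invisible
docker ps --filter name=dind-nginx   # returns nothing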
Nestybox Sysbox Docker runtime
Nestybox Sysbox is an alternative container runtime that lets a container act like a lightweight virtual machine. With Sysbox, a container can run software like Kubernetes, Docker, systemd, etc. without needing privileged mode or special configurations. The first step for this method is to install Sysbox on the host system, following Nestybox’s installation documentation.
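Once installed, you can check that the Docker daemon has registered the new runtime (a quick sanity check; the exact output format varies by Docker version):

docker info | grep -i runtimes

If sysbox-runc appears in the list, run the container with the --runtime=sysbox-runc flag. For instance: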
docker run --runtime=sysbox-runc --name sysbox-docker -d docker:dind
This command starts a docker:dind container under the Sysbox runtime. You can get a shell inside the container using the following command:
docker exec -it sysbox-docker /bin/sh
Once you are logged in to the container, you can run any Docker command. You can verify that the container works with the following command:
docker container run -d --name=sysbox-nginx nginx
Next, run the following command:
docker ps
The output should list the sysbox-nginx container, confirming that the inner Docker daemon is managing its own containers.
This is an agile and secure method to run Docker in Docker without utilising privileged mode or special configurations. However, it can be time-consuming and lengthy, as it requires installing and configuring a new runtime on the host system.
Conclusion
Wrapping up, Docker in Docker is a clever trick that solves specific challenges in the world of software development. Whether it’s building images in CI/CD pipelines, packaging apps for portability, or setting up an isolated test environment, Docker in Docker has a role to play. The three methods we explored offer different paths to achieve this: leveraging the host’s Docker socket, diving into Docker in Docker with dind, or using the Nestybox Sysbox runtime. In this exploration, we saw how Docker in Docker can push the boundaries of how we build, deploy, and manage applications. Setting up and installing these systems can be a tedious task, so you can always rely on the best DevOps consulting in Toronto to help you get started instead of spending time on installation and setup.