When it comes to developing and deploying applications, developers often face challenges like compatibility issues, dependency conflicts, and environment inconsistencies. These challenges can slow down the development process, increase the risk of errors, and reduce the quality of the product. Docker is a tool that can help overcome these issues; however, developers first need to install it on their systems. So, how do you Install Docker on Ubuntu, and what are the steps involved?
In this blog, you will learn How to Install Docker on Ubuntu with step-by-step instructions. Let's dive in!
Table of Contents
1) Docker and its prerequisites on Ubuntu
2) Installing Docker on Ubuntu 20.04
3) Working with Docker
4) Docker Compose: Simplifying multi-container applications
5) Uninstalling Docker
6) Conclusion
Docker and its prerequisites on Ubuntu
Docker has become a go-to solution for developers seeking to package their applications and run them in a consistent environment across various platforms. This containerisation technology ensures smoother deployment, scaling, and management of applications, making it a vital asset for modern Software Development.
Before diving into the installation process, ensure you have the following prerequisites in place:
1) A system running Ubuntu 20.04 with administrative privileges.
2) Access to the internet to download and install the required packages.
Installing Docker on Ubuntu 20.04
Let's get started with the installation of Docker on your Ubuntu 20.04 system. Follow these steps carefully:
Updating the package index
First, let's update the package index to ensure we install the latest versions of software packages:
sudo apt update
Installing dependencies
Next, install the necessary dependencies to allow ‘apt’ to use packages over HTTPS:
sudo apt install apt-transport-https ca-certificates curl software-properties-common
Adding Docker repository
To install Docker, we need to add its official repository. First, import Docker’s GPG key so that apt can verify package integrity:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
Then add the Docker repository to your APT sources (the ‘focal’ codename corresponds to Ubuntu 20.04):
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu focal stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
Installing Docker engine
Now that we have added the Docker repository, let's install the Docker engine:
sudo apt update
sudo apt install docker-ce
Starting and enabling Docker
Once the installation is complete, start the Docker service and enable it to start on boot:
sudo systemctl start docker
sudo systemctl enable docker
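If you want to confirm that the service started successfully, you can optionally check its status:
sudo systemctl status docker
The output should report the service as active (running); press q to exit the status view.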
Verifying Docker installation
To verify that Docker is installed and running correctly, run the following command:
sudo docker --version
If Docker is installed correctly, the installed version number will be shown in the output.
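For reference, the output will look something like the following, although the exact version and build number depend on the release you installed:
Docker version 20.10.11, build dea9396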
Unlock the power of containerisation with our comprehensive Introduction to Docker Training!
Working with Docker
Congratulations! You now have Docker installed on your Ubuntu 20.04 system. Let's explore some essential Docker commands to get you started.
Running your first container
To ensure Docker is working, let's run a simple container. The following command will pull the "hello-world" image from Docker Hub and run it as a container:
sudo docker run hello-world
If everything is set up correctly, you will see a message indicating that your Docker installation is working.
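If you're curious, the message begins with a short confirmation along these lines:
Hello from Docker!
This message shows that your installation appears to be working correctly.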
Managing Docker images
Docker allows you to manage images efficiently. To list the Docker images available on your system, use the following command:
sudo docker image ls
To remove an image, use the ‘docker rmi’ command followed by the image ID or name.
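For example, assuming you no longer need the ‘hello-world’ image pulled earlier, you could remove it like this (remove any stopped container that still uses it first, or add the -f flag):
sudo docker rmi hello-world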
Interacting with Docker containers
To view all running containers, execute:
sudo docker ps
To stop a running container, use the ‘docker stop’ command followed by the container ID or name. To remove a container, use the ‘docker rm’ command.
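As a quick illustration, for a hypothetical container named ‘my-container’, the two commands would look like this:
sudo docker stop my-container
sudo docker rm my-container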
Building custom Docker images
One of the most powerful features of Docker is the ability to build custom images using Dockerfiles. A Dockerfile is a text file containing instructions on how to build an image. Let's create a simple Nginx web server Docker image:
1) Create a file named Dockerfile (no file extension) in a directory.
2) Add the following content to the Dockerfile:
# Use the official Nginx image as the base image
FROM nginx:latest
# Copy custom index.html to Nginx's default document root
COPY index.html /usr/share/nginx/html/
3) Create an ‘index.html’ file in the same directory with your desired content.
4) Build the Docker image using the following command:
sudo docker build -t my-nginx-image .
5) Once the image is built, you can run a container using your custom image:
sudo docker run -d -p 80:80 my-nginx-image
Now you have an Nginx web server running in a Docker container.
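To confirm the server is responding, you can open http://localhost in a browser or, since curl was installed earlier as a dependency, make a quick request from the terminal:
curl http://localhost
You should see the contents of your custom index.html in the response.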
Docker networking
By default, Docker containers can communicate with each other through their IP addresses. However, you can also create custom networks to facilitate communication between containers.
To create a user-defined bridge network, use the following command:
sudo docker network create my-network
To run a container and attach it to the custom network, specify the ‘--network’ option:
sudo docker run -d --network my-network --name my-container nginx
This allows containers in the same network to communicate with each other using container names.
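As a small illustration of name-based communication (using the lightweight ‘busybox’ image purely as a test client), you could fetch the Nginx welcome page from ‘my-container’ by its name:
sudo docker run --rm --network my-network busybox wget -qO- http://my-container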
Docker volumes
Docker volumes are used to persist data beyond the lifecycle of a container. To create a volume, use the following command:
sudo docker volume create my-volume
To mount a volume inside a container, use the ‘-v’ option:
sudo docker run -d -v my-volume:/path/in/container --name my-container nginx
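If you want to see where the volume's data actually lives on the host, you can inspect it:
sudo docker volume inspect my-volume
The output includes the volume's mount point, typically under /var/lib/docker/volumes/.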
Interested in gaining deeper knowledge about Docker? Refer to our blog on "Podman vs Docker"!
Docker Compose: Simplifying multi-container applications
As the complexity of modern software applications grows, managing multiple containers and their interdependencies becomes a challenging task. Docker Compose, a powerful tool provided by Docker, comes to the rescue by simplifying the management process of multi-container applications. It allows developers to define and configure entire application stacks in a single, easy-to-read YAML file, streamlining the entire deployment process.
Docker Compose is a higher-level tool built on top of Docker that enables users to define and manage multi-container applications. With Compose, developers can specify the services, networks, and volumes required for an application and define how these components interact with each other. This declarative approach eliminates the need for complex manual configuration and setup, saving time and reducing the chance of errors.
A few key features and benefits of Docker Compose on Ubuntu are as follows:
1) Declarative syntax: Docker Compose uses a declarative syntax that describes the desired state of the application stack. This makes it easy for developers to understand the application's structure and dependencies at a glance. The YAML configuration file can be version-controlled, enabling collaboration and sharing of application configurations across development teams.
2) Service definition: In Docker Compose, each component of the application stack is referred to as a service. Services can be based on pre-existing Docker images or custom-built images defined in Dockerfiles. The ability to define multiple services in a single Compose file makes it an excellent choice for microservices architectures and complex applications.
3) Networking and communication: Docker Compose automatically creates a default network for all services defined in the Compose file. It allows them to communicate with each other using their service names. This eliminates the need for manual network configurations, simplifying the process of connecting containers within the application stack.
4) Volume management: Docker Compose facilitates data persistence by managing volumes for the services. Developers can define named volumes in the Compose file, which are created and managed by Docker Compose. This ensures that data remains available even when containers are stopped or restarted.
5) Scalability and replication: Docker Compose makes it easy to scale services horizontally by specifying the desired number of replicas for each service. With a simple command, developers can scale up or down the number of instances of a service, allowing the application to handle varying workloads effectively.
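For example, assuming a Compose file defines a service named "web" (such as the sample file shown later in this section), you could start three replicas like this; note that services publishing a fixed host port, such as 80:80, would need that mapping adjusted first to avoid port conflicts:
docker-compose up -d --scale web=3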
To begin using Docker Compose, you need to have it installed on your system. It is not installed automatically alongside the Docker Engine on Ubuntu, but it can be easily installed by following the official Docker documentation.
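As one straightforward option on Ubuntu 20.04 (an assumption here, rather than the only route), the standalone docker-compose tool is available in Ubuntu's own repositories:
sudo apt install docker-compose
Any recent release that supports version '3' Compose files will work for the example below.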
Once Docker Compose is set up, creating a Compose file is the first step. The Compose file is written in YAML format and named ‘docker-compose.yml’. Within this file, developers define the services, networks, and volumes required for the application. Here's a simple example of a Compose file for a basic web application:
version: '3'
services:
  web:
    image: nginx:latest
    ports:
      - "80:80"
  database:
    image: mysql:latest
    environment:
      MYSQL_ROOT_PASSWORD: examplepassword
In this example, we define two services: "web" and "database." The "web" service uses the latest Nginx image and maps ‘port 80’ of the host to ‘port 80’ of the container. The "database" service uses the latest MySQL image and sets the root password through an environment variable. Once the Compose file is ready, you can start the application stack by using the following command:
docker-compose up
Docker Compose will automatically pull the required images, create the necessary containers, and set up the network and volumes as specified in the Compose file. The application will then be up and running, and you can access it through the specified ports.
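A couple of follow-up commands are worth knowing; they assume you run them from the same directory as the Compose file:
docker-compose ps
docker-compose down
The first lists the containers managed by this Compose project; the second stops and removes them. You can also add -d to ‘docker-compose up’ if you prefer to run the stack in the background.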
Docker Compose has become a valuable tool for developers working with multi-container applications. Its simplicity, declarative approach, and automated management of services streamline the development, deployment, and scaling of complex applications.
By eliminating manual configurations and simplifying the container orchestration process, Docker Compose empowers developers to focus on building and delivering robust and scalable applications. Whether you are working with microservices or monolithic applications, Docker Compose is a must-have tool for your Docker toolbox.
Uninstalling Docker
If you have been using Docker on your Ubuntu system and need to remove it for any reason, the uninstallation process is quite simple. Whether you want to upgrade to a different version of Docker or no longer require it, the following steps will help you uninstall Docker from your Ubuntu-powered machine:
1) Stop Docker containers: Before proceeding with the uninstallation, ensure that all running Docker containers are stopped. Use the following command to stop all running containers:
sudo docker stop $(sudo docker ps -aq)
This command will stop all running Docker containers.
2) Remove Docker containers: After stopping the containers, you can remove them by using the following command:
sudo docker rm $(sudo docker ps -aq)
This command will remove all Docker containers from your system.
3) Remove Docker images: Next, you can remove Docker images that you no longer need. Be cautious, as removing images will result in their complete deletion from your system:
sudo docker rmi $(sudo docker images -aq)
This command will remove all Docker images from your system.
4) Uninstall Docker package: Now that you have removed the containers and images, it's time to uninstall the Docker package itself. Use the following command to uninstall Docker:
sudo apt-get purge docker-ce docker-ce-cli containerd.io
This command will remove the Docker Engine packages (docker-ce, docker-ce-cli, and containerd.io) from your Ubuntu system.
5) Remove Docker data: Docker stores its data, configuration files, and other settings in various directories. To ensure a clean uninstallation, remove these directories using the following commands:
sudo rm -rf /var/lib/docker
sudo rm -rf /etc/docker
These commands will delete Docker's data and configuration directories.
6) Prune Docker volumes: If you’ve used Docker volumes for data persistence, you might want to prune them to free up disk space.
sudo docker volume prune
This command will remove all unused Docker volumes.
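As a final check, you can confirm that no Docker packages remain on the system:
dpkg -l | grep -E 'docker|containerd'
If the uninstallation was successful, this should return no docker-ce, docker-ce-cli, or containerd.io entries.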
Uninstalling Docker from your Ubuntu system is an easy process. It involves stopping and removing containers, images, and volumes. Additionally, it requires uninstalling the Docker package and removing Docker's data and configuration directories. By following the aforementioned steps, you can ensure the clean removal of Docker from your system.
It’ll leave you with a clean slate or allow you to install a different version of Docker as per your requirements. Remember to create a backup of any important data and configurations before proceeding with the uninstallation to prevent accidental data loss.
Conclusion
In conclusion, Docker provides a powerful and efficient set of solutions for containerising applications and simplifying the development process. By now, you must’ve developed a solid understanding of How to Install Docker on Ubuntu and use it to deploy and manage containers. Embrace the world of containerisation to enhance your development workflow and make your applications more portable and scalable.
Take our Certified DevOps Professional (CDOP) course and elevate your expertise in seamless Software Development.
Frequently Asked Questions
One way to check if Docker is installed on Ubuntu is to use the command docker -v in the terminal. This will show the Docker version if it is installed or an error message if it is not.
For example:
$ docker -v
Docker version 20.10.11, build dea9396
Another way to check if Docker is installed on Ubuntu is to use the command systemctl status docker in the terminal. This will show the status of the Docker service if it is installed and running, or an error message if it is not.
There are different ways to install Docker in the command line, depending on your operating system and preferences. Here are some examples:
On Ubuntu, you can install Docker from the official Docker repository by following these steps:
1) Update your existing list of packages with sudo apt update
2) Install a few prerequisite packages that let apt use packages over HTTPS with sudo apt install apt-transport-https ca-certificates curl software-properties-common
3) Add the GPG key for the official Docker repository to your system with curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
4) Add the Docker repository to APT sources with sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu focal stable"
5) Update the package database with the Docker packages from the newly added repo with sudo apt update
6) Install Docker with sudo apt install docker-ce
On Windows 10, you can install Docker Desktop from the Microsoft Store by following these steps:
1) Open the Microsoft Store app and search for Docker Desktop
2) Click on the Get button and wait for the download to complete
3) Click on the Install button and follow the instructions on the screen
4) Launch Docker Desktop from the Start menu or the desktop shortcut
The Knowledge Academy takes global learning to new heights, offering over 30,000 online courses across 490+ locations in 220 countries. This expansive reach ensures accessibility and convenience for learners worldwide.
Alongside our diverse Online Course Catalogue, encompassing 17 major categories, we go the extra mile by providing a plethora of free educational Online Resources like News updates, blogs, videos, webinars, and interview questions. Tailoring learning experiences further, professionals can maximise value with customisable Course Bundles of TKA.
The Knowledge Academy’s Knowledge Pass, a prepaid voucher, adds another layer of flexibility, allowing course bookings over a 12-month period. Join us on a journey where education knows no bounds.
The Knowledge Academy offers various DevOps Courses, including Certified DevOps Professional (CDOP), Certified Agile DevOps Professional (CADOP) and Kubernetes Training. These courses cater to different skill levels, providing comprehensive insights into DevOps methodologies.
Our DevOps blogs cover a range of topics related to DevOps, offering valuable resources, best practices, and industry insights. Whether you are a beginner or looking to advance your DevOps skills, The Knowledge Academy's diverse courses and informative blogs have you covered.
Upcoming Programming & DevOps Resources Batches & Dates
Date
Fri 14th Feb 2025
Fri 11th Apr 2025
Fri 13th Jun 2025
Fri 15th Aug 2025
Fri 10th Oct 2025
Fri 12th Dec 2025