How to Install Docker


Nov 6, 2025 - 10:06

How to Install Docker: A Complete Step-by-Step Guide for Developers and DevOps Teams

Docker has revolutionized the way software is developed, tested, and deployed. By enabling containerization, Docker allows developers to package applications with all their dependencies into standardized units called containers. These containers run consistently across different environments, from a developer's laptop to production servers, eliminating the infamous "it works on my machine" problem. Whether you're a beginner learning modern DevOps practices or a seasoned engineer optimizing infrastructure, installing Docker correctly is the essential first step toward building scalable, portable, and efficient applications.

This comprehensive guide walks you through every aspect of installing Docker on major operating systems, including Windows, macOS, and Linux distributions like Ubuntu, CentOS, and Debian. Beyond installation, we cover best practices, essential tools, real-world use cases, and frequently asked questions to ensure you not only install Docker successfully but also configure it securely and efficiently for production-ready workflows.

Step-by-Step Guide

Installing Docker on Ubuntu 22.04 / 20.04

Ubuntu is one of the most popular Linux distributions for development and server environments. Installing Docker on Ubuntu involves updating system packages, adding Docker's official repository, and installing the Docker Engine.

Begin by opening a terminal and ensuring your system is up to date:

sudo apt update && sudo apt upgrade -y

Next, install required packages to allow apt to use repositories over HTTPS:

sudo apt install apt-transport-https ca-certificates curl software-properties-common -y

Add Docker's official GPG key to verify package integrity:

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg

Add the Docker repository to your system's sources list:

echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

Update the package index again to include Docker's repository:

sudo apt update

Now install Docker Engine, the Docker CLI, and containerd:

sudo apt install docker-ce docker-ce-cli containerd.io -y

Once installation completes, verify Docker is running:

sudo systemctl status docker

You should see output indicating that the Docker service is active and running. If not, start it manually:

sudo systemctl start docker

To enable Docker to start automatically on boot:

sudo systemctl enable docker

Installing Docker on CentOS / RHEL 8 / 9

CentOS and RHEL are widely used in enterprise environments. Docker installation on these systems follows a similar pattern but uses the yum or dnf package manager.

First, remove any older Docker installations (if present):

sudo yum remove docker docker-client docker-client-latest docker-common docker-latest docker-latest-logrotate docker-logrotate docker-engine -y

Install required dependencies:

sudo yum install -y yum-utils

Add the Docker repository:

sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo

Install Docker Engine:

sudo yum install docker-ce docker-ce-cli containerd.io -y

For RHEL 9 or CentOS Stream, use dnf instead:

sudo dnf install docker-ce docker-ce-cli containerd.io -y

Start and enable the Docker service:

sudo systemctl start docker

sudo systemctl enable docker

Verify the installation:

sudo docker --version

You should see output similar to: Docker version 24.0.7, build afdd53b
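If a script needs just the bare version number, it can be extracted from that output with standard shell tools. The sample string below stands in for a live `docker --version` call:

```shell
# Extract "24.0.7" from typical `docker --version` output.
# On a real system: line="$(docker --version)"
line="Docker version 24.0.7, build afdd53b"
version=$(printf '%s' "$line" | sed -E 's/Docker version ([0-9.]+),.*/\1/')
echo "$version"
```

This kind of check is handy in provisioning scripts that need to enforce a minimum Docker version before proceeding.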

Installing Docker on Debian 12 / 11

Debian is known for its stability and is commonly used in production servers. The installation process closely mirrors Ubuntu's.

Update your package list and install prerequisites:

sudo apt update

sudo apt install apt-transport-https ca-certificates curl gnupg lsb-release -y

Add Docker's GPG key:

curl -fsSL https://download.docker.com/linux/debian/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg

Add the repository:

echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/debian $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

Update and install Docker:

sudo apt update

sudo apt install docker-ce docker-ce-cli containerd.io -y

Start and enable Docker:

sudo systemctl start docker

sudo systemctl enable docker

Installing Docker on macOS

On macOS, Docker Desktop is the recommended and most user-friendly way to install Docker. It includes Docker Engine, Docker CLI, Docker Compose, and Kubernetes.

Visit the official Docker website: https://www.docker.com/products/docker-desktop

Download the latest version of Docker Desktop for Mac (Intel or Apple Silicon). Once the .dmg file downloads:

  1. Open the file and drag the Docker icon into the Applications folder.
  2. Launch Docker Desktop from your Applications folder.
  3. Follow the on-screen prompts to complete installation.

Docker Desktop will automatically configure the necessary components. You'll see a Docker whale icon in your menu bar once it's running.

To verify the installation, open Terminal and run:

docker --version

Also test with a simple container:

docker run hello-world

If you see a welcome message from Docker, the installation was successful.

Installing Docker on Windows 10 / 11

On Windows, Docker Desktop is the standard installation method. It requires 64-bit Windows 10 or 11 with either the WSL 2 backend (available on all editions, including Home) or the Hyper-V backend (Pro, Enterprise, or Education).

First, ensure WSL 2 is installed:

  • Open PowerShell as Administrator and run:
wsl --install

This command installs WSL 2 and Ubuntu by default. If you already have WSL installed, ensure it's version 2:

wsl -l -v

If Ubuntu is version 1, upgrade it:

wsl --set-version Ubuntu 2

Next, enable Hyper-V if not already active:

  • Go to Control Panel > Programs > Turn Windows features on or off.
  • Check Hyper-V and Windows Subsystem for Linux.
  • Restart your computer.

Download Docker Desktop for Windows from: https://www.docker.com/products/docker-desktop

Run the installer and follow the prompts. After installation, Docker Desktop will launch automatically. You'll see the Docker whale icon in your system tray.

Open Command Prompt or PowerShell and verify:

docker --version

Test with:

docker run hello-world

Post-Installation Setup: Adding User to Docker Group

By default, Docker requires root privileges to run. Running Docker commands with sudo every time is inconvenient and can be a security risk if not managed properly.

To allow your user account to run Docker commands without sudo, add your user to the docker group:

sudo usermod -aG docker $USER

Log out and log back in, or run:

newgrp docker

Test without sudo:

docker run hello-world

If the container runs successfully, your user now has proper permissions.
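To double-check the group change without starting a container, you can inspect your group list directly; this is a small sketch using the standard id utility:

```shell
# Report whether the current user belongs to the docker group.
if id -nG "$(whoami)" | grep -qw docker; then
  msg="user is in the docker group"
else
  msg="user is NOT in the docker group yet; log out and back in"
fi
echo "$msg"
```

Note that `newgrp docker` only affects the current shell; the group membership becomes permanent for new sessions after you log out and back in.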

Best Practices

Use Official Images

Always prefer official Docker images from Docker Hub (e.g., nginx, redis, python) over third-party images. Official images are maintained in cooperation with the software vendors, regularly updated for security patches, and scanned for vulnerabilities. You can identify them by the absence of a username prefix: for example, nginx (internally library/nginx) is official, while johnsmith/nginx is user-created.

Minimize Image Size

Large Docker images increase build times, consume more bandwidth, and expand the attack surface. Use multi-stage builds to separate build-time dependencies from runtime environments. For example, when building a Node.js application:

FROM node:18-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=builder /app/dist ./dist
EXPOSE 3000
CMD ["node", "dist/index.js"]

This approach ensures the final image contains only what's needed to run the app, not build-time tooling such as devDependencies or TypeScript compilers.

Don't Run Containers as Root

Running containers as the root user inside the container poses a serious security risk. If an attacker exploits a vulnerability in your application, they gain root access to the container and potentially the host system.

Use the USER directive in your Dockerfile to switch to a non-root user:

FROM node:18-alpine

RUN addgroup -g 1001 -S nodejs && \
    adduser -S -u 1001 -G nodejs nodejs

USER nodejs
WORKDIR /app
COPY --chown=nodejs:nodejs . .
CMD ["node", "server.js"]

Regularly Scan for Vulnerabilities

Docker images may contain outdated packages with known security flaws. Use tools like Docker Scout (built into Docker Desktop) or Trivy to scan images for vulnerabilities:

trivy image nginx:latest

Integrate scanning into your CI/CD pipeline to block deployments of vulnerable images.
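As a sketch of such a gate (the step name and image reference are illustrative; `--exit-code` and `--severity` are standard Trivy flags), a CI step might look like:

```yaml
# Fail the job when Trivy reports HIGH or CRITICAL vulnerabilities.
# Assumes Trivy is installed on the runner and the image was built earlier.
- name: Scan image for vulnerabilities
  run: trivy image --exit-code 1 --severity HIGH,CRITICAL myapp:${{ github.sha }}
```

With `--exit-code 1`, the scanner's nonzero exit status fails the pipeline step, preventing the vulnerable image from being pushed or deployed.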

Use .dockerignore Files

Just as you use .gitignore to exclude files from version control, use a .dockerignore file to prevent unnecessary files from being copied into your Docker image. This improves build speed and reduces image size.

Example .dockerignore:

.git
node_modules
npm-debug.log
.env
README.md

Limit Resource Usage

Containers can consume excessive CPU or memory if left unbounded. Use Docker's resource constraints at runtime:

docker run -d --name myapp \
  --memory=512m \
  --cpus=1.0 \
  nginx:latest

In production, use orchestration tools like Docker Swarm or Kubernetes to enforce resource limits across clusters.
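If you manage services with Compose, the same limits can be declared in the file instead of on the command line. This fragment (service name illustrative) uses the Compose `deploy.resources` syntax, which recent versions of `docker compose` honor outside Swarm as well:

```yaml
services:
  myapp:
    image: nginx:latest
    deploy:
      resources:
        limits:
          cpus: "1.0"
          memory: 512M
```

Declaring limits in the file keeps them versioned alongside the rest of the service definition instead of living only in someone's shell history.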

Tag Images Properly

Use semantic versioning for your Docker images. Avoid using :latest in production. Instead, tag with version numbers or Git commit hashes:

docker build -t myapp:v1.2.3 .

docker push myregistry.com/myapp:v1.2.3

This ensures reproducible deployments and makes rollbacks possible.
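Commit-based tags can be generated mechanically in a build script; the hash below is a stand-in for a real `git rev-parse --short HEAD` call, and the registry name is a placeholder:

```shell
# Build a fully qualified, commit-pinned image reference (sketch).
# In a real checkout: commit="$(git rev-parse --short HEAD)"
commit="abc1234"
image="myregistry.com/myapp:${commit}"
echo "$image"
```

Because each commit produces a unique tag, rolling back is as simple as redeploying the previous reference.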

Tools and Resources

Docker Desktop

Docker Desktop is the most comprehensive tool for local development. It provides a graphical interface, built-in Kubernetes, Docker Compose, and easy access to Docker Hub. It's ideal for macOS and Windows users. For Linux users, Docker Engine is sufficient, but Docker Desktop is also available for advanced features.

Docker Compose

Docker Compose allows you to define and run multi-container applications using a single YAML file. It's indispensable for applications with databases, caches, and microservices.

Example docker-compose.yml:

version: '3.8'
services:
  web:
    build: .
    ports:
      - "5000:5000"
    depends_on:
      - redis
  redis:
    image: redis:alpine

Run with:

docker-compose up

Docker Hub

Docker Hub is the largest public registry of Docker images. It hosts over 100,000 official and community images. You can push your own images here for sharing or pull pre-built images for quick deployment.

Sign up at https://hub.docker.com and authenticate via CLI:

docker login

Portainer

Portainer is a lightweight, open-source GUI for managing Docker environments. It simplifies container, volume, network, and image management through a web interface. Install it with:

docker run -d -p 9000:9000 --name=portainer \
  --restart=always \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:latest

Access it at http://localhost:9000.

Trivy

Trivy is an open-source vulnerability scanner for containers. It detects OS package vulnerabilities, misconfigurations, and secrets in Docker images. Install it via Homebrew on macOS:

brew install aquasecurity/trivy/trivy

Or download the binary for Linux/Windows from https://github.com/aquasecurity/trivy.

Docker Scout

Docker Scout is a proprietary tool integrated into Docker Desktop that provides real-time security insights, dependency analysis, and compliance reports. It's ideal for teams prioritizing DevSecOps practices.

Visual Studio Code + Docker Extension

The official Docker extension for VS Code allows you to browse containers, inspect images, view logs, and edit Dockerfiles directly in your editor. It integrates seamlessly with Docker Compose and remote development workflows.

Real Examples

Example 1: Deploying a Python Flask App

Let's containerize a simple Python Flask application.

Create a directory and file structure:

myflaskapp/
├── app.py
├── requirements.txt
└── Dockerfile

Contents of app.py:

from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello():
    return "Hello, Docker World!"

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)

Contents of requirements.txt:

Flask==2.3.3
gunicorn==21.2.0

Contents of Dockerfile:

FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "--workers", "1", "app:app"]

Build and run:

docker build -t flask-app .

docker run -p 5000:5000 flask-app

Visit http://localhost:5000 to see your app running in a container.

Example 2: WordPress with MySQL

Use Docker Compose to run a full WordPress site with a MySQL database.

Create docker-compose.yml:

version: '3.8'
services:
  db:
    image: mysql:8.0
    volumes:
      - db_data:/var/lib/mysql
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress
    restart: always
  wordpress:
    image: wordpress:latest
    ports:
      - "8000:80"
    environment:
      WORDPRESS_DB_HOST: db:3306
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
      WORDPRESS_DB_NAME: wordpress
    volumes:
      - wp_data:/var/www/html
    restart: always
volumes:
  db_data:
  wp_data:

Run:

docker-compose up -d

Wait a few moments, then visit http://localhost:8000 to complete the WordPress setup. Persistent volumes and automatic restarts make this a solid starting point for real deployments, though you should replace the hardcoded passwords before exposing it publicly.
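One easy hardening step is to stop hardcoding credentials: Compose substitutes `${VAR}` references from the shell environment or from a `.env` file placed next to the compose file. A minimal sketch for the db service:

```yaml
services:
  db:
    image: mysql:8.0
    environment:
      # Values come from the environment or a .env file, not the YAML itself
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
      MYSQL_PASSWORD: ${MYSQL_PASSWORD}
```

This keeps secrets out of version control while leaving the service definition itself unchanged.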

Example 3: CI/CD Pipeline with GitHub Actions

Automate Docker builds and pushes using GitHub Actions.

Create .github/workflows/docker-build.yml:

name: Build and Push Docker Image

on:
  push:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Build and Push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./Dockerfile
          push: true
          tags: myusername/myapp:latest

This workflow automatically builds and pushes your image to Docker Hub on every push to the main branch, enabling continuous delivery.
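To follow the earlier advice about avoiding a bare :latest tag in production, the push step can also tag each build with the commit SHA; `github.sha` is a built-in Actions context value, and myusername/myapp remains a placeholder:

```yaml
# Sketch: tag each build with the commit SHA in addition to :latest
- name: Build and Push
  uses: docker/build-push-action@v5
  with:
    context: .
    push: true
    tags: |
      myusername/myapp:latest
      myusername/myapp:${{ github.sha }}
```

Deployments can then reference the immutable SHA tag, while :latest remains available for convenience.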

FAQs

Is Docker free to use?

Yes, Docker Community Edition (CE) is free for personal and commercial use. Docker Desktop is free for small businesses, personal use, and education. Enterprises with more than 250 employees or over $10 million in annual revenue must purchase a Docker Business subscription for advanced features and support.

What's the difference between Docker and virtual machines?

Docker containers share the host operating system's kernel, making them lightweight and fast to start. Virtual machines (VMs) emulate an entire operating system, requiring more memory and CPU. Containers are ideal for microservices and application portability, while VMs are better for running multiple OSes or legacy applications.

Can I run Docker on a Mac with Apple Silicon (M1/M2)?

Yes. Docker Desktop for Mac supports Apple Silicon natively. The latest versions use ARM64-based images and offer improved performance over Intel emulation. Ensure you're using Docker Desktop 3.3 or later.

Why do I get permission denied when running Docker commands?

This error occurs when your user isn't part of the docker group. Fix it by running sudo usermod -aG docker $USER, then log out and back in. Alternatively, always prefix commands with sudo, but this is not recommended for regular use.

How do I remove Docker completely?

On Linux, remove packages and clean up:

sudo apt remove docker-ce docker-ce-cli containerd.io

sudo rm -rf /var/lib/docker

sudo rm -rf /var/lib/containerd

On macOS, drag Docker Desktop to the Trash and run:

rm -rf ~/Library/Group\ Containers/group.com.docker

rm -rf ~/.docker

On Windows, use Add or Remove Programs to uninstall Docker Desktop, then delete C:\ProgramData\Docker manually if it remains.

How do I update Docker?

On Linux, update the package list and upgrade:

sudo apt update

sudo apt install --only-upgrade docker-ce docker-ce-cli containerd.io

On macOS and Windows, Docker Desktop will notify you of updates. Click Update and Restart in the app.

Can I run Docker in a virtual machine?

Yes. Docker can run inside VMs, but nested virtualization must be enabled in the hypervisor (e.g., VMware, VirtualBox, Hyper-V). Performance may be slightly reduced, but it's useful for testing or when you can't install Docker directly on the host.

What is the difference between Docker Engine and Docker Desktop?

Docker Engine is the core container runtime used on Linux servers. Docker Desktop is a full application for macOS and Windows that includes Docker Engine, Docker Compose, Kubernetes, and a GUI. Linux users typically install Docker Engine directly, while macOS/Windows users benefit from Docker Desktop's integrated experience.

How do I check which containers are running?

Use:

docker ps

To see all containers (including stopped ones):

docker ps -a

Can Docker be used in production?

Absolutely. Docker is used by companies like Spotify, Netflix, Shopify, and PayPal in production at scale. When combined with orchestration tools like Kubernetes, Docker provides reliability, scalability, and rapid deployment cycles essential for modern cloud-native applications.

Conclusion

Installing Docker is more than a technical task: it's the gateway to modern software development and deployment. By following the steps outlined in this guide, you've equipped yourself with the knowledge to install Docker securely and efficiently across multiple platforms. From Ubuntu servers to macOS laptops and Windows workstations, Docker's cross-platform consistency ensures your applications behave the same everywhere.

But installation is only the beginning. Adopting best practices such as using minimal images, avoiding root users, scanning for vulnerabilities, and tagging releases properly transforms Docker from a convenient tool into a robust, secure, and scalable foundation for your projects. Real-world examples demonstrate how Docker simplifies complex setups like WordPress deployments and CI/CD pipelines, proving its value beyond development environments.

As you continue your journey, explore Docker Compose for multi-container applications, integrate scanning tools like Trivy into your workflow, and consider Portainer for visual management. Stay updated with Docker's evolving ecosystem, and don't hesitate to leverage the vast community resources available.

Docker has become an industry standard for a reason: it solves real problems. By mastering its installation and foundational practices, you're not just learning a tool; you're adopting a philosophy of portability, efficiency, and automation that defines the future of software delivery. Start small, build confidence, and soon you'll be deploying complex applications with a single command.