Docker Disk Usage Investigation

Docker Disk Usage Investigation - docker system df / Logs / Volumes

What You'll Learn

  • How to identify what's consuming disk space in Docker environments
  • How to use docker system df to determine whether images, containers, or volumes are the cause
  • Safe verification steps to take before deleting anything

💡 Quick Summary

Fastest approach to diagnose Docker disk growth:

  1. Check which partition is full with df -h
  2. Check Docker's total usage with docker system df
  3. Get detailed breakdown with docker system df -v
  4. Suspect either log bloat (container logs) or volume growth

Note: This article focuses on identification; deletion requires caution.

Table of Contents

  1. Check Disk Status (df)
  2. Check Docker Usage
  3. Locate the Data
  4. Check Container Log Bloat
  5. Identify the Culprit Container
  6. Check Volume Growth
  7. Safety Precautions

⚠️ Prerequisites

  • OS: Ubuntu
  • Docker installed
  • Permissions: User with docker access or sudo

1. First: Check If Disk Is Actually Full (df)

Before blaming Docker, verify what's full:

$ df -h

Common Patterns

  • / is full (Docker data often lives under the root filesystem)
  • /var is a separate partition and it is full (because /var/lib/docker lives there)
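
To see at a glance which filesystem Docker's data directory actually lives on, you can point df directly at it (the path assumes the default data root):

$ df -h /var/lib/docker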

2. Check How Much Docker Is Using

Get an overview first:

$ docker system df
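
The output is grouped by type. As a rough illustration (the numbers below are invented), it looks like this:

TYPE            TOTAL     ACTIVE    SIZE      RECLAIMABLE
Images          12        4         8.2GB     5.1GB (62%)
Containers      6         3         350MB     120MB (34%)
Local Volumes   5         4         14.8GB    2.0GB (13%)
Build Cache     37        0         1.9GB     1.9GB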

What to Look For

  • Images: Are images piling up?
  • Containers: Are stopped containers accumulating?
  • Local Volumes: Are volumes growing?
  • Build Cache: Is build cache accumulating?

💡 Just knowing which category is the largest makes the next steps of the investigation much easier.
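
For a per-image, per-container, and per-volume breakdown (step 3 in the Quick Summary), use the verbose form:

$ docker system df -v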

3. Locate What's Growing

Docker data typically lives here (may vary by setup):

  • Docker data: /var/lib/docker
  • Container logs: /var/lib/docker/containers/<container-id>/*.log

Check the size of /var/lib/docker:

$ sudo du -h -d 1 /var/lib/docker | sort -h
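
As a rough illustration (values invented, some subdirectories omitted; the exact layout depends on your storage driver), the breakdown often looks like this:

16K     /var/lib/docker/network
1.2G    /var/lib/docker/volumes
3.4G    /var/lib/docker/containers
18G     /var/lib/docker/overlay2
23G     /var/lib/docker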

💡 If /var/lib/docker is large, Docker is likely the cause.

4. Check for Container Log Bloat

By default, Docker captures each container's stdout/stderr with the json-file logging driver, and these log files grow without limit unless rotation is configured. Long-running, chatty containers cause growth here.

4-1. Check containers directory size

$ sudo du -h -d 2 /var/lib/docker/containers | sort -h | tail -n 20

4-2. Find large log files (top 20)

$ sudo find /var/lib/docker/containers -type f -name "*.log" -printf "%s %p\n" | sort -n | tail -n 20
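
A variant with human-readable sizes, assuming the default json-file logging driver (log files are named <container-id>-json.log):

$ sudo sh -c 'ls -lhS /var/lib/docker/containers/*/*-json.log | head -n 20'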

💡 If you find huge *-json.log files, log bloat is the likely culprit.

5. Identify Which Container Is the Cause

Map the container ID (the long directory name in the log path) to a container name:

5-1. List containers (ID and name)

$ docker ps -a --format "table {{.ID}}\t{{.Names}}\t{{.Status}}\t{{.Image}}"

5-2. Get details for a specific container ID

$ docker inspect <container-id> --format '{{.Name}}'

Note: .Name is returned with a leading /.
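
Going the other way, the same inspect command can show exactly which log file a container writes to (with the default json-file driver):

$ docker inspect <container-id> --format '{{.LogPath}}'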

6. Check Volume Growth

Database data, WordPress uploads, caches, and similar data tend to accumulate in volumes.

6-1. List volumes

$ docker volume ls

6-2. See which container uses which volume

$ docker ps -a --format "table {{.Names}}\t{{.Mounts}}"

💡 This makes it easy to trace a growing volume back to the container that writes to it.
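
To measure how much space a specific volume occupies, look up its mountpoint and run du on it (the path below assumes the default local volume driver):

$ docker volume inspect <volume-name> --format '{{.Mountpoint}}'
$ sudo du -sh /var/lib/docker/volumes/<volume-name>/_data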

7. Safety Precautions (Important)

❌ Don't run docker system prune immediately

It may delete more than you expect.

❌ Be careful with volumes

Volumes often contain actual data (DB/WordPress etc.). Recovery is difficult if deleted.
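
Before treating a volume as unused, one way to check whether any container (including stopped ones) still references it is the volume filter on docker ps:

$ docker ps -a --filter volume=<volume-name>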

⚠️ Log bloat requires root cause analysis

If whatever is generating the excessive logs keeps running, the problem will recur even after cleanup.
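
To start the root cause analysis, look at what the container is actually logging; tailing its recent output often reveals an error loop or overly verbose debug logging:

$ docker logs --tail 100 <container-id>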

Mitigation Approaches

If log bloat is the cause, typical next steps include:

  • Configure log rotation (Docker logging driver / log-opts with limits; see the sketch after this list)
  • Review application log volume (error storms, debug logs)
  • Forward necessary logs to a separate system
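
As a sketch of the first approach: the json-file logging driver accepts max-size and max-file options. Setting them in /etc/docker/daemon.json (the values below are just an example) applies to containers created after the Docker daemon is restarted; existing containers keep their old settings until they are recreated.

{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}

After editing the file, restart Docker (sudo systemctl restart docker) and recreate the affected containers for the limits to take effect.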

💡 Deletion is a temporary fix - stopping the cause provides long-term stability.

Verification

After investigation or mitigation, verify improvement:

$ df -h
$ docker system df

📋 Test Environment

Commands tested on Ubuntu 24.04 LTS / Docker 24.x.
