Docker for Dummies (Part-1 Intro)

Learn the ABCs of Docker


Docker is a containerization technology (but what does that mean? We will discuss it, because I am a beginner who has just got his head around this topic). I will not waste your time teaching history; instead, we will focus on the how-to(s) of Docker.

Containerization is a technique, or software ideology, that helps us create "virtual environment(s)" with almost certain, predefined behaviour. You can read up on who came up with the idea here. This idea is the successor of virtualization and bare-metal technologies.

What does docker do for us?

Imagine I got a freelance opportunity to build a website for Shah Jahan's Taj Mahal. Being trusty-for-money, I accepted the task at hand. I have a Lenovo laptop with Ubuntu running as the primary OS. I am using Node.js v21 and all the other latest and greatest stuff in the software world. I build the Next.js site in 2 months, with all the crazy animations and the best UX. Then comes our beloved maharaja (a.k.a. king) Shah Jahan; the old man wants to see the dev site and make some changes himself on his PC. He has an Intel Pentium PC with Windows 95. WHAT? Who uses that? But you and I are tech people who constantly upgrade our systems. He is not. So now either I somehow run the dev site on his PC, or I am getting hanged.

This is where the Docker team came in and said they know "how to run a miniature version of Ubuntu inside/on Windows 95". They are willing to take responsibility for running Ubuntu on his PC without dual-booting. Somehow, they did this and gave me access to an Ubuntu instance. After that, I installed Node.js v21, installed the dependencies, and ran the "npm run dev" command. This way, the Maharaja was able to see the site in dev mode.

So, what did we learn?

The Docker team is promising us: if Docker itself is compatible with your system, then everything inside the Docker world is compatible with your system.

Essential Words

Till now, you might have heard people, or tried watching YouTube tutorials, that throw around terms like Containers, Images etc. Let's understand these "basic" terms with our Shah Jahan example. The Docker image specifies the method for running Ubuntu on Windows 95, and the container is the actual virtual machine that ends up running Ubuntu. Because a container is so small, around 80 MB, containers are often described as lightweight VMs.

So, formally,

Docker Images provide a template/blueprint for containers and facilitate application sharing and distribution. Whereas Containers provide a consistent and isolated environment for apps to run in.

Diagram: the process starts with a Dockerfile, which is built into a Docker Image, and the image is then run to create a Docker Container.

The Dockerfile serves as a blueprint for configuring the desired environment, specifying the necessary settings and dependencies. From this Dockerfile, a Docker Image is built, containing all the components and instructions required to run the environment smoothly; the image is more concrete, because the instructions are now code. Finally, Docker containers are instantiated from these images, providing isolated and self-sufficient environments capable of executing programs effectively.

We will write only the Dockerfile; the rest is generated by the Docker tooling itself.
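To make the Dockerfile → Image → Container pipeline concrete, here is a hypothetical terminal session (the image name my-image is made up for illustration; it assumes Docker is installed and a Dockerfile exists in the current directory):

```shell
# Dockerfile -> Image: build an image from the Dockerfile in this directory
docker build -t my-image .

# List locally available images; my-image should now appear
docker images

# Image -> Container: instantiate and run a container from the image
# (--rm removes the container after it exits)
docker run --rm my-image
```

The three commands map one-to-one onto the three boxes in the diagram above.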

So, let's get our hands dirty


I personally believe that if you are seriously reading this blog, then you know how to install stuff on your system. I am just pointing you to the Docker Engine site; they have done a good job explaining and streamlining the installation process.


Tutorial on Simple Node.js Project

In my GitHub repository, you will find various projects. The first one, a directory named simple-nodejs-project, contains two files: package.json and index.js. If you have Node.js installed, you can run this script using npm start. This simple script checks if your Node.js version is above 21. If it is, it logs "helloworld"; otherwise, it throws an error. This is a basic example of the environment mismatch issues we encounter regularly in software development. It's a primary use case for virtualization technology, which was later advanced by containerization technology. With containers, we can create isolated environments with independent versions for every project.

// Extract the major version number from process.version
const nodeVersion = process.version; // versions look like v22.2.0
const majorVersion = parseInt(nodeVersion.split('.')[0].replace('v', ''), 10);

// Check the version and perform the required action
if (majorVersion >= 21) {
    console.log('helloworld');
} else {
    throw new Error('Node.js version must be 21 or higher');
}

Running this on Node 16 and Node 22 is manageable with NVM (Node Version Manager). However, for projects involving multiple languages like Rust and Go, using Docker is ideal. I'll set my local Node version to 16 and run the script in a container, ensuring it works even without Node.js installed locally.
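As a side note, the same major-version check the script performs can be sketched in pure shell, with no Node.js required (the version string here is hard-coded for illustration, in the same "vMAJOR.MINOR.PATCH" shape as Node's process.version):

```shell
# Hard-coded example version string
version="v22.2.0"

# Strip the leading "v", then keep everything before the first dot
major="${version#v}"
major="${major%%.*}"

# Same check as index.js: require major version 21 or higher
if [ "$major" -ge 21 ]; then
  echo "helloworld"
else
  echo "Node.js version must be 21 or higher" >&2
  exit 1
fi
```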

Terminal session: on Node.js v16.20.2 (npm 8.19.4), npm start fails with "Node.js version must be 21 or higher"; after switching to Node.js v22.2.0 with nvm, npm start succeeds and prints "helloworld".

Steps to Containerization:

  1. Make Dockerfile

  2. Write the boilerplate code below for now; you will understand the config in a bit. This config is our Dockerfile. Let's create a Docker image out of this config with docker build -t simple-nodejs-project . (the trailing dot is the build context).

     # Use the lightest Node.js 22 base image, a predefined starting point
     FROM node:22-alpine
     # Set the working directory in the container
     WORKDIR /usr/src/app
     # Copy everything from . (the current directory of the code source on your PC)
     # to . (the current WORKDIR in the docker container, i.e. /usr/src/app)
     COPY . .
     # Install dependencies
     RUN npm install
     # Set the default command to run the application
     CMD ["npm", "start"]
  3. Now, look inside Docker Desktop or just run docker images in the terminal.

  4. This Docker image now exists locally, and your job is mostly done: the template/blueprint from which the docker container will run is defined properly. Now you need to run a container with the docker run -it --rm simple-nodejs-project command, or by pressing the run button in Docker Desktop.
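If everything worked, the session looks roughly like this (a sketch, assuming Docker is installed and the image was built with the name above; exact output will vary):

```shell
# Run a container from the image; -it attaches a terminal, --rm removes
# the container once it exits
docker run -it --rm simple-nodejs-project

# The container runs `npm start` on Node 22 inside node:22-alpine,
# so the version check passes and the script logs "helloworld"
```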

Common Docker Terminal Commands

These are some commands you will need in your toolkit. I am not suggesting you memorize them. (I copied and generated these commands from ChatGPT. xD)

| Command | Explanation | Common Flags | Example |
| --- | --- | --- | --- |
| docker build | Builds an image from a Dockerfile. | -t (tag), -f (Dockerfile) | docker build -t my-image . |
| docker run | Runs a command in a new container. | -d (detached), -p (publish), --name (name), -e (env) | docker run -d -p 80:80 --name my-container my-image |
| docker ps | Lists running containers. | -a (all), -q (quiet) | docker ps -a |
| docker stop | Stops one or more running containers. | N/A | docker stop my-container |
| docker rm | Removes one or more containers. | -f (force), -v (volumes) | docker rm -f my-container |
| docker rmi | Removes one or more images. | -f (force) | docker rmi my-image |
| docker pull | Pulls an image or a repository from a registry. | N/A | docker pull ubuntu:latest |
| docker push | Pushes an image or repository to a registry. | N/A | docker push myrepo/my-image |
| docker exec | Runs a command in a running container. | -it (interactive terminal) | docker exec -it my-container /bin/bash |

Common Dockerfile Instructions

By no means is this list exhaustive. xD

| Instruction | Example | Explanation |
| --- | --- | --- |
| WORKDIR | WORKDIR /usr/src/app | Sets the working directory for any RUN, CMD, COPY etc. instructions that follow it. |
| RUN | RUN npm install | Executes any terminal commands on top of the current image and commits the results. |
| CMD | CMD ["npm", "start"] or CMD ["npm", "run", "dev"] | Runs the main command with defaults. [NOTE]: There can only be one CMD instruction in a Dockerfile. |
| EXPOSE | EXPOSE 5000 | Exposes a port from the docker container, so that the docker runtime can bind the host's port to the container's port. |
| ENV | ENV PORT=8000 | Sets an environment variable. |
| COPY | COPY requirements.txt ./app | Allows files from the Docker host to be added to the Docker image. |
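Putting several of these instructions together, a hypothetical Dockerfile for a small Node.js web app might look like this (the port number and file layout are made up for illustration):

```dockerfile
# Hypothetical example combining the instructions above
FROM node:22-alpine
WORKDIR /usr/src/app
ENV PORT=8000
COPY . .
RUN npm install
EXPOSE 8000
CMD ["npm", "start"]
```

Notice the ordering: WORKDIR and ENV come before COPY and RUN, so the later instructions operate inside the configured directory and environment.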

Next Part

And with that, this part is done. It's the first part of a 3-part series. In the next part, we will learn about port mapping, env variables, and Docker layers while building a simple single-application Express server.

Link to next part: