Basics Of Docker
Docker is a container platform; it is used to containerize your project.
What do we mean by "containers" or "containerize"? They sound like big techie words.
Don't worry, we will get comfortable with them easily.

To understand Docker in a single line: "Docker provides us with a virtual environment which narrows the gap between the development and production environments."
What Are Containers ?
Before understanding containers, we should start with something at the bottom of the Docker hierarchy: 'images' (not JPEG, PNG, etc.). Basically, images are platform-independent versions of your project. That means you can execute your project anywhere, on any machine, irrespective of its environment. This is what Docker does: you don't need to worry about the machine or the dependencies; Docker handles them for you.
Conclusion: Images are platform-independent versions of my project.
You can find various images on Docker Hub.
Cool. Now that we have images, let's see what containers are.
Containers:
Containers are the coolest concept I have ever come across. They are similar to virtual machines but have a lot of differences. That sounds confusing, so let's see how they work!
Basically, a container provides your project with the environment it needs to execute. A container holds the following things:
- Operating System (Implicit)
- Dependencies (Explicit)
- Your Project (Explicit)
- Environment Variables (Explicit)
- And Many Other Things Required To Run Your Project.
Now, how are containers different from VMs? Containers require a Docker daemon, which provides them with the operating system to run your projects. Every container on your machine shares the same OS, unlike VMs, which use a hypervisor to manage a different OS for each VM. Moreover, a container takes only as much computing power as it needs to run your program, unlike a VM. Still, containers behave much the way a virtual machine would: any project running in a container won't affect the environment of your machine, because containers are completely isolated.
Conclusion: Containers are like virtual machines that share the same operating system while staying isolated from each other and from the normal computing environment of your machine, yet they provide everything needed to run your project. Containers are the future of virtualization.
How Docker Creates The Environment ?
For Docker to understand the requirements of a project (which dependencies to install, which file to execute, and which command starts the program), we design a 'Dockerfile'. Everything is stated in that Dockerfile: which dependencies to install, which command starts the execution, and so on. Example: I have a Python program that I want to dockerize so that anyone in my school batch can execute it, irrespective of whether Python is installed on their machine. So I will create the Python program and then write a Dockerfile specifying the Python environment and other important things, like which modules to install.
Below is a Dockerfile:
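(The original image is missing here; the following is a reconstruction based on the explanation below. The /app directory, the exposed port and the ENV value are illustrative.)

```dockerfile
# Base image: Python 2.7, slim variant, pulled from Docker Hub
FROM python:2.7-slim
# Working directory inside the container
WORKDIR /app
# Add the files in this folder to the working directory
ADD . /app
# Install the dependencies listed in requirements.txt
RUN pip install -r requirements.txt
# Port to expose (for a web program)
EXPOSE 80
# An environment variable
ENV NAME World
# Start command: run app.py with python
CMD ["python", "app.py"]
```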

What is that first line, 'FROM python:2.7-slim'? It names the base image, which sets up the Python environment in the container; after the colon ':', '2.7-slim' is the tag, i.e. the version of Python on hub.docker.com. 'WORKDIR' specifies the working directory inside the container. 'ADD' adds files from the folder to the working directory. 'RUN' executes a command; here I am using 'RUN' to execute 'pip install', where '-r' tells pip to read from a file, in this case 'requirements.txt'. 'EXPOSE' sets the port if it is a web program. 'ENV' specifies an environment variable. 'CMD' gives the start command; here 'python' is the command and 'app.py' is the file.
Implementing Containers:
Now that we know about containers, how do we use them?
Let's implement containers on a 'Node.js' project.
Step 1 : Download and install Docker (Community Edition)
( Docker Platform )
Step 2: Create any Node.js app. I'll be using Express.js (a framework for Node.js) for my project.
Step 3: Create a Dockerfile.
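(The original image is missing here; this is a reconstruction based on the caching explanation further below. The node:8 tag and the app.js entry point are assumptions; the app listens on port 3003, matching the run command in the next steps.)

```dockerfile
# Node.js base image from Docker Hub (tag is illustrative)
FROM node:8
# Working directory inside the container
WORKDIR /app
# Add package.json first, so the npm install layer can be cached
ADD package.json /app
RUN npm install
# Now add the rest of the app's files
ADD . /app
# The app listens on port 3003
EXPOSE 3003
# Start command (entry file name is an assumption)
CMD ["node", "app.js"]
```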

Build Your App:
1). Make sure the Docker daemon is installed and running.
2). Switch to the terminal and change the directory to the app directory where the Dockerfile is saved.
3). Type ‘ docker build -t <image name> . ’.
-t is used to tag the image with a name, and the trailing '.' tells Docker to build from all the files in the current directory.
4). Type ' docker images ' and check for the name you gave.
5). Type ' docker run -p 80:3003 <image name> '; here -p maps a host port to the port given in the Dockerfile.
6). Visit localhost:<port>. For me, <port> is 80, so plain localhost will work.
In this Dockerfile, I have added the package.json file first and used RUN before adding all the files of the app. This is because docker build implicitly checks for cached layers: if nothing has changed since my last build of the image, it reuses the cached layer instead of rebuilding it. So if I added all the files before running 'npm install', then any change, even in a template file, would rebuild the whole image. To avoid this redundant building, package.json is added first, as it rarely changes.
Step 4: Good to go, all set to run our first dockerized Node app.
1). Type ' docker run -p 80:3003 <image name> '; here -p maps a host port to the port given in the Dockerfile.
2). Visit localhost:<port>. For me, <port> is 80, so plain localhost will work.
3). Type ' docker ps '; you will see a container with running status.
4). To stop it, type ' docker stop <container id> '.
5). To remove the container, type ' docker rm <container id> '.
Hands On Docker Compose:
Now, any app without database connectivity is hardly an app. So how do we make our dockerized Node app run both the database and the app environment simultaneously? This is where Docker Compose comes into the picture.
How does Docker Compose work? Docker Compose divides environments into services, and when the app is set running, all the services declared in the compose file run simultaneously.
What is Docker Compose?
It is a file like a Dockerfile but different in structure; in it you can define as many services as you want. For my app I require a 'Node.js' as well as a 'MongoDB' environment, so instead of running both of these separately, I will declare them in a Docker Compose file and then deploy the app.
This is a docker compose file:
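(The original image is missing here; the following is a reconstruction based on the explanation below. The compose file format version is an assumption.)

```yaml
version: "3"
services:
  node:
    # Build the image from the Dockerfile in this directory
    build: .
    ports:
      - "80:3000"
  mongo:
    # Official mongo image from Docker Hub, tag 'latest'
    image: mongo:latest
    ports:
      - "27017:27017"
```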

Here you can see we have 'version' to define the version of the docker-compose.yml format, and 'services' to define the various environments or containers we'll be needing for our app. The 'node' service builds the directory with the help of the 'Dockerfile' and runs on port 80 mapped to port 3000. Similarly, the 'mongo' service uses the image 'mongo' with the tag 'latest' and works on port 27017 mapped to port 27017.
NOTE:
You can either build your own image or use an image from Docker Hub.
How To Run docker-compose.yml?
Step 1: Move to the directory which contains docker-compose.yml.
Step 2: Type 'docker-compose up'.
Step 3: Type 'docker-compose ps' to check whether both the node and mongo containers are running.
Step 4: Visit 'localhost/<route>' in your browser. In my case it is 'localhost/test'.
Want this project? Visit github/akshitgrover. (Open for contribution.)
Good luck with Docker.
Go ahead, 'dockerize' your Node app.
✉️ Subscribe to CodeBurst’s once-weekly Email Blast, 🐦 Follow CodeBurst on Twitter, view 🗺️ The 2018 Web Developer Roadmap, and 🕸️ Learn Full Stack Web Development.