Containerizing a Multi-tier Application with Docker: A Step-by-Step Guide

CodeGenitor
8 min read · Mar 7, 2024

Containerization has become a cornerstone in modern software development, offering benefits like scalability, consistency, and ease of deployment. In this post, we’ll explore how to containerize a multi-tier application using Docker, from setting up the environment to deploying the application.

Prerequisites:

Before diving into the tutorial, ensure you have a basic understanding of Docker and containerization concepts. You’ll also need Node.js and npm installed on your machine, as well as a React application setup for the frontend and Node.js for the backend.

And if you’re scratching your head thinking, “But where do I find these mythical applications?” Don’t worry, I’ve got your back. It’s like searching for buried treasure, except instead of gold doubloons, you’ll find code gems.

Grab Your Virtual Spade: Forking My Repo on GitHub for Fun and Code!

Now, before we dive into the coding circus, make sure you’ve got your GitHub gear ready. Think of it as your backstage pass to the developer world. If you don’t have it yet, no worries, just head over to GitHub and set up shop.

Once you’re all set, it’s time to fork my repo. No, not literally with a fork and knife, although that would make for an interesting coding session. Just head to my repo, hit that ‘Fork’ button like you’re in a pancake-eating contest, and voila! You’ve got your very own copy of the code to play with. It’s like getting your own personal sandbox — just remember to share your toys (or commits) with the rest of us!
SOURCE CODE: Multi-tier Application Containerization

Step 1: Project Folder HackYourBrain📁

Alrighty, let’s get organized! Think of this step like tidying up your digital workspace before diving into the coding chaos. Whether you’ve forked the repo or you’re ready to containerize your masterpiece, here’s what your project folder should look like:

Ingredients:

  • Your trusty code editor (pick your flavor!)
  • A sprinkle of Git magic
  • A dollop of determination

Instructions:

  1. Fork & Clone: If you’ve forked the repo and cloned it onto your local machine, give yourself a pat on the back! If not, no worries — just follow the GitHub fork and clone dance to bring that code home:
    git clone https://github.com/yourgithubname/projectname.git
  2. Project Prep: Now, let’s dive into your project folder. It’s like opening your toolbox before a big project — everything you need should be neatly organized and ready to go.
  3. Inspect & Adapt: Take a quick peek inside your project folder. Does everything look shipshape? Double-check for any stray files or missing ingredients.

Quick Tip:
Remember, like a magician guarding their best trick, never reveal your .env file! You’ll notice mine is nowhere to be seen in the project folder, and that’s exactly how yours should be too: out of the repo and out of sight.
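One easy way to guard that trick is to list your env files in .gitignore before your first commit. A minimal entry (assuming the usual `.env` naming) might look like this:

```
# Keep secrets out of version control
.env
.env.*

# Local noise you don't want in the repo
node_modules/
```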

With your project folder primed and ready, you’re all set to tackle the next step in your coding adventure! 🚀

Step 2: Setting up the React Application & Docker file! 🚀

Everything should be ready if you’ve already forked and cloned the project. Alright, time to level up your React game and get those containers spinning! It’s like preparing for a double feature — first, we’ll set up your React application, then we’ll whip up a Dockerfile to containerize it. Ready? Let’s go!

Ingredients:
- React (the star of the show!)
- Docker (your backstage pass to deployment magic)
- A sprinkle of terminal wizardry

Instructions:

1. **React Setup**: Fire up your code editor and navigate to your project folder. Think of this as setting the stage for your React masterpiece. Once you’re there, let’s create some React magic:
```
npx create-react-app yourprojectname
```
This command will conjure up a brand new React application named “yourprojectname”. Feel free to give it a name that suits your project!

2. **Docker Delight**: With your React app ready to rock, it’s time to containerize it with Docker. But first, we need to create a Dockerfile. Think of it as the blueprint for your container:
```
touch Dockerfile.frontend
```
This command creates a new empty file named “Dockerfile.frontend” in your frontend project folder.

3. **Dockerfile file**: Open up your Dockerfile in your code editor and let’s get cooking! We’ll add some ingredients to tell Docker how to build your container. Here’s a basic recipe to get you started:
```
# Stage 1: Build React app
FROM node:20-alpine AS builder

# Create app directory
WORKDIR /HackYourBrain-Frontend

# Copy package.json and package-lock.json to the working directory
COPY package*.json ./

# Install dependencies
RUN npm install

# Copy the rest of the application code to the working directory
COPY . .

# Build the app
RUN npm run build

# Stage 2: Set up NGINX server
FROM nginx:alpine

# Remove the default NGINX static content
RUN rm -rf /usr/share/nginx/html/*

# Copy the build files from the previous stage to the NGINX web server directory
COPY --from=builder /HackYourBrain-Frontend/build /usr/share/nginx/html

# Expose port 80 to allow external access
EXPOSE 80

# Command to start NGINX when the container starts
CMD ["nginx", "-g", "daemon off;"]
```
Save your Dockerfile when you’re done. This multi-stage build compiles your React app in a Node.js environment, then serves the static build output with NGINX — the Node.js tooling never ships in the final image.
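With the Dockerfile saved, you can build and run the image straight away. The image and container names below are just examples, and the commands assume you run them from the frontend folder:

```
# Build the frontend image using the named Dockerfile
docker build -f Dockerfile.frontend -t hackyourbrain-frontend .

# Run it, mapping port 8080 on the host to NGINX's port 80 in the container
docker run -d --name frontend -p 8080:80 hackyourbrain-frontend
```

Once it’s up, http://localhost:8080 should serve the production build of your React app.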

Step 3: Node.js app & Docker file 🐳

Alright, time to add some Node.js magic to our project and wrap it up in a Docker package! Imagine you’re crafting a digital potion — Node.js provides the spell, and Docker wraps it up in a magical container.

Ingredients:
- Node.js (the wizardry behind your backend)
- Docker (your trusty containerization cauldron)
- A sprinkle of command-line wizardry

Instructions:
1. **Node.js Charm**: First things first, let’s set up our Node.js application. Think of it as laying the foundation for your magical castle. Create your backend files, set up your routes, and sprinkle in any necessary dependencies.

2. **Docker Conjuring**: Now, let’s containerize that backend magic with Docker. It’s like bottling up your potion for easy transport! Create a Dockerfile in your project directory and add the instructions to build your Node.js container.

Here’s the basic Dockerfile recipe we’ll use in this case:
```
FROM node:20-alpine

# Install nodemon globally (live-reload during development)
RUN npm install -g nodemon

# Install pm2 globally (process manager for production)
RUN npm install pm2 -g

# Create app directory
WORKDIR /HackYourBrain-Backend

# Copy package.json and package-lock.json to the working directory
COPY package*.json ./

# Install app dependencies
RUN npm install

# Bundle app source
COPY . .

# Build the app
RUN npm run build

EXPOSE 3005

# Start the app in development mode
CMD ["npm", "run", "dev"]
```
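As with the frontend, you can sanity-check this image on its own. The Dockerfile name and image tag below are assumptions, so adjust them to match your repo:

```
# Build the backend image
docker build -f Dockerfile.backend -t hackyourbrain-backend .

# Run it, publishing the port the app listens on
docker run -d --name backend -p 3005:3005 hackyourbrain-backend
```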

With Node.js and Docker working together in harmony, your project is one step closer to becoming a digital masterpiece! 🚀

Step 4: Setting up PostgreSQL and Dockerizing Your Database 🐘🐳

Alright, let’s get that PostgreSQL database up and running and then containerize it with Docker for easy management and deployment.

Ingredients:
- PostgreSQL (the robust relational database)
- Docker (your containerization companion)
- A pinch of configuration

Instructions:
1. **PostgreSQL Setup**: First, let’s set up PostgreSQL. You can install it locally on your machine or use a cloud service. Once installed, create a database and any necessary tables and schemas. Make sure to note down the connection details (host, port, username, password) for later use.

2. **Dockerization**: Now, let’s containerize PostgreSQL using Docker. This will allow us to manage the database environment more efficiently. Create a Dockerfile in your project directory to define the PostgreSQL container.

Here’s a basic Dockerfile recipe to get you started:
```
FROM postgres:latest

# Set the environment variables (values are supplied at runtime, not hard-coded here)
ENV POSTGRES_DB=${POSTGRES_DB}
ENV POSTGRES_USER=${POSTGRES_USER}
ENV POSTGRES_PASSWORD=${POSTGRES_PASSWORD}

# Copy the database schema and seed data to the init directory;
# scripts here run automatically on first startup
COPY schema.sql /docker-entrypoint-initdb.d/
COPY seed.sql /docker-entrypoint-initdb.d/

# Expose the port
EXPOSE 5432
```
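Since the credentials are only referenced in the Dockerfile, supply them when you start the container. Everything below, from the image tag to the password, is a placeholder:

```
# Build the database image
docker build -t hackyourbrain-db .

# Run it with credentials and a named volume so data survives container restarts
docker run -d --name db \
  -e POSTGRES_DB=hackyourbrain \
  -e POSTGRES_USER=appuser \
  -e POSTGRES_PASSWORD=changeme \
  -v pgdata:/var/lib/postgresql/data \
  -p 5432:5432 \
  hackyourbrain-db
```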

Step 5: Docker Compose Configuration for Multi-Service Applications 🐳

Now, let’s streamline the management of your multi-service application using Docker Compose. This tool allows you to define and run multi-container Docker applications with ease.

Ingredients:
- Docker Compose (your orchestration maestro)
- Services configuration (for each component of your application)
- Networking setup (to facilitate communication between services)

Instructions:
1. **Create Docker Compose File**: Start by creating a `docker-compose.yml` file in your project directory. This YAML file will define the services, networks, and volumes for your application.

2. **Service Definitions**: Define each service in your application within the `services` section of the `docker-compose.yml` file. This includes specifying the Docker image, environment variables, ports, and any other configuration options.

3. **Networking Setup**: Define the networks that will connect your services together. Docker Compose automatically creates a default network for your services, but you can also define custom networks for more advanced configurations.

4. **Volumes Configuration**: If your services require persistent storage, define volumes to store data outside of the container filesystem. This ensures that your data persists even if the container is stopped or removed.

5. **Environment Variables and Secrets**: Use environment variables to pass configuration values to your services. Docker Compose also supports secrets management for sensitive information like passwords and API keys.

6. **Build and Run**: Once your `docker-compose.yml` file is configured, use the `docker-compose up` command to build and run your multi-service application. Docker Compose will create and start all the containers defined in the file, orchestrating the entire application stack.

7. **Monitoring and Management**: Docker Compose provides commands for managing your multi-service application, such as starting, stopping, and restarting containers. You can also monitor logs and view container status using Docker Compose commands.
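Putting steps 1 through 5 together, a `docker-compose.yml` for this stack might look like the sketch below. The service names, folder layout, and Dockerfile paths are assumptions based on the earlier steps, so adapt them to your repo:

```
version: "3.8"

services:
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile.frontend
    ports:
      - "8080:80"
    depends_on:
      - backend

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile.backend
    ports:
      - "3005:3005"
    env_file:
      - .env
    depends_on:
      - db

  db:
    build:
      context: ./database
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```

Then a single `docker-compose up --build` starts the whole stack. Compose places all three services on a shared default network, so the backend can reach PostgreSQL at the hostname `db` instead of localhost.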

With Docker Compose, managing multi-service applications becomes a breeze, allowing you to focus on building and deploying your application with confidence. Happy orchestrating! 🚀

Conclusion:

In conclusion, I hope this setup gets you started orchestrating your multi-service applications like a maestro leading a symphony orchestra. If this guide has been as helpful as a trusty sidekick in a superhero movie, I’d love to hear about your experience. Share your thoughts, comments, and maybe even your favorite meme with me. After all, laughter is the best medicine, especially when debugging code!

And hey, if this guide didn’t quite hit the mark, let me know your thoughts anyway. As they say, even the best bomb sometimes; it’s all part of the learning process. Remember, in the world of coding, knowledge is best shared, just like pizza at a hackathon.

So, don’t be a stranger, and if you found this guide as entertaining as a cat video on a Monday morning, consider following me for more tech content. Thank you for reading, and may your code be as elegant as a well-crafted pun. Happy orchestrating, and may the bugs be ever in your favor! 🚀
