Building a Powerful CI/CD Pipeline with GitHub Actions — A Complete Guide
A CI/CD pipeline is like a virtual assistant for your code — testing, building, and deploying changes automatically. With GitHub Actions, you can set up this seamless process directly in GitHub, turning hours of manual work into an instant process. In this guide, I’ll show you how to set up a CI/CD pipeline using one of my projects as an example, so you can follow along and build your own!
Understanding the Basics of a CI/CD Pipeline
In simple terms, a CI/CD pipeline is a workflow that automates testing, building, and deploying your code. CI (Continuous Integration) means every time you make changes, your code is automatically tested to catch errors early. CD (Continuous Delivery or Deployment) takes it a step further by making sure every tested change can go live.
By setting up a CI/CD pipeline, you’re not only saving time but ensuring your code is always in tip-top shape and ready to go live at any moment. This can be a lifesaver for both solo projects and team collaborations, where fast feedback and reliable releases are essential.
Getting Started with GitHub Actions for CI/CD
Let’s dive into setting up your first pipeline using GitHub Actions, GitHub’s built-in automation tool. Think of GitHub Actions as a way to tell GitHub, “When something changes in my code, I want you to do this, then this, then this…”
Step 1: Fork and Clone My Repository on GitHub
To get started, fork and clone my project repository on GitHub. If you don’t already have a project to work with, no worries! You can use my sample project as a starting point: fork it on GitHub, clone your fork locally, and you’ll be ready to follow along with each step of this tutorial.
Step 2: Create the Workflow File for GitHub Actions
Now, let’s set up the core of your CI/CD pipeline! In the root directory of your project, create a YAML file at `.github/workflows/main.yml`. This file will define the steps GitHub Actions will take to automate your workflow.
You can create this file directly, or use the terminal with this command:

```shell
mkdir -p .github/workflows && touch .github/workflows/main.yml
```
Step 3: Define the Workflow Steps in main.yml
Now that you have your workflow file set up, let’s define the actions GitHub will take whenever there’s an update to your code.
Open the `.github/workflows/main.yml` file and add the following code:
```yaml
name: Stripe Payments CI/CD Pipeline

on:
  push:
    branches:
      - main # Set a branch to watch for changes
    paths:
      - "backend/**"
      - "frontend/**"
      - ".github/workflows/main.yml"
  pull_request:
    branches:
      - main # Set a branch to watch for changes

jobs:
  frontend-CI:
    runs-on: ubuntu-latest
    steps:
      ## Load the source code ##
      - name: Checkout the code
        uses: actions/checkout@v4

      ## Setup Node.js ##
      - name: Setup node
        uses: actions/setup-node@v4
        with:
          node-version: "20"

      ## Install dependencies ##
      - name: Install dependencies
        run: |
          npm install -g yarn
          cd frontend
          yarn install

      ## Run tests ##
      - name: Run tests
        run: |
          cd frontend
          yarn test

      ## Code Quality ##
      - name: Code Quality
        run: |
          cd frontend
          yarn lint
```
Step 4: Managing Secrets in GitHub Actions
In CI/CD, we often need sensitive data like API keys, database passwords, or SSH keys to connect to external services. GitHub Actions makes it easy to manage these securely through Secrets.
Here’s how to add secrets to your GitHub repository:
- Navigate to Repository Settings: Go to your GitHub repository, then click on Settings > Secrets and variables > Actions > New repository secret.
- Add Secrets: For example, to deploy to AWS, add `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` here. Similarly, add `S3_BUCKET` with the name of your bucket (the deployment workflow reads it as `secrets.S3_BUCKET`).
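Once saved, a secret is available to workflow steps through the `secrets` context, typically passed in as an environment variable. As a minimal sketch (the step name and echo are just illustrative; GitHub masks secret values in log output):

```yaml
steps:
  - name: Show deployment target
    # The secret is injected as an env var; GitHub redacts its value in logs
    env:
      TARGET_BUCKET: ${{ secrets.S3_BUCKET }}
    run: echo "Deploying to bucket $TARGET_BUCKET"
```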
Step 5: Deploying to Amazon S3 with GitHub Actions
With GitHub Actions configured and secrets set up, we’re ready to deploy to Amazon S3. Hosting your static site on S3 gives you a reliable, globally accessible environment that can handle high traffic effortlessly. Here’s how to extend the workflow to deploy to S3:
```yaml
name: Stripe Payments CI/CD Pipeline

on:
  push:
    branches:
      - main
    paths:
      - "backend/**"
      - "frontend/**"
      - ".github/workflows/main.yml"
  pull_request:
    branches:
      - main

jobs:
  frontend-CI:
    runs-on: ubuntu-latest
    steps:
      ## Checkout the code from the repository ##
      - name: Checkout the code
        uses: actions/checkout@v4

      ## Setup Node.js ##
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "20"

      ## Install dependencies ##
      - name: Install dependencies
        run: |
          cd frontend
          yarn install --frozen-lockfile

      ## Run tests ##
      - name: Run tests
        run: |
          cd frontend
          yarn test

      ## Code Quality ##
      - name: Code Quality
        run: |
          cd frontend
          yarn lint

  ## Deploy the frontend to S3 ##
  Deploy-Frontend:
    runs-on: ubuntu-latest
    needs: frontend-CI
    steps:
      ## Checkout the code from the repository ##
      - name: Checkout the code
        uses: actions/checkout@v4

      ## Install dependencies ##
      - name: Install dependencies
        run: |
          cd frontend
          yarn install --frozen-lockfile

      ## Build the frontend ##
      - name: Build frontend
        run: |
          cd frontend
          yarn build

      ## Deploy to S3 ##
      - name: Deploy to S3
        uses: jakejarvis/s3-sync-action@master
        with:
          args: --follow-symlinks --delete
        env:
          AWS_S3_BUCKET: ${{ secrets.S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: "us-west-1"
          SOURCE_DIR: "frontend/build"
```
Step 6: Configure S3 Policy to Allow Read Access from Anywhere
To make your static site or assets accessible to anyone on the web, you need to set up a bucket policy in Amazon S3. Since Object Ownership is set to “Bucket owner enforced,” we use bucket policies instead of ACLs to manage access permissions.
A bucket policy lets you control who can access the objects in your S3 bucket, such as granting public read access. This is especially useful if you’re hosting a static website or making assets (like images, CSS, or JavaScript files) publicly accessible.
How to Set Up a Public Read-Only Policy
- Navigate to Your Bucket’s Permissions: Go to the S3 Console, select your bucket, and then click on the Permissions tab.
- Edit the Bucket Policy: Scroll to Bucket policy and click Edit.
- Add a Public Read Policy: Copy and paste the following policy, replacing `your-bucket-name` with the name of your bucket. This policy will allow anyone to read objects in your bucket.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```
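Before saving, you can sanity-check the policy locally; malformed JSON is the most common reason the console rejects a policy. A quick check with Python’s built-in `json.tool` (available on most dev machines) looks like this:

```shell
# Write the policy to a scratch file (your-bucket-name is a placeholder)
cat > policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
EOF

# json.tool exits non-zero on a syntax error, catching typos before you paste
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"
```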
- Save the Policy: Click Save to apply the new permissions.
What This Policy Does
- `"Effect": "Allow"`: Grants permission for the specified actions.
- `"Principal": "*"`: Allows access to anyone, making the objects public.
- `"Action": "s3:GetObject"`: Grants read-only access to the objects in the bucket.
- `"Resource": "arn:aws:s3:::your-bucket-name/*"`: Applies this rule to all objects in the specified bucket.
Pro-Tip: Optimizing Pipeline Performance with GitHub Actions Cache
One of the best ways to speed up your CI/CD pipeline is by using caching. Caching allows GitHub Actions to store reusable files, like dependencies, so that they don’t need to be reinstalled every time the workflow runs. This is especially useful for dependencies in Node.js projects, where package installation can take up a lot of time. By caching your Node modules, you’ll see a faster workflow and a more efficient use of resources.
How Does Caching Work?
GitHub Actions offers a built-in cache action, `actions/cache`, that lets you store files between workflow runs. When the cache action is added to a job, GitHub checks if there is an existing cache available for the specified key:
- If a cache is available, it’s restored, speeding up tasks like dependency installation.
- If there’s no cache or if the cache is outdated, GitHub creates a new one after the job runs, making it available for the next workflow run.
Setting Up Cache for Node Modules in GitHub Actions
Let’s configure caching for the Node modules used in our frontend project. This will ensure that each workflow run reuses cached dependencies, saving valuable time.
Here’s how it’s set up in our workflow:
- Specify the Path: Use `frontend/node_modules` as the path to cache.
- Set a Unique Cache Key: Create a cache key based on the operating system and a hash of the `yarn.lock` file. This way, the cache updates only when `yarn.lock` changes, ensuring that any updates to dependencies are captured.
- Define Restore Keys: Use restore keys as a fallback in case an exact cache match isn’t available. This allows GitHub to use partial matches, which is useful for general dependencies across environments.
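To see why hashing the lockfile makes a good key, here is a small local simulation of the idea behind `hashFiles` (which produces a SHA-256 based digest of the matched files; the lockfile contents below are throwaway examples):

```shell
# Simulate the cache key: hash a throwaway lockfile, change it, hash again
printf 'react@18.2.0\n' > yarn.lock
KEY1="linux-node-$(sha256sum yarn.lock | cut -d' ' -f1)"

# A dependency update rewrites the lockfile, which changes the hash
printf 'react@18.3.0\n' > yarn.lock
KEY2="linux-node-$(sha256sum yarn.lock | cut -d' ' -f1)"

# Different contents produce different keys, so the cache is rebuilt
# only when dependencies actually change
[ "$KEY1" != "$KEY2" ] && echo "keys differ: cache would be refreshed"
```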
Example Code
Here’s how caching is applied in the workflow file:
```yaml
name: Stripe Payments CI/CD Pipeline

on:
  push:
    branches:
      - main
    paths:
      - "backend/**"
      - "frontend/**"
      - ".github/workflows/main.yml"
  pull_request:
    branches:
      - main

jobs:
  frontend-CI:
    runs-on: ubuntu-latest
    steps:
      ## Checkout the code from the repository ##
      - name: Checkout the code
        uses: actions/checkout@v4

      ## Setup Node.js ##
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "20"

      ## Cache Node modules ##
      - name: Cache Node modules
        uses: actions/cache@v3
        with:
          path: frontend/node_modules
          key: ${{ runner.os }}-node-${{ hashFiles('**/frontend/yarn.lock') }}
          restore-keys: |
            ${{ runner.os }}-node-

      ## Install dependencies ##
      - name: Install dependencies
        run: |
          cd frontend
          yarn install --frozen-lockfile

      ## Run tests ##
      - name: Run tests
        run: |
          cd frontend
          yarn test

      ## Code Quality ##
      - name: Code Quality
        run: |
          cd frontend
          yarn lint

  ## Deploy the frontend to S3 ##
  Deploy-Frontend:
    runs-on: ubuntu-latest
    needs: frontend-CI
    steps:
      ## Checkout the code from the repository ##
      - name: Checkout the code
        uses: actions/checkout@v4

      ## Cache Node modules ##
      - name: Cache Node modules
        uses: actions/cache@v3
        with:
          path: frontend/node_modules
          key: ${{ runner.os }}-node-${{ hashFiles('**/frontend/yarn.lock') }}
          restore-keys: |
            ${{ runner.os }}-node-

      ## Install dependencies ##
      - name: Install dependencies
        run: |
          cd frontend
          yarn install --frozen-lockfile

      ## Build the frontend ##
      - name: Build frontend
        run: |
          cd frontend
          yarn build

      ## Deploy to S3 ##
      - name: Deploy to S3
        uses: jakejarvis/s3-sync-action@master
        with:
          args: --follow-symlinks --delete
        env:
          AWS_S3_BUCKET: ${{ secrets.S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: "us-west-1"
          SOURCE_DIR: "frontend/build"
```
Benefits of Caching
- Reduces Workflow Time: With dependencies restored from the cache, the `yarn install` step has little left to do and finishes far faster in most runs.
- Improves Efficiency: Caching makes your CI/CD pipeline leaner, using fewer resources to complete builds.
- Scales Well for Frequent Builds: For projects with regular updates and testing, caching reduces redundant work, enhancing productivity.
Conclusion
And that’s it! You’ve built a complete CI/CD pipeline with GitHub Actions and made it faster with caching. Each time you push code, your changes are tested, built, and deployed automatically, freeing you up to focus on the exciting parts of development instead of waiting on manual steps.
If this guide helped, consider following me for more tips and guides that simplify tech, keep things running smoothly, and make automation work for you!