
Just pack it
Building a WordPress Deployment Pipeline with Packer, Ansible, and GitHub Actions
Infrastructure as Code (IaC) is a game-changer for automating server deployments, and HashiCorp's Packer is one of the best tools for building machine images in the cloud. While working at one of the biggest sneaker retailers in the world, I was on a project that involved cleaning up and updating a multi-chain Packer build. In this blog post, I decided to recreate that multi-chain Packer build pipeline, which creates two AWS AMIs, and explain how it works. Since I can't give away trade secrets, I'm rebuilding a new version of the pipeline from memory and sharing what I learned. The pipeline builds a base image with WordPress installed on Ubuntu 20.04 using Ansible, and an enhanced image that builds on the first by adding WordPress plugins and security packages. At work the glue that held it all together was Jenkins, but because I no longer work there and dislike managing Jenkins and its jobs in general, I'll test and automate the entire process with a GitHub Actions workflow that chains the builds and passes the base AMI ID to the enhanced build.
Why Packer
Packer allows you to create consistent, immutable machine images across platforms like AWS, Azure, and VirtualBox. For WordPress, this means you can pre-bake AMIs with all dependencies (web server, database, PHP, and WordPress itself), giving you fast, scalable, and reliable deployments across your environments. By chaining builds, you can create a modular pipeline: a base image with the core setup and an enhanced image with customizations like plugins and security configurations. This lets developers work on the base image and tweak it how they see fit, or tweak an enhanced image to make sure security and vulnerability tests are met. If you run AMIs in different regions of the world, this also lets you add translation plugins or GDPR security testing to a region-specific enhanced image, for example one serving European deployments. Packer essentially "PACKS" all the files, packages, and filesystem contents into a reusable image you can build cloud servers from.
Project Goals
The goal is to create two AMIs:
- Base Image: An Ubuntu 20.04 AMI with Nginx, MariaDB, PHP, and WordPress installed and configured using Ansible.
- Enhanced Image: An AMI built from the base image, adding WordPress plugins (e.g., Yoast SEO, Akismet) and security packages (e.g., UFW, Fail2Ban, ClamAV) via Ansible.
- Use a GitHub Actions workflow to:
  - Build the base image.
  - Extract the resulting AMI ID.
  - Pass the AMI ID to the enhanced image build.
  - Store build logs as artifacts for debugging.
The File Structure
To follow along, you can clone this repository:
├── base-image
│ ├── ansible
│ │ ├── files
│ │ │ ├── nginx.conf.j2
│ │ │ ├── wordpress.sql
│ │ │ └── wp-config.php.j2
│ │ ├── playbook.yml
│ │ └── scripts
│ │ └── install_wordpress.sh
│ ├── packer-base.json
│ └── scripts
│ └── bootstrap.sh
├── enhanced-image
│ ├── ansible
│ │ ├── files
│ │ │ └── secure-wordpress.sh
│ │ ├── playbook-enhanced.yml
│ │ └── scripts
│ │ └── install_wordpress.sh
│ ├── packer-enhanced.json
│ └── scripts
│ └── bootstrap.sh
├── .github/workflows/packer-build.yml
└── README.md
Base Image with WordPress
The base image sets up a fully functional WordPress installation. Here’s a breakdown of the key components.
Packer Template: base-image/packer-base.json
This template uses the amazon-ebs builder to create an AMI from an Ubuntu 20.04 base image. It runs a bootstrap script to install Ansible and then uses an Ansible provisioner to set up WordPress.
{
"variables": {
"aws_access_key": "",
"aws_secret_key": "",
"aws_region": "us-east-1",
"ami_name": "wordpress-base-",
"ssh_username": "ubuntu"
},
"builders": [
{
"type": "amazon-ebs",
"access_key": "",
"secret_key": "",
"region": "",
"instance_type": "t2.micro",
"source_ami_filter": {
"filters": {
"virtualization-type": "hvm",
"name": "ubuntu/images/*ubuntu-focal-20.04-amd64-server-*",
"root-device-type": "ebs"
},
"owners": ["099720109477"],
"most_recent": true
},
"ami_name": "",
"ssh_username": "",
"associate_public_ip_address": true,
"force_deregister": true,
"force_delete_snapshot": true
}
],
"provisioners": [
{
"type": "shell",
"script": "scripts/bootstrap.sh"
},
{
"type": "ansible",
"playbook_file": "ansible/playbook.yml",
"extra_arguments": ["--extra-vars", "db_name=wordpress db_user=wp_user db_password=securepassword"],
"ansible_env_vars": ["ANSIBLE_HOST_KEY_CHECKING=False"]
}
]
}
Ansible Playbook: base-image/ansible/playbook.yml
The Ansible playbook installs Nginx, MariaDB, PHP, and WordPress, configures the database, and sets up Nginx to serve the WordPress site. Key tasks include (see the sketch after this list):
- Installing dependencies (nginx, mariadb-server, php-fpm, etc.).
- Creating a MySQL database and user.
- Downloading and extracting WordPress.
- Configuring wp-config.php and Nginx using Jinja2 templates.
- Setting proper permissions for the www-data user.
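To make that concrete, here's a condensed sketch of what such a playbook could look like. It isn't the exact playbook from the repo: task names, package lists, and paths are illustrative, the database variables arrive from the --extra-vars set in the Packer template, and the MySQL tasks assume the community.mysql modules that ship with the full Ansible package.
---
- hosts: all
  become: true
  # db_name, db_user, and db_password are injected by Packer via --extra-vars
  tasks:
    - name: Install Nginx, MariaDB, and PHP
      apt:
        name: [nginx, mariadb-server, php-fpm, php-cli, php-mysql, python3-pymysql]
        state: present
        update_cache: true

    - name: Create the WordPress database
      mysql_db:
        name: "{{ db_name }}"
        state: present
        login_unix_socket: /var/run/mysqld/mysqld.sock

    - name: Create the WordPress database user
      mysql_user:
        name: "{{ db_user }}"
        password: "{{ db_password }}"
        priv: "{{ db_name }}.*:ALL"
        state: present
        login_unix_socket: /var/run/mysqld/mysqld.sock

    - name: Download and extract WordPress
      unarchive:
        src: https://wordpress.org/latest.tar.gz
        dest: /var/www/html
        remote_src: true
        owner: www-data
        group: www-data

    - name: Render wp-config.php from the Jinja2 template
      template:
        src: files/wp-config.php.j2
        dest: /var/www/html/wordpress/wp-config.php
        owner: www-data
        group: www-data

    - name: Render the Nginx site config
      template:
        src: files/nginx.conf.j2
        dest: /etc/nginx/sites-available/wordpress
      notify: Restart nginx

  handlers:
    - name: Restart nginx
      service:
        name: nginx
        state: restarted
During the Packer build, the ansible provisioner runs this playbook over SSH against the temporary EC2 instance before the AMI snapshot is taken.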
The Bootstrap Script: base-image/scripts/bootstrap.sh
This script makes sure Ansible is available on the image being built:
#!/bin/bash
set -ex
sudo apt-get update
sudo apt-get install -y software-properties-common
sudo apt-add-repository --yes --update ppa:ansible/ansible
sudo apt-get update
sudo apt-get install -y ansible
To build the base image locally, run:
cd base-image
packer build packer-base.json
This creates an AMI named wordpress-base- followed by a timestamp.
Enhanced Image with Plugins and Security
The enhanced image builds on the base AMI, adding WordPress plugins and security packages.
Packer Template: enhanced-image/packer-enhanced.json
This template uses the base AMI ID as an input variable:
{
"variables": {
"aws_access_key": "",
"aws_secret_key": "",
"aws_region": "us-east-1",
"ami_name": "wordpress-enhanced-",
"ssh_username": "ubuntu",
"base_ami_id": ""
},
"builders": [
{
"type": "amazon-ebs",
"access_key": "",
"secret_key": "",
"region": "",
"instance_type": "t2.micro",
"source_ami": "",
"ami_name": "",
"ssh_username": "",
"associate_public_ip_address": true,
"force_deregister": true,
"force_delete_snapshot": true
}
],
"provisioners": [
{
"type": "ansible",
"playbook_file": "ansible/playbook-enhanced.yml",
"extra_arguments": ["--extra-vars", "wp_plugins='yoast-seo akismet'"],
"ansible_env_vars": ["ANSIBLE_HOST_KEY_CHECKING=False"]
}
]
}
Ansible Playbook: enhanced-image/ansible/playbook-enhanced.yml
This playbook installs security packages and WordPress plugins (sketched after this list):
- Security Packages: Installs ufw, fail2ban, and clamav for firewall, intrusion prevention, and antivirus protection.
- WordPress Plugins: Uses WP-CLI to install and activate plugins like Yoast SEO and Akismet.
- Hardening: Runs a script to disable Nginx directory listing and update WordPress core.
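Here's a rough sketch of those tasks, under the same caveats as the base playbook sketch: task names are illustrative, module short names assume the full Ansible package (community collections included), WP-CLI is pulled from its official phar build, and the plugin slugs simply mirror the wp_plugins variable passed by the Packer template.
---
- hosts: all
  become: true
  vars:
    wp_plugins: "yoast-seo akismet" # overridden by --extra-vars from Packer
  tasks:
    - name: Install security packages
      apt:
        name: [ufw, fail2ban, clamav]
        state: present
        update_cache: true

    - name: Install WP-CLI
      get_url:
        url: https://raw.githubusercontent.com/wp-cli/builds/gh-pages/phar/wp-cli.phar
        dest: /usr/local/bin/wp
        mode: "0755"

    - name: Install and activate WordPress plugins
      command: wp plugin install {{ item }} --activate --path=/var/www/html/wordpress --allow-root
      loop: "{{ wp_plugins.split() }}"

    - name: Allow SSH and web traffic through UFW
      ufw:
        rule: allow
        name: "{{ item }}"
      loop: ["OpenSSH", "Nginx Full"]

    - name: Enable UFW
      ufw:
        state: enabled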
Security Script: enhanced-image/ansible/files/secure-wordpress.sh
#!/bin/bash
set -ex
# Disable directory listing
sed -i 's/autoindex on/autoindex off/' /etc/nginx/sites-available/wordpress
# Restart Nginx
systemctl restart nginx
# Update WordPress core
wp core update --path=/var/www/html/wordpress --allow-root
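One way to wire this script into the playbook is Ansible's built-in script module, which copies a local file to the instance and runs it there (the task name is illustrative):
- name: Harden Nginx and update WordPress core
  # The script module transfers files/secure-wordpress.sh to the instance and executes it
  script: files/secure-wordpress.sh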
To build the enhanced image locally, you’d need the base AMI ID:
cd enhanced-image
packer build -var "base_ami_id=ami-1234567890abcdef0" packer-enhanced.json
Manually passing the AMI ID around is tedious, so let's automate it with GitHub Actions.
Automating with GitHub Actions
The GitHub Actions workflow (packer-build.yml) automates the build process, chaining the base and enhanced image builds and passing the base AMI ID along automatically.
Workflow: .github/workflows/packer-build.yml
name: Build Packer AMIs

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
  workflow_dispatch:

jobs:
  build-base-image:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Packer
        uses: hashicorp/setup-packer@v3
        with:
          version: 1.10.0 # Adjust to the desired version

      - name: Build base image
        id: build_base
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          cd base-image
          # packer init only supports HCL2 templates, so install the Amazon plugin directly
          packer plugins install github.com/hashicorp/amazon
          packer build -force packer-base.json > packer-base-output.log
          # Extract the AMI ID from the Packer output
          AMI_ID=$(grep -oP 'ami-[0-9a-f]{17}' packer-base-output.log | tail -1)
          echo "ami_id=$AMI_ID" >> "$GITHUB_OUTPUT"
        continue-on-error: false

      - name: Upload base image log
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: packer-base-log
          path: base-image/packer-base-output.log
          retention-days: 5
    outputs:
      base_ami_id: ${{ steps.build_base.outputs.ami_id }}

  build-enhanced-image:
    needs: build-base-image
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Packer
        uses: hashicorp/setup-packer@v3
        with:
          version: 1.10.0

      - name: Build enhanced image
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          BASE_AMI_ID: ${{ needs.build-base-image.outputs.base_ami_id }}
        run: |
          cd enhanced-image
          packer plugins install github.com/hashicorp/amazon
          packer build -var "base_ami_id=$BASE_AMI_ID" -force packer-enhanced.json > packer-enhanced-output.log

      - name: Upload enhanced image log
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: packer-enhanced-log
          path: enhanced-image/packer-enhanced-output.log
          retention-days: 5
How It All Works
- Triggers: Runs on push or pull requests to the main branch, or manually via workflow_dispatch.
- Base Image Job:
  - Checks out the code and sets up Packer.
  - Builds the base image and captures the output in a log file.
  - Extracts the AMI ID using grep and publishes it as a step output for the next job.
  - Uploads the build log as an artifact.
- Enhanced Image Job:
  - Depends on the base image job, so it only runs after the base AMI is created.
  - Uses the extracted AMI ID (BASE_AMI_ID) to build the enhanced image.
  - Uploads the build log.
- Secrets: AWS credentials are stored as GitHub Secrets (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY).
Setting Up GitHub Actions
- Add AWS credentials to your repository’s secrets (Settings > Secrets and variables > Actions > New repository secret).
- Commit the workflow file and Packer templates to your repository.
- Push to the main branch or trigger the workflow manually from the GitHub Actions tab.
Conclusion
The workflow will build both AMIs and output their IDs in the logs. Check the artifacts for detailed logs if anything goes wrong. This Packer and GitHub Actions pipeline shows the power of Infrastructure as Code. Whether you’re running a single blog, an ecommerce site, or a fleet of WordPress sites, this approach ensures consistency, scalability, and repeatability.
Check out the full code in my GitHub repo, linked here.