Use AWS CI Tools to Sync a Static Website on S3 with a Git Repository

With CodeCommit, CodePipeline, CodeBuild, and S3

James Grunewald
3 min read · Oct 30, 2018
“person pointing at blue sticky note on wall” by rawpixel on Unsplash

Recently I’ve been working on a side project with two distinct components. One of them is a React app that I host in an S3 bucket and distribute with AWS CloudFront. It’s the epitome of hosting a modern web application in a serverless manner. I’m quite proud!

As a modern developer obsessed with automation, I grew tired of running “npm run build” and then syncing the build directory with the S3 bucket. So I thought to myself, “I should automate this!” I looked at the AWS console and saw the pipeline tool, the build tool, and I was already using CodeCommit (because it lets me have private repos for free). So it seemed this should be rather straightforward. Long story short, it was not.
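For context, this is the manual two-step I kept repeating (the bucket name is a placeholder):

npm run build
aws s3 sync build/ s3://<your bucket name>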

I learned you can’t simply have a CodeBuild project run automatically after a change to a CodeCommit repository. This seems like an obvious oversight, and oddly you can do exactly that if you integrate with GitHub (perhaps other services too, but I didn’t confirm). You can, however, create a pipeline that uses your CodeCommit repository as its source and then runs the CodeBuild project in a later stage, as sketched below.
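If you prefer the CLI to the console, that two-stage pipeline can be described in JSON and created with “aws codepipeline create-pipeline --cli-input-json file://pipeline.json”. This is only a rough sketch: the pipeline name, role ARN, artifact bucket, repository, branch, and project name are all placeholders you’d swap for your own, and the pipeline service role must already exist.

{
  "pipeline": {
    "name": "my-site-pipeline",
    "roleArn": "arn:aws:iam::111111111111:role/my-pipeline-role",
    "artifactStore": {
      "type": "S3",
      "location": "my-pipeline-artifact-bucket"
    },
    "stages": [
      {
        "name": "Source",
        "actions": [
          {
            "name": "Source",
            "actionTypeId": {
              "category": "Source",
              "owner": "AWS",
              "provider": "CodeCommit",
              "version": "1"
            },
            "configuration": {
              "RepositoryName": "my-repo",
              "BranchName": "master"
            },
            "outputArtifacts": [{ "name": "SourceOutput" }]
          }
        ]
      },
      {
        "name": "Build",
        "actions": [
          {
            "name": "Build",
            "actionTypeId": {
              "category": "Build",
              "owner": "AWS",
              "provider": "CodeBuild",
              "version": "1"
            },
            "configuration": { "ProjectName": "my-site-build" },
            "inputArtifacts": [{ "name": "SourceOutput" }],
            "outputArtifacts": [{ "name": "BuildOutput" }]
          }
        ]
      }
    ]
  }
}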

Then I learned how quirky artifacts are in CodeBuild. Let’s assume you’re using GitHub to host your repository. In that case, you’d expect to just let the CodeBuild project run with an S3 bucket as the artifact target. Well, for some reason the bucket always ends up with a folder at the root containing the files you specified. You can rename that folder through CodeBuild, but there’s no way to get rid of it. This of course is not ideal for hosting a static website from an S3 bucket; we want things in the root most of the time.
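To make the quirk concrete, here’s a sketch of the artifacts section of a buildspec; the base-directory and name are assumptions for a typical React build:

artifacts:
  files:
    - '**/*'
  base-directory: build
  # whatever name you pick here (or in the project settings) becomes
  # a folder under the bucket root -- you can rename it, not remove it
  name: site-output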

It also seems you can’t do this in the pipeline scenario at all. If you use CodeBuild within a pipeline, you must set the artifact target to the CodePipeline, so you can only feed the resulting artifact into other stages, not into an S3 bucket. This is great if you want to do integration testing or deploy to Elastic Beanstalk, but terrible for our use case.

At this point I was reading forums and confirming that everything I’d discovered was real and the features really don’t exist. One proposed solution was to set a trigger on the build project to launch a Lambda function, but honestly it seemed like too much effort, so I never pursued it. Then I discovered that the AWS CLI is preinstalled on the managed build images, so I figured I could just add a command to sync with the S3 bucket.

That almost worked, but there was one caveat: you need to find the role used by the CodeBuild project and give it permission to write to your S3 bucket. Once you grant that, you can sync any directories you want and choose where they land in the bucket. In my case it was important (meaning I didn’t want to reconfigure anything) for the files to end up in the root of the bucket.
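A minimal sketch of the kind of policy statement you’d attach to that role (the bucket name is a placeholder): aws s3 sync needs at least s3:ListBucket on the bucket and s3:PutObject on its objects, and s3:DeleteObject only matters if you sync with --delete.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::<your bucket name>"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::<your bucket name>/*"
    }
  ]
}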

Here’s a summary of the steps:

  • Create a CodePipeline (only necessary if using CodeCommit)
  • Create a CodeBuild project (you’ll be prompted to create one while setting up the pipeline)
  • Edit the IAM role used by the CodeBuild project to grant it S3 permissions
  • Add the AWS CLI commands needed to sync the build output to the bucket

Here’s an example build specification for CodeBuild:

version: 0.2

phases:
  install:
    commands:
      - echo Installing Packages with NPM
      - npm install
  build:
    commands:
      - echo Build started on `date`
      - npm run build
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Uploading contents to S3
      - aws s3 sync build/ s3://<bucket name here and path>
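One optional addition, since the site is distributed through CloudFront: new files in the bucket won’t show up until cached copies expire, so you could append an invalidation to the post_build commands. The distribution ID is a placeholder, and the build role would also need the cloudfront:CreateInvalidation permission:

      # assumes the role can call cloudfront:CreateInvalidation
      - aws cloudfront create-invalidation --distribution-id <your distribution id> --paths "/*"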

And that’s it, folks. Now you can simply push your changes and brag to your friends about your full CD pipeline! Leave me a comment with feedback, thanks, or disappointment!
