
This is a guide to setting up and deploying your static site with Hugo on AWS infrastructure. I’ll explain how to set up an automated deployment with AWS CodePipeline using a CloudFormation template, and an AWS Lambda function to generate our site and deploy it to S3.

I’ve dedicated a section to costs as well, to show what kind of savings you could realize by running a static site on AWS. See 7. What does it cost? to get an idea. The short story: you can run this static site with a decent amount of traffic for about $1.00 per month.

We’ll cover the following topics:

  1. Let’s get started
  2. Setup AWS S3 buckets
  3. Setup AWS Route53 DNS
  4. Setup AWS CodeCommit
  5. AWS Lambda: Static site Generator
  6. Custom AWS CodePipeline setup with AWS CloudFormation
  7. What does it cost?
  8. Comments

1. Let’s get started

First we need an Amazon Web Services account. Go to Amazon Web Services and sign up for the 1-year free tier account if you haven’t already.

Also check out my GitHub repository that accompanies this guide: https://github.com/rpstreef/static-site-generator

Next up, let’s look at a brief overview of what we’ll be building, which AWS services are used, and how they are connected.

AWS Static Site setup

  • AWS CodeCommit, git repository for your source code.
  • AWS Lambda, runs code on Amazon-managed servers.
  • AWS S3, hosting on durable file storage.
    • Source code bucket: once the automation process runs, the master branch from your CodeCommit repo will be copied to this bucket.
    • Site bucket: contains the generated website deployed by the Lambda function.
  • AWS CodePipeline, our workflow defined in a CloudFormation template.
  • AWS CloudFormation, infrastructure as code. Deploys our Lambda function among other things.
  • AWS Route53, (optional) domain name setup.
  • AWS CloudWatch, Lambda function execution logs can be found here for this stack.

2. Setup AWS S3 buckets

First, let’s set up the website S3 bucket. Login to the AWS Console and head on over to S3:

  • Create your website bucket, choose a name in the format of the URL you will host it on. For instance, ‘blog.fxaugury.com’.
  • In the bucket properties, make sure the bucket is set to static website hosting and that CORS is enabled.
    • CORS controls which origins are allowed to access the bucket’s content.
  • Check the permissions on your website S3 bucket: click on it, then go to Permissions > view the table “Manage public permissions”.
    • Make sure the “Everyone” group has read permissions for both “Object access” and “Permissions access”.

Please note: when testing your website S3 bucket, if index.html does not display, the likely cause is that index.html (and possibly other files as well) is not set to public.
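The console steps above can also be scripted with boto3. The sketch below is illustrative: the bucket name reuses the example from this guide, the 404.html error document is an assumption, and the calls in configure_website_bucket require AWS credentials to actually run:

```python
import json

BUCKET = 'blog.fxaugury.com'  # example bucket name from this guide

# Public-read policy so every object in the website bucket is readable;
# this avoids the index.html problem noted above.
PUBLIC_READ_POLICY = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'PublicReadGetObject',
        'Effect': 'Allow',
        'Principal': '*',
        'Action': 's3:GetObject',
        'Resource': 'arn:aws:s3:::{}/*'.format(BUCKET),
    }],
}

# CORS rule allowing read access to the bucket's content from the browser.
CORS_CONFIG = {
    'CORSRules': [{
        'AllowedOrigins': ['*'],
        'AllowedMethods': ['GET', 'HEAD'],
        'AllowedHeaders': ['*'],
        'MaxAgeSeconds': 3000,
    }],
}


def configure_website_bucket(bucket=BUCKET):
    """Apply website hosting, the public-read policy, and CORS to the bucket."""
    import boto3  # requires AWS credentials when actually called
    s3 = boto3.client('s3')
    s3.put_bucket_website(Bucket=bucket, WebsiteConfiguration={
        'IndexDocument': {'Suffix': 'index.html'},
        'ErrorDocument': {'Key': '404.html'},  # assumed error page
    })
    s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(PUBLIC_READ_POLICY))
    s3.put_bucket_cors(Bucket=bucket, CORSConfiguration=CORS_CONFIG)
```

A bucket-level policy like this makes every object public in one place, so individual object ACLs no longer matter.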

3. Setup AWS Route53 DNS

Now that the S3 bucket for your static site has been set up, we can (optionally) connect it to our domain via Route53. Follow the steps outlined by Amazon to register a domain, or transfer an existing domain to Amazon. Once that is done, we need to adapt the hosted zone for the domain you want to use.

Go to Route53, select the domain name you want to use for the static site, and create a new CNAME record set:

  • Value: Set the S3 bucket endpoint URL.
    • It’s the URL that contains the string “s3-website”. To find the endpoint, go to your website bucket > Properties > Static website hosting > “Endpoint”.
  • Alias: No
  • TTL: 300 seconds; the default propagation is sufficient. If you’re migrating, you probably want to lower this number to avoid downtime.
  • Routing Policy: Simple
  • Name: The domain name you would like to use instead, e.g. blog.fxaugury.com
  • Save record set.

Done! After 300 seconds the changes should have propagated and you can use the domain name that you have registered.
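The same record set can be created with boto3’s change_resource_record_sets call. A sketch; the hosted zone ID and endpoint in the commented example are placeholders, and the helper names are my own:

```python
def cname_change_batch(name, s3_website_endpoint, ttl=300):
    """Build the ChangeBatch for the simple CNAME record described above."""
    return {
        'Changes': [{
            'Action': 'UPSERT',  # create the record, or update it if it exists
            'ResourceRecordSet': {
                'Name': name,
                'Type': 'CNAME',
                'TTL': ttl,
                'ResourceRecords': [{'Value': s3_website_endpoint}],
            },
        }],
    }


def create_record(zone_id, name, endpoint):
    import boto3  # requires AWS credentials when actually called
    route53 = boto3.client('route53')
    route53.change_resource_record_sets(
        HostedZoneId=zone_id,
        ChangeBatch=cname_change_batch(name, endpoint),
    )

# Example (placeholder zone ID and endpoint):
# create_record('Z1EXAMPLE', 'blog.fxaugury.com',
#               'blog.fxaugury.com.s3-website-us-west-2.amazonaws.com')
```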

4. Setup AWS CodeCommit

To set up CodeCommit, there are two steps:

  1. Set up the local git environment with Hugo and the SSH parameters.
  2. Set up and connect to the AWS CodeCommit git repo.

First order of business: our local Git repository and Hugo.

  • Install Hugo and create a new site: hugo new site <site name>.
  • In your Hugo site directory, initialize git if you haven’t already: git init.

Second, set up the SSH parameters. These instructions for the ssh-keygen terminal application are for Linux-based machines:

  • Open up a terminal, create your SSH key via ssh-keygen, and follow the on-screen instructions.
  • To use your SSH Key, go to AWS Identity and Access Management (IAM) > Users > ‘User name’ > tab “Security credentials” > “SSH keys for AWS CodeDeploy”:
    • Copy-paste the public key here.
    • Create a config file at ~/.ssh/config
    • Paste the following, where User is the SSH key ID you just created in IAM:
Host git-codecommit.*.amazonaws.com
User APKAEIBAERJR2EXAMPLE
IdentityFile ~/.ssh/codecommit_rsa
  • Change the permissions of the config file you just created:
chmod 600 ~/.ssh/config

OK, the local setup is done for now. Head on over to AWS CodeCommit.

  • In the AWS console, go to CodeCommit > Create repository.
  • Once the repository is created, we need to test the SSH connection to it:
ssh git-codecommit.us-west-2.amazonaws.com

Add this server to the known hosts list.

The CodeCommit git URL depends on the region your CodeCommit repository was created in; in my case it’s Oregon (region: us-west-2).

We can now successfully connect via SSH to AWS CodeCommit. Next, let’s add our local repository data to AWS CodeCommit.

  • Go to “Code”, then “Connect”, and copy the URL where it says “clone”. Instead of cloning, we’ll push our local git to this remote:
  • In our local git repo directory, enter:
git remote add <repo name> ssh://git-codecommit.us-west-2.amazonaws.com/v1/repos/<repo name>
  • Now we’ve added the remote repository; the next thing is to push to it.
  • Before you push, make sure you have removed any git repo traces from the theme in your Hugo themes folder. Each theme ships with its own .git directory, so git would otherwise treat it as a submodule.
  • Remove the .git directory, clear the cache and add the theme directory to our repo:
rm -r <theme directory path>/.git
git rm -r --cached <theme directory path>
git add <theme directory path>
git commit -am 'theme added'

We can now push our local git changes to CodeCommit.

git push --set-upstream <repo name> master

We should now see all the code from our local repository in the AWS CodeCommit repository. For subsequent changes we can simply git push to the CodeCommit master branch.

5. AWS Lambda: Static site Generator

So far we have set up an S3 bucket for the website, (optionally) a domain name, and local git and CodeCommit repos.

Next we’ll review the static site generation program code briefly and upload the code as a zip file to a new S3 bucket.

To review the commented code, see: https://github.com/rpstreef/static-site-generator/blob/master/generate_static_site.py

The CodeCommit source code pulled into our CodePipeline S3 bucket is encrypted. To read it in our Lambda function, we need to use Signature Version 4 with the S3 API, which can handle KMS-encrypted files.

When you view these objects in your AWS CodePipeline S3 bucket, you’ll see “Encryption: AWS-KMS” in their properties, and you won’t be able to view their contents.

The following snippet, placed at the top of the program, allows the application to read the contents of these files:

import boto3
from botocore.client import Config

S3 = boto3.client('s3', config=Config(signature_version='s3v4'))

Let’s deploy the code and the hugo binary to a new S3 bucket:

  • Create a new bucket, any name will suffice, preferably in the same region as your CodePipeline.
  • Upload a zip file in this new bucket containing two files:
    1. The hugo executable binary: hugo
    2. Generate static site program: generate_static_site.py

We’ll need to enter these details when we execute the CloudFormation template in the next section of this guide.
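For orientation, here is a simplified sketch of what a generator function like generate_static_site.py does. Everything here is illustrative (the paths, the artifact layout, the error handling is omitted); see the linked repository for the real, commented implementation:

```python
import gzip
import mimetypes
import os
import subprocess
import zipfile


def content_type_for(path):
    """Best-effort Content-Type header for a generated site file."""
    ctype, _ = mimetypes.guess_type(path)
    return ctype or 'binary/octet-stream'


def handler(event, context):
    import boto3
    from botocore.client import Config

    # Signature Version 4, as discussed above, so the KMS-encrypted
    # CodePipeline artifact can be read.
    s3 = boto3.client('s3', config=Config(signature_version='s3v4'))

    job = event['CodePipeline.job']
    src = job['data']['inputArtifacts'][0]['location']['s3Location']

    # 1. Fetch and unpack the source artifact from the pipeline bucket.
    s3.download_file(src['bucketName'], src['objectKey'], '/tmp/site.zip')
    with zipfile.ZipFile('/tmp/site.zip') as z:
        z.extractall('/tmp/site')

    # 2. Run the hugo binary that was bundled in the Lambda deployment zip.
    subprocess.check_call(['./hugo', '-s', '/tmp/site', '-d', '/tmp/public'])

    # 3. Gzip every generated file and upload it to the website bucket.
    site_bucket = os.environ['SiteBucket']
    for root, _, files in os.walk('/tmp/public'):
        for name in files:
            path = os.path.join(root, name)
            key = os.path.relpath(path, '/tmp/public')
            with open(path, 'rb') as f:
                body = gzip.compress(f.read())
            s3.put_object(Bucket=site_bucket, Key=key, Body=body,
                          ContentType=content_type_for(path),
                          ContentEncoding='gzip', ACL='public-read')

    # 4. Report success so the pipeline does not hang waiting for the job.
    boto3.client('codepipeline').put_job_success_result(jobId=job['id'])
```

Setting Content-Encoding: gzip on each object is what lets browsers transparently decompress the pre-compressed files served straight from S3.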

6. Custom AWS CodePipeline setup with AWS CloudFormation

Almost there! Now we need to set up a custom CodePipeline using a CloudFormation template. Once the CloudFormation template has executed, there is just one more step to set this automation in motion.

First, we need:

  • an S3 bucket to transfer source code from the master (trunk) git branch.
    • This will be created for us by this template if we don’t have one yet.
  • a Lambda function to generate a static site from that source code.
    • We just uploaded the zip file containing the Lambda function program and the hugo static site generator binary.
  • a custom CodePipeline that outlines these steps.
    • This is defined in the CloudFormation template we’re going to discuss in this chapter.

Please review the full CloudFormation template here: https://github.com/rpstreef/static-site-generator/blob/master/cf_stack.yml

The custom CodePipeline will look as follows in our CloudFormation template:

# Fetch code from Master branch, execute Lambda function to generate, compress and deploy static site.
  CodePipeline:
    Type: "AWS::CodePipeline::Pipeline"
    Properties:
      Name: !Sub "${DomainName}-codepipeline"
      ArtifactStore:
        Type: S3
        Location: !Ref ExistingCodePipelineBucket
      RestartExecutionOnUpdate: false
      RoleArn: !Sub "arn:aws:iam::${AWS::AccountId}:role/${CodePipelineRole}"
      Stages:
        - Name: Source
          Actions:
            - Name: SourceAction
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeCommit
                Version: 1
              Configuration:
                RepositoryName: !Ref ExistingGitRepository
                BranchName: master
              OutputArtifacts:
                - Name: SiteSource
              RunOrder: 1
        - Name: InvokeGenerator
          Actions:
            - Name: InvokeAction
              InputArtifacts:
                - Name: SiteSource
              ActionTypeId:
                Category: Invoke
                Owner: AWS
                Provider: Lambda
                Version: 1
              Configuration:
                FunctionName: !Ref GeneratorLambdaFunction
              OutputArtifacts:
                - Name: SiteContent
              RunOrder: 1

The pipeline consists of just two stages:

  1. “Source”: the CodeCommit repository named by !Ref ExistingGitRepository supplies source code from the master branch. The !Ref part means it references what was entered for that field during CloudFormation stack creation.

  2. “InvokeGenerator” invokes the Lambda function named by !Ref GeneratorLambdaFunction. This function retrieves the code stored in our CodePipeline bucket (made available in step 1), transforms it into a static site, GZIPs it, and copies the result to our website S3 bucket.
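One detail worth calling out about the Invoke stage: a Lambda function invoked from CodePipeline must report back with put_job_success_result or put_job_failure_result, otherwise the action keeps waiting until it times out. A minimal sketch (the helper names here are my own, not from the repository):

```python
def failure_details(exc):
    """FailureDetails payload for put_job_failure_result."""
    # 'JobFailed' is the failure type CodePipeline expects for job failures.
    return {'type': 'JobFailed', 'message': str(exc)}


def report(job_id, error=None):
    """Tell CodePipeline whether the invoked Lambda job succeeded."""
    import boto3  # requires AWS credentials when actually called
    cp = boto3.client('codepipeline')
    if error is None:
        cp.put_job_success_result(jobId=job_id)
    else:
        cp.put_job_failure_result(jobId=job_id,
                                  failureDetails=failure_details(error))
```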

For this CodePipeline to execute, we need to authorize it. There are two roles to define: the first for the CodePipeline itself, the second for the Lambda function. The CodePipeline role must be assumable by both the AWS Lambda and CodePipeline services.

# Allow all actions on Lambda and CodePipeline
  CodePipelineRole:
    Type: "AWS::IAM::Role"
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - "lambda.amazonaws.com"
                - "codepipeline.amazonaws.com"
            Action:
              - "sts:AssumeRole"
      Path: "/"
      Policies:
        - PolicyName: "codepipeline-service"
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: "Allow"
                Action: "*"
                Resource: "*"

The role the Lambda function assumes is outlined below. Its “${DomainName}-execution-policy” defines the execution policies for the AWS Lambda function, which does all the heavy lifting in this pipeline. It is allowed the listed actions on the CloudWatch Logs, CodePipeline, and S3 services:

#Allow Lambda access to; List, upload, get, delete, put object or acl on S3 and Set job results on CodePipeline
  LambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action:
              - sts:AssumeRole
      Path: "/"
      Policies:
        - PolicyName: !Sub "${DomainName}-execution-policy"
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action: "logs:*"
                Resource: "arn:aws:logs:*:*:*"
              - Effect: Allow
                Action:
                  - codepipeline:PutJobSuccessResult
                  - codepipeline:PutJobFailureResult
                Resource: "*"
              - Effect: Allow
                Action:
                  - s3:GetBucketLocation
                  - s3:ListBucket
                  - s3:ListBucketMultipartUploads
                  - s3:AbortMultipartUpload
                  - s3:DeleteObject
                  - s3:GetObject
                  - s3:GetObjectAcl
                  - s3:ListMultipartUploadParts
                  - s3:PutObject
                  - s3:PutObjectAcl
                Resource:
                  - !Join ["", ["arn:aws:s3:::", !Ref ExistingSiteBucket, "/*"]]
                  - !Join ["", ["arn:aws:s3:::", !Ref ExistingCodePipelineBucket, "/*"]]

The last bit of code in our CloudFormation template defines the Lambda resource to be created. It provides an environment variable that points to our website bucket. The sizing, 256 MB of memory, should be sufficient to run the conversion and copy commands in 5-6 seconds per run.

  GeneratorLambdaFunction:
    Type: "AWS::Lambda::Function"
    Properties:
      Description: !Sub "Static site generator for ${DomainName}"
      Role: !GetAtt LambdaExecutionRole.Arn
      MemorySize: 256
      Timeout: 300
      Runtime: !Ref GeneratorLambdaFunctionRuntime
      Handler: !Ref GeneratorLambdaFunctionHandler
      Code:
        S3Bucket: !Ref GeneratorLambdaFunctionS3Bucket
        S3Key: !Ref GeneratorLambdaFunctionS3Key
      Environment:
        Variables:
          SiteBucket: !Ref ExistingSiteBucket

To execute this template:

  • In your AWS Console, go to CloudFormation > Create Stack > Choose a template > Upload to S3 > choose cf_stack.yml, then click Next.
  • Set all the other properties to the values we have covered in the last chapters. Click next.
  • Optionally you can set tags for this stack to your liking. Click next.
  • Place a check mark at, “I acknowledge that AWS CloudFormation might create IAM resources.” then press Create.

Now it will do its work, and in a minute or two it should report that the stack was created successfully.
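If you prefer scripting over the console, the same stack can be created with boto3. This is a sketch under assumptions: the stack name is arbitrary, and the parameter keys in the commented example mirror the template parameters referenced above (ExistingSiteBucket, GeneratorLambdaFunctionS3Bucket, and so on), with placeholder values:

```python
def cf_parameters(values):
    """Convert a plain dict into CloudFormation's parameter list format."""
    return [{'ParameterKey': k, 'ParameterValue': v}
            for k, v in sorted(values.items())]


def create_site_stack(template_body, values):
    import boto3  # requires AWS credentials when actually called
    cf = boto3.client('cloudformation')
    cf.create_stack(
        StackName='static-site-pipeline',  # example stack name
        TemplateBody=template_body,
        Parameters=cf_parameters(values),
        # Same acknowledgement as the console check mark for IAM resources:
        Capabilities=['CAPABILITY_IAM'],
    )

# Example values (placeholders), keyed by the template's parameter names:
# create_site_stack(open('cf_stack.yml').read(), {
#     'DomainName': 'blog.fxaugury.com',
#     'ExistingSiteBucket': 'blog.fxaugury.com',
#     'ExistingGitRepository': '<repo name>',
#     'ExistingCodePipelineBucket': '<pipeline bucket>',
#     'GeneratorLambdaFunctionS3Bucket': '<lambda zip bucket>',
#     'GeneratorLambdaFunctionS3Key': '<lambda zip key>',
# })
```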

To activate this automation, we only need to push our local git changes to the CodeCommit master branch. That triggers the CodePipeline and, in turn, our Lambda function, which deploys our static site with the latest changes!

Now execute a push to your AWS CodeCommit repo and watch the CodePipeline execution do its thing, automatically generating the static site and deploying it to your website S3 bucket.

7. What does it cost?

Now that we’ve finally got the whole thing running :) what does it actually cost on a monthly basis?

| Service | Item | Cost | Subtotal |
| --- | --- | --- | --- |
| S3 | 5 GB storage | $0.12 | $0.12 |
| S3 | 5,000 put/list requests | $0.03 | $0.15 |
| S3 | 100,000 get and other requests | $0.04 | $0.19 |
| S3 | Inter-region bucket transfers of 1 GB/month (depends on CodePipeline/CodeCommit region availability) | $0.02 | $0.21 |
| Route53 | Hosted zone | $0.50 | $0.71 |
| Route53 | Standard queries: 1 million/month | $0.40 | $1.11 |
| CodePipeline | 1 free pipeline per month | $0.00 | $1.11 |
| CodeCommit | First 5 users free, with 50 GB storage and 10,000 git requests/month | $0.00 | $1.11 |
| Lambda | At 256 MB memory, 1,600,000 seconds of compute time free | $0.00 | $1.11 |
| Free tier | Discount | -$0.14 | $0.97 |
| Total | | | $0.97 |
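The subtotal column is a running sum; a quick Python check of the arithmetic:

```python
# Line items from the cost table above (USD per month).
items = {
    'S3 storage (5 GB)': 0.12,
    'S3 put/list requests (5,000)': 0.03,
    'S3 get and other requests (100,000)': 0.04,
    'S3 inter-region transfer (1 GB)': 0.02,
    'Route53 hosted zone': 0.50,
    'Route53 queries (1 million)': 0.40,
    'CodePipeline (free tier)': 0.00,
    'CodeCommit (free tier)': 0.00,
    'Lambda (free tier)': 0.00,
}
free_tier_discount = 0.14

gross = round(sum(items.values()), 2)
net = round(gross - free_tier_discount, 2)
print(gross, net)  # 1.11 0.97
```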

Compare this to running your blog on Wordpress (custom domain costs not included):

| Service | Monetize | Cost |
| --- | --- | --- |
| Static site on AWS | Yes | $1.11 |
| Wordpress Free (no custom domain) | No | $0.00 |
| Wordpress Personal | No | $2.99 |
| Wordpress Premium | Yes | $8.25 |

If you need blog monetization, analytics, storage flexibility, and your own branding, running a static site on AWS offers much better value, at less than 15% of the cost of the Wordpress Premium option.

8. Comments

I hope this guide has been useful. Please leave a comment below to let me know what you liked, what you didn’t, suggestions, and so on.

Next week I’ll continue my OpenAPI series (not enough time this week) and I’ll cover what I promised, CI/CD with CodePipeline and Terraform.

Thanks for reading!