Auto-deploy SPA with AWS S3 and CloudFront using GitLab CI/CD

AWS Jun 26, 2018

In this blog, we will see how to host a single page application by storing its files on AWS S3 and serving them through CloudFront, with deployments automated by GitLab CI/CD.

Single Page Application (SPA)

A Single Page Application is a web application or website that interacts with the user without requesting new HTML pages from the server. There are many ways to build an SPA in many languages; for this demo we will use the Node.js Docker image.
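
Any build setup works for this walkthrough as long as running npm run build writes the final static files into a dist/ folder. A minimal local check of that assumption, using the same Node.js Docker image the pipeline will use, might look like this:

# run the project's build inside the node:8.11.3 image (the same image the CI job uses)
docker run --rm -v "$PWD":/app -w /app node:8.11.3 sh -c "npm install && npm run build"
ls dist/   # should now contain index.html plus the bundled js/css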

Setup S3 Bucket

Amazon S3 has a simple web services interface that we can use to store and retrieve any amount of data, at any time, from anywhere on the web.

  • Create an S3 bucket named exactly after the domain name, for example website.com, then go into the bucket.
  • Select the Properties tab and enable Static website hosting.
  • Then open the Permissions tab and go to Bucket Policy. Copy the policy below and paste it into the editor:
{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "AllowPublicRead",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::BUCKET_NAME/*"
        }
    ]
}

Be sure to replace BUCKET_NAME with your own bucket name.

  • Click Save; it will ask you to confirm that this bucket will have public access. Click Save again.
  • Now upload index.html into the bucket and you will be able to see it at the bucket's website endpoint URL.
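
If you prefer the command line, the same bucket setup can be sketched with the AWS CLI (assuming the bucket is named website.com and the policy above is saved locally as bucket-policy.json):

# create the bucket and enable static website hosting
aws s3 mb s3://website.com
aws s3 website s3://website.com --index-document index.html --error-document index.html
# attach the public-read policy shown above
aws s3api put-bucket-policy --bucket website.com --policy file://bucket-policy.json
# upload a test page
aws s3 cp index.html s3://website.com/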

Setup CloudFront

Amazon CloudFront is a web service that speeds up distribution of our static and dynamic web content, such as .html, .css, .js, and image files, to our users.

  • Go to the CloudFront from Services, click on Create Distribution and select Get Started for Web.
  • In Origin Domain Name, paste the S3 website endpoint from the bucket created earlier (without http://).
  • Select yes for Compress Objects Automatically.
  • In Alternate Domain Names (CNAMEs), enter the domain name.
  • In Default Root Object, enter index.html.
  • Leave everything else as it is.
  • Click on Create Distribution.
  • We can see our distribution status is "In Progress"; it will take a few minutes to become "Deployed".
  • The distribution will have a domain name like d15mue0rz621ef.cloudfront.net.
  • Point your domain at the distribution domain name above (a CNAME record, or an alias A record if you use Route 53).
  • Now open the distribution and select the Error Pages tab.
  • We need to make sure all requests return something even when no matching file exists in S3, so that client-side routes still load the app. Click on Create Custom Error Response and enter:
    • HTTP Error Code: 404 Not Found
    • TTL: 0
    • Custom Error Response: Yes
    • Response Page Path: /index.html
    • HTTP Response Code: 200 (or 404 if your frontend handles not-found routes itself)
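
If you would rather watch the distribution from the command line than refresh the console, a small AWS CLI sketch (where DISTRIBUTION_ID is a placeholder for the ID shown for your distribution) looks like this:

# prints "InProgress" until the distribution has propagated, then "Deployed"
aws cloudfront get-distribution --id DISTRIBUTION_ID --query 'Distribution.Status'
# or block until it is fully deployed
aws cloudfront wait distribution-deployed --id DISTRIBUTION_ID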

Setup GitLab CI/CD

GitLab offers a continuous integration service. If you add a .gitlab-ci.yml file to the root directory of your repository, then each commit or push triggers your CI pipeline. The pipeline will build our website files and then use the AWS CLI to publish them, which is the CD (continuous deployment) part. GitLab provides its own CI/CD runners, so we will use them to automate this work.

  • Open your GitLab repository, go to Settings -> CI/CD from the left side menu.

  • Here we will enter our AWS credentials so that our automation script can use them.

  • Expand the Variables section and enter the following variables and their values.

    • AWS_ACCESS_KEY_ID: YOUR-AWS-ACCESS-KEY-ID
    • AWS_SECRET_ACCESS_KEY: YOUR-AWS-SECRET-ACCESS-KEY
    • S3_BUCKET_NAME: YOUR-S3-BUCKET-NAME
    • DISTRIBUTION_ID: CLOUDFRONT-DISTRIBUTION-ID
  • Click on Save variables.

  • Create a new file named .gitlab-ci.yml and copy the script below into it.

image: docker:latest

stages:
  - build
  - deploy

build:
  stage: build
  image: node:8.11.3
  script:
      - export API_URL="https://api.logicwind.com/"      # any environment variables your code requires
      - npm install
      - npm run build
      - echo "BUILD SUCCESSFULLY"
  artifacts:
    paths:
      - dist/
    expire_in: 20 mins
  environment:
    name: production
  only:
    - master

deploy:
  stage: deploy
  image: python:3.5
  dependencies:
    - build
  script:
    - export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID       # same variables we declared in GitLab CI/CD settings
    - export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
    - export S3_BUCKET_NAME=$S3_BUCKET_NAME
    - export DISTRIBUTION_ID=$DISTRIBUTION_ID
    - pip install awscli --upgrade --user
    - export PATH=~/.local/bin:$PATH
    - aws s3 sync --acl public-read --delete dist/ s3://$S3_BUCKET_NAME
    - aws cloudfront create-invalidation --distribution-id $DISTRIBUTION_ID --paths '/*'
    - echo "DEPLOYED SUCCESSFULLY"
  environment:
    name: production
  only:
    - master

Build stage

The build job runs the build command and generates the single page application output (HTML, CSS, and JS files) under the dist folder. The most important part is the artifacts section: we are telling the CI runner to keep the dist/ folder for the next 20 minutes and pass it to the deploy stage. Because the build output is kept as an artifact, we can also download it to test and verify that it is exactly what we wanted, as sketched below.
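
You can download the artifact from the job page in the GitLab UI, or via the GitLab API. A sketch, where <project_id> and <your_access_token> are placeholders for your own values:

# download the dist/ artifact produced by the latest successful "build" job on master
curl --header "PRIVATE-TOKEN: <your_access_token>" \
  --output artifacts.zip \
  "https://gitlab.com/api/v4/projects/<project_id>/jobs/artifacts/master/download?job=build"
unzip artifacts.zip -d artifact-check && ls artifact-check/dist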

Deploy stage

The deploy job uses a Python image and exports the AWS credentials for authentication so that we can push our code to the S3 bucket. The script installs awscli with pip and configures it with the credentials we provided.

  • Next it syncs the dist folder with the given bucket, uploading new or changed files and deleting removed ones, all with public read access.
  • Then the script runs an invalidation on CloudFront for the given path, /* in this case, so the edge caches drop all existing files and fetch the new build from S3.
  • Once we see DEPLOYED SUCCESSFULLY and Job succeeded in the console, our SPA is deployed to S3 and served through AWS CloudFront.
  • From now on we just push our changes to GitLab; all of this happens automatically in the background and the changes are live within a few minutes.
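
To confirm from the command line that CloudFront is actually serving the fresh build (assuming website.com is your domain and DISTRIBUTION_ID is the same ID used in the pipeline), a quick check might be:

# list recent invalidations and their status (InProgress / Completed)
aws cloudfront list-invalidations --distribution-id DISTRIBUTION_ID
# inspect response headers: "X-Cache: Miss from cloudfront" right after an invalidation,
# then "Hit from cloudfront" on repeat requests, means the edge cache has the new files
curl -I https://website.com/index.html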
