GatsbyJS with CI/CD Pipeline via Codebuild

The AWS Free Tier includes one active AWS CodePipeline per month and 100 AWS CodeBuild minutes per month.

So you can set up a continuous integration and continuous delivery (CI/CD) pipeline for free, or for relatively little money if you need more than 100 build minutes a month. The pipeline triggers a build on every push to a GitHub repository, deploys the result to S3 and optionally invalidates the CloudFront cache.

CodeBuild

First of all you need a new build project in CodeBuild. In the project configuration you can assign a name for it and select GitHub as the source provider under Source.

Depending on whether the repository is public, you either select "Public repository" and enter the repository URL, or you connect your GitHub account and grant CodeBuild the necessary rights to access the repository.

An environment image must now be selected under Environment. For GatsbyJS this is a "Managed image" with the operating system "Amazon Linux 2", "Standard" as the runtime(s) and "aws/codebuild/amazonlinux2-x86_64-standard:2.0" as the image.

Now a new service role can be created automatically (which is required) so that CodeBuild has the necessary rights for the AWS account.

This service role can also be assigned rights for CodePipeline, so that the same role can be used for both CodeBuild and CodePipeline. If environment variables are used, these can be specified under "Additional configuration" in the Environment section. You should also make sure that "3 GB memory, 2 vCPUs" is selected, since only this option is included in the Free Tier.
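
If the build itself later deploys to S3 (for example with gatsby-plugin-s3, see below) and invalidates the CloudFront cache, the service role also needs the corresponding permissions. A minimal sketch of such a policy, using the bucket name from the example further down as a placeholder; the exact set of actions depends on your setup:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:GetObject",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::my-website-bucket",
                "arn:aws:s3:::my-website-bucket/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": "cloudfront:CreateInvalidation",
            "Resource": "*"
        }
    ]
}
```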

Under Buildspec, a buildspec file in YAML format is used. For a Gatsby site it could look like the following:

```yaml
version: 0.2
phases:
    install:
        runtime-versions:
            nodejs: 12
        commands:
            - 'touch .npmignore'
            - 'npm install -g gatsby'
    pre_build:
        commands:
            - 'npm install'
    build:
        commands:
            - 'npm run build'
    post_build:
        commands:
            - 'find public -type f -regex ".*\.\(htm\|html\|txt\|text\|js\|css\|json\)$" -exec gzip -f -k {} \;' # only needed if CloudFront does not compress the files automatically
artifacts:
    base-directory: public
    files:
        - '**/*'
    discard-paths: no
cache:
    paths:
        - '.cache/*'
        - 'public/*'
```

The buildspec.yml file only needs to be placed in the root directory so that CodeBuild can find it. In addition, the build script must of course still be available in "package.json".

```json
"build": "gatsby build",
```

The default settings can be retained under Artifacts. Under Logs, the CodeBuild logs can be sent to CloudWatch and optionally also stored in an S3 bucket.

There may be additional costs!

If all settings have been entered correctly, the build project can be created. The only thing missing now is the pipeline that triggers a build and deploys the result to the S3 bucket.

CodePipeline

To do this, switch to CodePipeline and create a new pipeline, for which you first choose a name and a service role.

Under Source you can now sign in with a GitHub account and link the respective repository and branch. There are two options for detecting changes and triggering a build:

  • GitHub webhooks (recommended) and
  • AWS CodePipeline, which periodically checks the repository for changes

Then you choose "AWS CodeBuild" as the build provider and select the previously created project, or create a new one if you have not done so already.

After a build, the public/ folder can then be deployed automatically to an S3 bucket in a deploy stage. Alternatively, you can skip this step and use gatsby-plugin-s3 instead, which also optimizes caching.

```sh
npm i gatsby-plugin-s3
```

or

```sh
yarn add gatsby-plugin-s3
```

What is still missing is the configuration in gatsby-config.js and the deployment script:

```js
plugins: [
  {
      resolve: `gatsby-plugin-s3`,
      options: {
          bucketName: 'my-website-bucket'
      },
  },
]
```

```json
"scripts": {
    ...
    "deploy": "gatsby-plugin-s3 deploy --yes"
}
```
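
If the bucket is not in your default region or the site is served via CloudFront under its own domain, gatsby-plugin-s3 accepts further options. A hedged sketch with placeholder values:

```js
// gatsby-config.js: optional additional options for gatsby-plugin-s3
// (region and hostname are placeholder values, adjust them to your setup)
{
    resolve: `gatsby-plugin-s3`,
    options: {
        bucketName: 'my-website-bucket',
        region: 'eu-central-1',
        protocol: 'https',
        hostname: 'www.example.com',
    },
},
```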

The deployment script "npm run deploy" must then of course be added to the buildspec file under the post_build commands, as in the sketch below. With that, the pipeline is complete.
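
A minimal sketch of the adjusted post_build section (the gzip step from above remains optional):

```yaml
    post_build:
        commands:
            # optional: pre-compress text assets if CloudFront does not do it
            - 'find public -type f -regex ".*\.\(htm\|html\|txt\|text\|js\|css\|json\)$" -exec gzip -f -k {} \;'
            # deploy the public/ folder to S3 via gatsby-plugin-s3
            - 'npm run deploy'
```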

Experience shows that a build of around 100 pages with a few images per page takes roughly 10 minutes.

Every time CodePipeline detects a push to the GitHub repository, a build is triggered automatically and the result is deployed to the S3 bucket.

With `aws cloudfront create-invalidation --distribution-id DISTRIBUTION_ID --paths "/*"` the CloudFront cache can also be invalidated.
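
If you want the invalidation to run automatically after each deployment, the command can also be appended to the post_build commands. A sketch, assuming the distribution ID is passed in via an environment variable named CLOUDFRONT_DISTRIBUTION_ID (an assumed name, set under "Additional configuration"; the service role needs the cloudfront:CreateInvalidation permission):

```yaml
    post_build:
        commands:
            - 'npm run deploy'
            # CLOUDFRONT_DISTRIBUTION_ID is an assumed variable name, set it in the CodeBuild environment
            - 'aws cloudfront create-invalidation --distribution-id "$CLOUDFRONT_DISTRIBUTION_ID" --paths "/*"'
```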

First published August 31, 2020
