
I haven't exactly made it a secret that I use Pelican to write this blog. Sometimes I'm active, and sometimes I'm not. Since Pelican is a static site generator based on templates and text files, I have been doing the build and deployment by hand. When I have been away for a while, I sometimes find that I have changed machines and have to spend time setting up my local environment just to release an article. This doesn't take too long, but sometimes I just want to pound out an article and don't have the machine set up.

In the past, I had thought about setting up some sort of AWS Lambda function, Docker container, or even a dedicated server for auto-deployment... I just didn't want the hassle of operating yet another machine. Then I took a few minutes to investigate Bitbucket Pipelines, and I am impressed! I have been using Bitbucket to manage the files for this blog for years already, but now I can simply commit to master to build and deploy the site to the production AWS bucket, or commit to any other branch to get a preview. Too easy! Why didn't I do this before!?

The Steps

Step 1: Use Bitbucket

No need to go outside of your normal repository control... no external service required... and it comes with 50 minutes of build time each month. That should be more than enough to get your blog fix.

Step 1.1 (optional): Create Build Environment on your Local Machine

This step isn't strictly necessary, but it will ensure that your Pelican build environment is functional before you commit to Bitbucket and wait for it to build.

$ > virtualenv -p python3 venv
$ > venv/bin/pip install pelican markdown awscli
$ > venv/bin/pip freeze > requirements.txt

There is apparently a bit of a glitch with pkg-resources in Debian-based environments: pip freeze emits a bogus pkg-resources==0.0.0 entry. It is safe to delete the line in requirements.txt that specifies pkg-resources.
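If you hit that, a one-liner cleans it up before you commit. This is a minimal sketch assuming GNU sed; on macOS/BSD sed you would use -i '' instead:

$ > sed -i '/^pkg-resources==/d' requirements.txt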

Add any plugins to your repository and configuration files.

This isn't intended to show you how to use Pelican, but I generally execute venv/bin/pelican -s pelicanconf.py in order to build my site. My site is contained in ./output when it is complete. We will see references to this path in the later steps.
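For a quick sanity check before committing anything, you can also serve the freshly built ./output directory with Python's built-in web server. This is just a throwaway local preview (port 8000 is arbitrary), not part of the deployment:

$ > venv/bin/pelican -s pelicanconf.py
$ > (cd output && python3 -m http.server 8000)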

Now that I can build locally with relative paths in my configuration file, I can proceed...

Step 2: Configure AWS S3

You should have an S3 bucket on AWS; the bucket that hosts this site is called forembed.com. The bucket should be configured to host a static site. There are numerous guides to this, but basically you have to do the following (a CLI sketch follows the list):

  • make the bucket publicly readable
  • configure it to host a static site
  • set each file to be individually publicly readable
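The console works fine for all of this, but here is a rough sketch of the same setup using the AWS CLI. The bucket name and region are just this site's values, so substitute your own; the per-file readability in the last bullet gets handled later by the --acl public-read flag on the sync:

$ > aws s3 mb s3://forembed.com --region us-east-1
$ > aws s3 website s3://forembed.com --index-document index.html --error-document error.html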

Step 3: Create AWS User Credentials

Here, you will create a 'user' that Bitbucket will utilize to interact with your AWS S3 bucket; if you'd rather script it, see the CLI sketch after the list.

  • Create an IAM group such as deployment-s3. Be sure to attach the AmazonS3FullAccess policy to the group.
  • Create a user such as bitbucket and add the user to the deployment-s3 group (unless you named it something else). Be sure to tick the "Programmatic access" box when prompted.
  • Under the user, you will need to create credentials, and you will need to copy the Access Key ID and Secret Access Key into Bitbucket.
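If you prefer the CLI to clicking through the console, the equivalent is roughly the following. The user and group names are just the examples above, and the policy ARN is the stock AmazonS3FullAccess managed policy; the last command prints the Access Key ID and Secret Access Key you will paste into Bitbucket:

$ > aws iam create-group --group-name deployment-s3
$ > aws iam attach-group-policy --group-name deployment-s3 --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
$ > aws iam create-user --user-name bitbucket
$ > aws iam add-user-to-group --group-name deployment-s3 --user-name bitbucket
$ > aws iam create-access-key --user-name bitbucket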

There is a bit of a chicken-and-egg dilemma at this point: you have to enable Bitbucket Pipelines before you can set environment variables for pipelines, but your deployment won't work without those environment variables. Just enable Pipelines so that you can set your environment variables here:

  • Settings -> Pipelines -> Environment variables
  • AWS_ACCESS_KEY_ID = {your access key id}
  • AWS_SECRET_ACCESS_KEY = {your secret key}

Step 4: Add bitbucket-pipelines.yml

First, let's have a look at the bitbucket-pipelines.yml file:

image: python:3.5.1

pipelines:
  # create a full build that may be accessed at 
  # http://test-site-deployment.s3-website-us-east-1.amazonaws.com
  default:
    - step:
        name: Build and deploy to test server
        deployment: test
        script: # Modify the commands below to build your repository.
          - pip3 install -r requirements.txt
          - pelican -s pelicanconf.py
          - aws s3 sync --delete ./output s3://test-site-deployment/ --acl public-read

  branches:
    # commits to the master branch will deploy a new site at http://forembed.com
    master:
      - step:
          name: Build and deploy to production server
          deployment: production
          script:
            - pip3 install -r requirements.txt
            - pelican -s pelicanconf.py
            - aws s3 sync --delete ./output s3://forembed.com --acl public-read

The file is fairly readable if you know the tools involved, but note that there are two nearly identical pipelines shown here: one for all branches not otherwise named (default), and one for master. Each has three primary stages:

  • environment setup (installing Python packages)
  • build of the static site, pelican -s pelicanconf.py
  • AWS sync (last line)

Environment Notes

You will need to specify pip3 to use Python 3. If you have a script that you want to execute using the Python 3 environment, be sure to preface it with python3. I don't know why Python 2 ended up as the default.
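If you want to confirm exactly which interpreters a given image provides, a throwaway diagnostic line at the top of either script: section will tell you (purely optional; remove it once you're satisfied):

        script:
          - python --version; python3 --version; pip3 --version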

AWS Notes

  • --delete removes files from the bucket that no longer exist in ./output, keeping the bucket an exact mirror of the build (see the dry run below)
  • ./output specifies that only the output path should be uploaded, not the entire git repository
  • --acl public-read sets the permissions on each uploaded file so that the site is generally accessible
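Because --delete is destructive, it's worth rehearsing the sync locally with the --dryrun flag of aws s3 sync before trusting the pipeline with it; nothing is uploaded or removed, the CLI just reports what it would do:

$ > venv/bin/aws s3 sync --delete --dryrun ./output s3://test-site-deployment/ --acl public-read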

Coffee Break

'Nuff said.

I will probably build some version of this into my other projects as well. Go Atlassian!

Sources

The AWS and Bitbucket documentation are pretty good, but I found a blog post particularly helpful when it came to deployment to S3.



© by Jason R. Jones 2016
My thanks to the Pelican and Python Communities.