I have a soft spot in my heart for Google App Engine Standard. I started experimenting with the first generation Python runtime nearly 10 years ago in my spare time and have been using GAE in a production capacity, both professionally and for personal projects, ever since. Since App Engine Standard runs in a sandbox, it has the downside of not being compatible with Docker. While the Flexible environment introduced containerization support, I still feel it is inferior to the Standard environment given its slow deploy times, inability to scale to zero, and slowness to start new instances. Google Cloud Run appears to offer the benefits of both, but it is still in Beta. Meanwhile, we can leverage Google Cloud Build to do Docker-esque build/test/deploy workflows. In this post we will explore using GCB to generate a "container" and deploy it in separate steps.
If you already have a bucket you would like to use, jump to step 2. Otherwise, I highly recommend creating a dedicated bucket for your builds.
Go to Google Cloud Storage and Create a Bucket. See the screenshot below for your bucket options. I'd roll with these settings:
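If you prefer the command line to the console, the same bucket can be created with gsutil. A minimal sketch, assuming the gae-builds bucket name used later in this post and a us-central1 region (adjust both to match your project):

```shell
# Create a regional bucket to hold build artifacts.
# The bucket name, region, and project are assumptions; change them to fit your setup.
gsutil mb -l us-central1 -p blaine-garrett gs://gae-builds
```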
Now that we have a place to put our builds, let's put some builds there.
In this step we will build our code, create a tar archive of it, and move that artifact to Google Cloud Storage for later use. I am leveraging the TAG_NAME substitution variable from Cloud Build to name the archive file. We also use the gae-builds Cloud Storage bucket we created in the step above.
Add the following content to ./ci/build.gaestandard.cloudbuild.yaml
################################################################
#        Create a Production Build of Node Application         #
#      and store Tar Archive as artifact in our GCS bucket     #
################################################################
steps:
  - name: 'gcr.io/cloud-builders/gsutil'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        echo "Creating production build for tag $TAG_NAME"
        # List contents for kicks
        ls -l
        # Remove anything overwritten by production build process
        rm -rf build
  # Create Production Optimized Install of node_modules
  - name: 'gcr.io/cloud-builders/npm'
    args: ['ci', '--only=production']
  # Run a Build
  - name: 'gcr.io/cloud-builders/npm'
    args: ['run', 'build']
  # Delete files needed for build/test but not needed for later deploy
  - name: 'gcr.io/cloud-builders/gsutil'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        ls -l
        rm -rf components
        rm -rf pages
        rm -rf node_modules
        rm app.yaml
  # Create the tar archive of the built code
  - name: 'gcr.io/cloud-builders/gsutil'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        ls -l
        tar -czvf build-$TAG_NAME.tar.gz .
        ls -l
artifacts:
  objects:
    location: 'gs://gae-builds'
    paths: ['build-$TAG_NAME.tar.gz']
Then run the build locally by entering:
gcloud builds submit --ignore-file=./ci/build.gcloudignore --config=./ci/build.gaestandard.cloudbuild.yaml --project=blaine-garrett --substitutions=TAG_NAME="v-2-3"
This will take the code in your current directory, run a production npm install on it, remove files needed to build but not to deploy or run, create a tar archive of the code, and push the artifact to your bucket (in my case gs://gae-builds/build-v-2-3.tar.gz).
Finally, go look at the contents of your bucket and ensure you have a file build-v-2-3.tar.gz or similar. If you have a file, celebrate.
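If you'd rather sanity check from the terminal than the console, listing the bucket works too (the bucket name here matches the one assumed throughout this post):

```shell
# List the build artifacts in the bucket, with sizes and timestamps.
gsutil ls -l gs://gae-builds/
```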
We now have a production build of our code that is suitable for deployment. We can deploy it to any number of environments and it will be the exact same code. For us, we're going to deploy to App Engine Standard. Let's use Cloud Build to make this easier. Add the following content to ./ci/deploy.gaestandard.cloudbuild.yaml:
################################################################
#        Deploy a Production Build of Node Application         #
#      from the Tar Archive artifact in our GCS bucket         #
################################################################
steps:
  - name: 'gcr.io/cloud-builders/gsutil'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        echo "Deploying production build for tag $TAG_NAME"
        ls -l
  # Download the build artifact from our GCS bucket
  - name: 'gcr.io/cloud-builders/gsutil'
    args: ['cp', 'gs://gae-builds/build-$TAG_NAME.tar.gz', '.']
  - name: 'gcr.io/cloud-builders/gsutil'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        # Unzip into current directory
        tar -zxvf build-$TAG_NAME.tar.gz
        # List contents for debugging
        ls -l
        # Move the production yaml in place
        mv ./ci/deploy.gaestandard.app.yaml ./app.yaml
        # Remove bits not needed for deploy
        rm build-$TAG_NAME.tar.gz
        rm -rf ci
  # Deploy to App Engine Standard
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      [
        '--verbosity=debug',
        '--project=$PROJECT_ID',
        'app',
        'deploy',
        './app.yaml',
        '--no-promote',
        '--version=$TAG_NAME',
      ]
Actually deploy your code by running:
gcloud builds submit --config=./ci/deploy.gaestandard.cloudbuild.yaml --project=blaine-garrett --substitutions=TAG_NAME="v-2-3" --no-source
This will download the tar file created above, unzip it, and deploy it to Google App Engine, to the service defined in ./ci/deploy.gaestandard.app.yaml, with version name v-2-3.
If you go to your Google App Engine console, you should see your new version in the service. If you migrate traffic, you should see the running code.
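Traffic migration can also be done from the command line. A sketch, assuming the default service and the version and project names used in the earlier commands:

```shell
# Route 100% of traffic for the default service to the new version.
# Service name, version, and project are assumptions; adjust to your deploy.
gcloud app services set-traffic default --splits=v-2-3=1 --project=blaine-garrett
```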
Notes:
I found it helpful to see what was actually being zipped up into the final tar after the build in step 2. You can accomplish this by running the following commands locally:
gsutil cp gs://gae-builds/build-v-2-3.tar.gz .
tar -zxvf build-v-2-3.tar.gz
You can now inspect the code locally to see what was included. You should also be able to run your app with NODE_ENV=production npm start
Even though App Engine Standard doesn't support Docker, we can still leverage Google Cloud Build to emulate a Docker-esque two step build/deploy workflow. This moves us closer to our goal of building a proper CI/CD flow. If we really wanted to, we could create a build setup that builds both the Docker image and the tar file, and deploys either depending on the target environment. This is the next step in our efforts as we work towards slowly switching to Google Cloud Run.
Cheers.
See the full diff for these changes in my node-next-gae-demo repo.