Goodbye Dockerfiles: Build Secure & Optimised Node.js Container Images with Cloud Native Buildpacks
Docker enables developers to easily package, share, and run applications. As a platform, it has shaped the way we build and ship software, and containers have become the de facto standard for running applications. A container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system libraries, and settings. To create an image, you need a Dockerfile.
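For context, here is the kind of hand-written Dockerfile that Buildpacks let you skip. It's an illustrative sketch for a small Node.js app; the base image tag and port are placeholders, not something from this article's project.

```dockerfile
# Illustrative Dockerfile for a small Node.js app (base image and port are examples).
FROM node:18-alpine
WORKDIR /app
# Copy the manifests first so the dependency layer is cached between builds.
COPY package*.json ./
RUN npm ci --omit=dev
# Then copy the rest of the source code.
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
```

Every line here is a decision you have to get right yourself: layer ordering for caching, which base image to use, which dependencies to omit.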
When you tell Docker (or any similar tool) to build an image by executing the docker build command, it reads the instructions in the Dockerfile, executes them, and creates an image as a result. Writing Dockerfiles is one thing; writing Dockerfiles that build fast and produce secure images is another. If you're not careful, you may create images that take a long time to build, and the resulting images may not be secure either.
You can learn how to secure and optimise your container image of course! But, wouldn't you rather invest your time and resources in writing code and delegate the task of creating optimised images to a different tool? That's where Cloud Native Buildpacks can help.
What are Cloud Native Buildpacks?
Cloud Native Buildpacks are pluggable, modular tools that transform application source code into container images. Their job is to collect everything your app needs to build and run. Among other benefits, they replace Dockerfiles in the app development lifecycle, enable swift rebasing of images, and provide modular control over images (through the use of builders).
How do they work?
Buildpacks examine your app to determine the dependencies it needs and how to run it, then package it all as a runnable container image. Typically you run your source code through one or more buildpacks. Each buildpack goes through two phases - the detect phase and the build phase.
The detect phase runs against your source code to determine whether the buildpack is applicable. If detection succeeds, the buildpack proceeds to the build phase; if detection fails, the build phase for that specific buildpack is skipped.
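Conceptually, the detect phase of a Node.js buildpack boils down to "does this app look like a Node.js app?". Here's a simplified sketch of that logic as a shell function; the real mechanism is a bin/detect executable whose exit code signals the result, so treat this as an illustration rather than the actual buildpack API.

```shell
# Sketch of a Node.js buildpack's detect logic (simplified illustration).
# Detection passes only when the app directory contains a package.json.
detect() {
  if [ -f "$1/package.json" ]; then
    echo "pass"; return 0     # buildpack applies; build phase will run
  else
    echo "fail"; return 100   # buildpack doesn't apply; build phase is skipped
  fi
}
```

Each buildpack in a builder runs a check like this, which is how a single builder can handle Node.js, Java, Go, and other app types without you telling it which one you have.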
The build phase runs against your source code to download dependencies, compile your source code (if needed), and set the appropriate entry point and startup scripts. Let's see how to create an image using the pack CLI.
Building your first image
You're going to build your first image using the pack CLI. Go to buildpacks.io/docs/tools/pack and follow the instructions for your OS to install it. You're going to create and deploy a Node.js web app that returns a string.
Run the command below to create the project and install micro (an HTTP library for building microservices):
mkdir micro-app && cd micro-app && npm init -y && npm i micro
Create a file named index.js and paste the function below into it:
module.exports = () => "Hello Buildpacks!";
Update your package.json with the following start script
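Assuming the standard micro CLI (which, by default, serves the function exported from index.js), the scripts section would look like this:

```json
{
  "scripts": {
    "start": "micro"
  }
}
```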
And that is all you need for the service. Run the command below to create an image.
pack build micro --builder paketobuildpacks/builder:base
That command should create an image using the paketobuildpacks/builder:base builder. A builder is an image that contains all the components necessary to execute a build. There are different builders from Paketo, Heroku, and others. You can even create your own or extend an existing one.
If you use Heroku, then your app is making use of Buildpacks behind the scenes. You can choose to build your images using Heroku buildpacks so you can have the same image when you deploy to Heroku or other platforms.
The image is built and you can try it out by running it with Docker (or Podman if that's what you use). Run the docker command below to start the app.
docker run -p 3000:3000 -e PORT=3000 micro
Go to localhost:3000 in your browser. You should get the text Hello Buildpacks! as a response.
Usage in CI/CD
You can build images with Cloud Native Buildpacks and pack CLI in your continuous integration pipeline. With GitHub Actions, there's a Pack Docker Action that you can use. When you combine it with Docker Login Action, you can build and publish to a registry in your workflow. There's a similar process on GitLab if you use GitLab's Auto DevOps. I won't go into details on how to use Buildpacks in different CI systems, but you can check out the links below:
- Auto Build using Cloud Native Buildpacks in GitLab
- Pack Docker GitHub Action. It can be combined with Docker Login.
- Tekton Buildpacks task, available on Tekton Hub. It doesn't require the pack CLI or Docker.
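As a rough illustration, a GitHub Actions workflow combining the Pack Docker Action with the Docker Login Action might look like the following. The action references, versions, secret names, and image name are placeholders; check each action's documentation for the current inputs before using this.

```yaml
# Illustrative workflow: build with Buildpacks and push to a registry.
# Action names/versions and the image name below are examples, not prescriptions.
name: buildpacks-build
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Log in first so the built image can be published to the registry.
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      # Run pack build and publish the result in one step.
      - uses: dfreilich/pack-action@v2
        with:
          args: 'build your-user/micro --builder paketobuildpacks/builder:base --publish'
```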