The Jenkinsfile DSL is a powerful tool for creating dynamic and rich continuous integration/deployment (CI/CD)
pipelines. As more of your projects begin using Jenkins, common patterns inevitably emerge, and the more
projects you have, the more desirable it becomes to reduce the copying and pasting of pipeline stages and
steps. This post covers a tool Jenkins provides to encapsulate common logic/workflows in an external library,
known as a Jenkins Shared Library.
Let's say you are a freelance React engineer who has created and hosted websites for multiple clients.
Over time, you've developed a standard set of build/deployment steps for your sites. Your workflow consists
of developing features on feature branches, which automatically deploy to a test site, and when code is merged
to the main branch, the site is deployed to production. This leads to a Jenkinsfile that looks something like
this (it assumes the infrastructure for the site is already set up and only a bucket sync is needed):
pipeline {
    agent any
    stages {
        stage("Init") {
            steps { sh "yarn" }
        }
        stage("Build") {
            steps { sh "yarn build" }
        }
        stage("Test") {
            steps { sh "yarn test" }
        }
        stage("Deploy Feature") {
            agent { docker { image "amazon/aws-cli" } }
            when { not { branch "main" } }
            environment {
                AWS_ACCESS_KEY_ID = credentials("aws-access-key-id")
                AWS_SECRET_ACCESS_KEY = credentials("aws-secret-access-key")
            }
            steps { sh "aws s3 sync ./build s3://${env.REPO_NAME}-test --delete" }
        }
        stage("Deploy Production") {
            agent { docker { image "amazon/aws-cli" } }
            when { branch "main" }
            environment {
                AWS_ACCESS_KEY_ID = credentials("aws-access-key-id")
                AWS_SECRET_ACCESS_KEY = credentials("aws-secret-access-key")
            }
            steps { sh "aws s3 sync ./build s3://${env.REPO_NAME}-prod --delete" }
        }
    }
}
Sample Jenkinsfile for a React pipeline

This Jenkinsfile will initialize, build, test, and deploy a React application (or any framework, really) to
an S3 bucket. Now, as a freelancer with many clients, you have many repositories all using this same
format for CI/CD. Creating new websites can easily be done from a template of sorts, but what if you
wanted to add a new stage to all of your builds? Maybe you'd like to add an automated Lighthouse
check after deployment to "test" that fails the build if the score falls below a certain threshold. While this
is possible to do across multiple repositories, it would be cumbersome. A better long-term option here is to
create a Jenkins Shared Library.
A shared library is a Groovy repository that follows a specific convention for Jenkins. At a minimum, Jenkins
expects a vars/ directory in the root of the repository, where each .groovy file is callable within a
Jenkinsfile. For more detailed information on the full repository structure, see Extending with Shared Libraries.
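For example, a minimal script at vars/sayHello.groovy (a hypothetical name) defines a call method, and the file name becomes the step name in the Jenkinsfile:

// vars/sayHello.groovy
// The file name becomes the step name: sayHello("Jenkins")
def call(String name = "human") {
    // echo is a standard Jenkins step; any pipeline step may be used here
    echo "Hello, ${name}."
}

A Jenkinsfile with the library loaded can then simply invoke sayHello("Jenkins") inside any steps block.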
There are two types of shared libraries: a Global Shared Library and, what I call, a Local Shared Library.
A Global Shared Library is one that is configured at the system level of a Jenkins installation, and its functions are automatically injected into each build. These scripts have unrestricted access to anything on the Jenkins classpath and are also able to use Grape to load their own dependencies if needed. These libraries should be used sparingly and under strict access controls, as the code has full, unrestricted access to the entire Jenkins instance (settings/credentials/filesystem/etc.). These types of libraries can be used to provide very rich functionality without the need to develop a Jenkins plugin.
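As an illustration (the script name and webhook URL are hypothetical), a Global Shared Library script could pull in an HTTP client with Grape, something a sandboxed library cannot do:

// vars/notifySlack.groovy -- only viable in a Global Shared Library,
// since @Grab resolves dependencies outside the Groovy sandbox
@Grab('org.apache.httpcomponents:httpclient:4.5.14')
import org.apache.http.client.methods.HttpPost
import org.apache.http.entity.StringEntity
import org.apache.http.impl.client.HttpClients

def call(String message) {
    def client = HttpClients.createDefault()
    def post = new HttpPost("https://hooks.slack.com/services/<your-webhook-path>")
    post.entity = new StringEntity(/{"text": "${message}"}/)
    client.execute(post).withCloseable { response ->
        echo "Slack responded with status ${response.statusLine.statusCode}"
    }
}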
A Local Shared Library is one that is only locally available to a build. It can either be declared at
a folder scope or declared directly in a Jenkinsfile itself. Local libraries run in the Jenkins
Groovy Sandbox, which places heavy restrictions on which features of the Groovy/Java language can
actually be used. What is available can be controlled in the Jenkins system settings, but I would recommend
leaving the defaults and using these libraries strictly to encapsulate and generalize what would normally be
directly declared in a build file.
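For example, assuming a local library has been configured at the folder level under the name jenkins-shared-library, it can be loaded with the @Library annotation (the dynamic library step, shown later in this post, is the declare-in-Jenkinsfile alternative):

// The trailing underscore is required when the annotation
// is not attached to an import statement
@Library('jenkins-shared-library@main') _

pipeline {
    ...
}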
Where a script can be used and what it does within a Jenkinsfile varies. The two main uses are to provide
functionality within the steps section of a stage or to encapsulate an entire pipeline (all stages and steps).
It is up to the developer of the script to document how it is intended to be used. The instructions below
assume a library repository has already been created and configured in Jenkins for auto-injection into the
build. In the Papio Pipelines Use section, we'll talk about how to easily get a Local Shared Library
working in a Jenkinsfile using Papio Pipelines: Managed Jenkins for GitHub.
A build step script takes a series of steps, or a complicated step, and consolidates it into a simpler DSL to
be called within the steps section of a pipeline. For example, let's say you want to encapsulate the
website deployment step.
...
// prod deploy stage
environment {
    AWS_ACCESS_KEY_ID = credentials("aws-access-key-id")
    AWS_SECRET_ACCESS_KEY = credentials("aws-secret-access-key")
}
steps { sh "aws s3 sync ./build s3://${env.REPO_NAME}-prod --delete" }
...
Website deploy to S3 step

To do this, a file vars/deployWebsite.groovy can be added to the shared library repository. In this file,
we'll add a single method named call, which is required for the script to be callable as a function by
Jenkins. We'll also pass the environment name as an argument to this function so it can be reused for both
the test and prod deploy steps. It is standard to use a Map as the argument type so that multiple
configuration properties can be passed to the step.
def call(Map props) {
    if (!props.environment) {
        error("'environment' property must be supplied to 'deployWebsite'")
    }
    sh "aws s3 sync ./build s3://${env.REPO_NAME}-${props.environment} --delete"
}
deployWebsite.groovy example file

In this script, calls are made exactly as if the code were running directly in a Jenkinsfile. When checking
for required properties, the error step is used to fail the build if the expected property is not supplied.
The sh step is then used to call the AWS CLI exactly as it was in the Jenkinsfile. With this script in place,
the "deploy" stage steps now look like this:
...
// test deploy stage
environment {
    AWS_ACCESS_KEY_ID = credentials("aws-access-key-id")
    AWS_SECRET_ACCESS_KEY = credentials("aws-secret-access-key")
}
steps { deployWebsite(environment: "test") }
...
// prod deploy stage
environment {
    AWS_ACCESS_KEY_ID = credentials("aws-access-key-id")
    AWS_SECRET_ACCESS_KEY = credentials("aws-secret-access-key")
}
steps { deployWebsite(environment: "prod") }
...
Website deploy to S3 step using the deployWebsite step

One could argue that this logic could easily be encapsulated in the scripts section of the package.json,
which is true. That said, it's possible to take this one step further and make it even more useful by setting
up the environment as part of the deployWebsite script. For this, the withCredentials step is used to
automatically inject the AWS access key and secret key into the shell when the CLI command is executed.
def call(Map props) {
    if (!props.environment) {
        error("'environment' property must be supplied to 'deployWebsite'")
    }
    withCredentials([
        string(credentialsId: "aws-access-key-id", variable: "AWS_ACCESS_KEY_ID"),
        string(credentialsId: "aws-secret-access-key", variable: "AWS_SECRET_ACCESS_KEY")
    ]) {
        sh "aws s3 sync ./build s3://${env.REPO_NAME}-${props.environment} --delete"
    }
}
deployWebsite.groovy example file with credential injection

Using withCredentials, the pipeline stage can now be simplified even further:
pipeline {
    agent any
    stages {
        stage("Init") {
            steps { sh "yarn" }
        }
        stage("Build") {
            steps { sh "yarn build" }
        }
        stage("Test") {
            steps { sh "yarn test" }
        }
        stage("Deploy Feature") {
            agent { docker { image "amazon/aws-cli" } }
            when { not { branch "main" } }
            steps { deployWebsite(environment: "test") }
        }
        stage("Deploy Production") {
            agent { docker { image "amazon/aws-cli" } }
            when { branch "main" }
            steps { deployWebsite(environment: "prod") }
        }
    }
}
Sample Jenkinsfile for a React pipeline using the deployWebsite scripted step

While a step script is great for encapsulating the logic and configuration required for reusable steps across
multiple stages, the encapsulation can be taken even further by wrapping an entire pipeline configuration in a
script. To do this, a file is created, like before, with a call method. This file will be
vars/reactPipeline.groovy, and the entire pipeline section is pasted inside the implementation.
def call(Map props) {
    pipeline {
        agent any
        stages {
            stage("Init") {
                steps { sh "yarn" }
            }
            stage("Build") {
                steps { sh "yarn build" }
            }
            stage("Test") {
                steps { sh "yarn test" }
            }
            stage("Deploy Feature") {
                agent { docker { image "amazon/aws-cli" } }
                when { not { branch "main" } }
                steps { deployWebsite(environment: "test") }
            }
            stage("Deploy Production") {
                agent { docker { image "amazon/aws-cli" } }
                when { branch "main" }
                steps { deployWebsite(environment: "prod") }
            }
        }
    }
}
reactPipeline.groovy script that encapsulates an entire pipeline config

This simplifies the entire Jenkinsfile down to a single call to this script!
reactPipeline()
Sample Jenkinsfile using the reactPipeline.groovy DSL

We've officially externalized the entire pipeline into a shared library. It's now possible to seamlessly
configure the pipelines for all repositories. To add a new Lighthouse stage after the deploy to test, just
include the logic in the reactPipeline.groovy script as if it were a Jenkinsfile.
...
stage("Deploy Feature") {
    ...
}
stage("Check Lighthouse") {
    agent { docker { image "<image with lighthouse and chrome installed>" } }
    when { not { branch "main" } }
    steps {
        // Follows the convention that REPO_NAME is the DNS name of the website
        sh "lighthouse --output json --output-path ./report.json --quiet --chrome-flags='--headless' https://${env.REPO_NAME}"
        script {
            Map failures = [:]
            def report = readJSON(file: "report.json")
            // The categories field is a map keyed by category id, so iterate its values
            for (def category : report.categories.values()) {
                if (category.score < 0.8) {
                    failures[category.id] = category.score
                }
            }
            if (failures) {
                error("Lighthouse failures detected: ${failures}")
            }
        }
    }
}
...
reactPipeline.groovy updated to add a Lighthouse check stage

From here, the next steps would be to update reactPipeline.groovy to use props to control settings and
provide defaults. For example, you may want to configure the website output folder to sync to S3, whether to
use NPM or Yarn, the test/production bucket names/URLs, the Lighthouse scores to fail on, etc. You may have
also noticed we were able to use the deployWebsite step from within our pipeline script. That is because all
vars scripts are callable from each other. We could also abstract the Check Lighthouse stage steps into a
separate script to further simplify the reactPipeline() script.
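As a sketch of that direction (the property names here are hypothetical), call can accept a Map and apply defaults with Groovy's Elvis operator:

def call(Map props = [:]) {
    // Hypothetical settings with sensible defaults
    def outputDir      = props.outputDir ?: "build"
    def packageManager = props.packageManager ?: "yarn"

    pipeline {
        agent any
        stages {
            stage("Init") {
                steps { sh "${packageManager} install" }
            }
            stage("Build") {
                steps { sh "${packageManager} run build" }
            }
            // ... remaining stages as before, syncing ./${outputDir} to S3
        }
    }
}

A repository that diverges from the defaults can then opt out selectively, for example reactPipeline(packageManager: "npm").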
Now that we've shown how to create a shared library and define scripts to encapsulate build or pipeline logic, we'll show how this process has been streamlined for use within Papio Pipelines, a managed Jenkins for GitHub solution. For instructions on installing the app and setting up your first build, see our Welcome blog post.
Pipelines provides a custom plugin with a special DSL available for use in a Jenkinsfile on the app. For
shared libraries, it provides the ability to easily inject a local shared library into the build using a
simple DSL. For reference, using the Jenkins DSL only, below is how you would inject a private shared library
repository from your GitHub organization into a Jenkinsfile:
library(
    identifier: 'jenkins-shared-library@main',
    retriever: modernSCM(
        [
            $class: 'GitSCMSource',
            remote: 'https://github.com/my-organization/jenkins-shared-library.git',
            credentialsId: 'my-github-credentials'
        ]
    )
)
pipeline {
    ...
}
Jenkinsfile with shared library applied using the default Jenkins DSL

To get this working, you would need to generate credentials for GitHub and make sure they are available to
the build. You also need to have this difficult-to-read configuration in each Jenkinsfile looking to use the
library. If anything were to change with the repository name or credentials ID, you're back to making changes
in every repository. Using Pipelines, the DSL is simplified to:
gitHubLibrary('jenkins-shared-library')
pipeline {
...
}
Jenkinsfile with shared library applied using the Papio Pipelines DSL

This DSL makes it much easier to see what is going on in the Jenkinsfile. It also works seamlessly whether
the repository is public or private within the organization of the build. We, at Papio Cloud, also provide a
Shared Library Template for creating new shared libraries that provides examples for scripts
as well as examples for unit testing scripts with Spock.
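As a small preview of what that looks like, here is a minimal sketch of a Spock spec for deployWebsite.groovy, assuming the spec runs from the library repository root and stubs the pipeline steps through Groovy's metaClass (the template may structure things differently):

import spock.lang.Specification

class DeployWebsiteSpec extends Specification {
    def "fails the build when 'environment' is missing"() {
        given: "the script with the error step stubbed to throw"
        def script = new GroovyShell().parse(new File("vars/deployWebsite.groovy"))
        def messages = []
        script.metaClass.error = { String msg ->
            messages << msg
            throw new IllegalStateException(msg)
        }

        when:
        script.call([:])

        then:
        thrown(IllegalStateException)
        messages.first().contains("'environment'")
    }
}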
With Shared Libraries, Jenkins provides a more powerful build encapsulation mechanism than most other continuous integration/deployment solutions out there. Encapsulating builds allows us to adapt to new requirements in the same way we would when writing any other type of software. Using Papio Pipelines, we can get an even more powerful experience through its automatic integration with GitHub. In a future blog post, we'll cover how to unit test Shared Libraries.