Jenkins Pipeline Tutorial
The Jenkins Pipeline plugin is an exciting new way of handling software builds in Jenkins. The central concept is Pipeline as Code: your build logic lives in a file that is versioned along with your application source code. This tutorial guides you through the parts you need to understand.
This tutorial has been updated to Jenkins 2.28.
Get Jenkins
The fastest way to get started is to install Docker and run:
docker run -p 8080:8080 --name jenkins-tutorial \
--env JAVA_OPTS='-Dhudson.model.DirectoryBrowserSupport.CSP=' \
jenkinsci/jenkins
Setting the system property hudson.model.DirectoryBrowserSupport.CSP to an empty value is a workaround for the Content Security Policy restrictions that break the Javadoc viewer. See: Configuring Content Security Policy.
Open Jenkins by navigating to http://dockerhost:8080/, where dockerhost
is the hostname or IP address of your Docker host.
Minimal Jenkins configuration
If you’ve selected the recommended plugins during installation of Jenkins you’re all set. If you opted to install none, you need to do the following.
Go to Manage Jenkins > Manage Plugins
On the Available tab, select the following plugins (use the filter box at the top right):
- Pipeline
- Git plugin
Click Install without restart
Go back to Manage Plugins. On the Updates tab, select all and click Download now and install after restart
This will update the default plugins, install several new plugins and create a new job type called Pipeline. After Jenkins is finished restarting you are good to go.
Create a simple Pipeline
Click New Item, name it simple-pipeline, select Pipeline as the type and click OK. Enter the following script in the Pipeline section:
echo "Hello world"
Click Save and then Build Now
Click on build #1 in the Build History and then Console Output
Started by user anonymous
[Pipeline] echo
Hello world
[Pipeline] End of Pipeline
Finished: SUCCESS
What did I just do?
You executed a Groovy script using the Jenkins Pipeline DSL. The echo
method simply outputs a message to the console. Because it’s Groovy, you could have done something like this:
System.out.println "Hello world" // written to standard out, not to the console
However, calls like this are not permitted when running inside the Groovy Sandbox. You need to explicitly approve these method calls on the In-process Script Approval page.
Note in Groovy you can do println without specifying System.out, but in the Pipeline DSL it is an alias of echo.
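For example, inside a Pipeline script both of the following produce the same console output:

println 'Hello world'
echo 'Hello world'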
Groovy Sandbox and Script Approval
At some point you may want to use a Groovy language feature which is not allowed in the Groovy Sandbox. When that happens, you’ll be greeted by a RejectedAccessException
, such as:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: ↵
Scripts not permitted to use method java.lang.String replaceAll java.lang.String java.lang.String
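For example, a script like the following would trigger that exact exception when run inside the Groovy Sandbox (a minimal illustration):

node {
    // String.replaceAll(String, String) is not whitelisted by default
    echo "Hello world".replaceAll('world', 'Jenkins')
}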
Fortunately there’s a way to whitelist methods on the In-process Script Approval page.
Go to Manage Jenkins > In-process Script Approval
Review the pending signatures and click Approve to add them to the whitelist
Re-run your job; it should no longer fail (for this particular method call)
Some new terms
- DSL
- Domain Specific Language; a set of methods specific to a domain, in this case build pipelines
- Groovy
- A modern JVM language compatible with Java; in this case interpreted as script (not compiled)
- Sandbox
- A closed environment which can safely execute arbitrary code
Executors and Workspaces
Configure the job (or create a new one) and change the script to the following. The node
method allocates an executor and creates a (temporary) workspace.
node {
echo "Hello world"
}
Started by user anonymous
[Pipeline] Allocate node : Start
Running on master in /var/jenkins_home/jobs/simple-pipeline/workspace
[Pipeline] node {
[Pipeline] echo
Hello world
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
Finished: SUCCESS
If you don’t specify a name or label for the node, any available node may be selected. This example selects a specific type of node:
node('wildfly') {
// some block executing on a Jenkins slave node called 'wildfly'
}
Some new terms
- Node
The Jenkins Master or a Jenkins Slave (to which the Jenkins Master delegates builds; see: Distributed builds)
- Workspace
- A directory on the server where your job is allowed to store and manage files, such as source code checked out from a version control system (VCS). Any files (artifacts) you wish to keep after the job finishes need to be archived
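For example, a minimal sketch that creates a file in the workspace and archives it (the file name is just an illustration):

node {
    // files written to the workspace are temporary...
    sh 'echo "build output" > result.txt'
    // ...unless they are archived as artifacts
    archive 'result.txt'
}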
Loading scripts
The Pipeline DSL allows you to load other scripts, enabling sharing of build logic. Optionally the script can return a reference to itself, allowing you to store the script in a variable for later use.
Configure the job and change the script to the following. The created script is executed immediately after loading.
node {
    // Use the shell to create the file 'script.groovy'
    sh '''echo '
    echo "Hello from script"
    ' > script.groovy'''

    load 'script.groovy'
}
Started by user anonymous
[Pipeline] Allocate node : Start
Running on master in /var/jenkins_home/jobs/simple-pipeline/workspace
[Pipeline] node {
[Pipeline] sh
[workspace] Running shell script
++ echo echo "Hello from script"
[Pipeline] load: Loaded script: script.groovy
[Pipeline] load {
[Pipeline] echo
Hello from script
[Pipeline] } //load
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
Finished: SUCCESS
Configure the job and change the script to the following. The created script defines a hello(name) method. The return this statement gives the caller a reference to the Groovy Script object, which it can invoke at any time.

node {
    // Use the shell to create the file 'script.groovy'
    sh '''echo '
    def hello(name) {
        echo "Hello ${name} from script"
    }
    return this
    ' > script.groovy'''

    def script = load 'script.groovy'
    script.hello('Roy')
}
Started by user anonymous
[Pipeline] Allocate node : Start
Running on master in /var/jenkins_home/jobs/simple-pipeline/workspace
[Pipeline] node {
[Pipeline] sh
[workspace] Running shell script
++ echo def hello(name) { echo "Hello ${name} from script" } return this
[Pipeline] load: Loaded script: script.groovy
[Pipeline] load {
[Pipeline] } //load
[Pipeline] echo
Hello Roy from script
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
Finished: SUCCESS
Loading a script from another Git repository
This requires the Pipeline Remote File Loader plugin. The example assumes you have a repository somewhere that contains a Pipeline.groovy
file, which you want to download to your Jenkins workspace.
Go to Manage Jenkins > Manage Plugins and install Pipeline Remote File Loader (without restart)
Configure a Pipeline job and set the Pipeline script to the following:
node {
    stage 'Load pipeline script'
    def pipeline = fileLoader.fromGit(
        'Pipeline.groovy',
        'https://bitbucket.org/your-account/your-build-scripts.git',
        'master',                      // use a branch or tag
        'your-account-credential-id',  // ID from Credentials plugin
        ''
    )
    checkout scm
    pipeline.execute()
}
Jenkinsfile
Instead of defining the Pipeline DSL inside the job we can create a special file in the root of a source code repository called Jenkinsfile
, which is automatically executed after the repository is checked out.
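For example, a minimal Jenkinsfile matching the console output below could look like this:

node {
    echo 'Hello from Jenkinsfile'
}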
Click New Item, name it jenkinsfile-pipeline, select Pipeline as the type and click OK. At Pipeline, Definition select Pipeline script from SCM
- SCM: Git
- Repository URL: https://bitbucket.org/your-account/your-repo.git
- Set credentials if this is not a public repository
- Script Path: Jenkinsfile (this is the default value)
Click Save and then Build Now
Click on build #1 in the Build History and then Console Output
Started by user anonymous
Cloning the remote Git repository
Cloning repository https://bitbucket.org/your-account/your-repo.git ...
[Pipeline] Allocate node : Start
Running on master in /var/jenkins_home/jobs/jenkinsfile-pipeline/workspace
[Pipeline] node {
[Pipeline] echo
Hello from Jenkinsfile
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
Finished: SUCCESS
Multibranch Pipeline
A Multibranch Pipeline is a job type that scans all branches in a given repository for a Jenkinsfile. It automatically creates a job for each branch inside a folder and executes the Jenkinsfile for each of them. This is useful in common branching workflows such as Git Flow, which has naming conventions for branches, like the feature/ prefix for feature branches.
Go to Manage Jenkins > Manage Plugins and install the Multibranch: Pipeline plugin (without restart)
Click New Item, name it multibranch-pipeline, select Multibranch Pipeline as the type and click OK
At Branch Sources click Add source, Git
- Project Repository: https://bitbucket.org/your-account/your-repo.git
- Set credentials if this is not a public repository
Click Save
The plugin will automatically scan the repository for branches and create a job for each; refresh the page if necessary to reveal them. The newly created jobs are then executed automatically.
When one of the jobs fails, the cause could again be a script call signature that needs to be whitelisted. Go to /scriptApproval/ again to approve the signature.
Note the environment variable
BRANCH_NAME
allows you to detect which branch the Jenkinsfile is currently running in.
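For example, a minimal sketch:

node {
    echo "Building branch ${env.BRANCH_NAME}"
}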
Running Gradle tasks with Jenkins Pipeline
The first thing you need to decide is whether you want to install Gradle as a tool within Jenkins, or use the Gradle Wrapper script in your repository.
Gradle Wrapper
The recommended way is to always use the Gradle Wrapper via the gradlew script in the root of your repository, which automatically downloads and installs the required Gradle version. First generate the wrapper by running the wrapper task inside your project:
$ gradle wrapper
:wrapper
BUILD SUCCESSFUL
Total time: 1.234 secs
The following files are created by the wrapper task:
- gradle/wrapper/gradle-wrapper.jar
- gradle/wrapper/gradle-wrapper.properties
- gradlew
- gradlew.bat
You need to push the files to the remote tracking branch so Jenkins can access them:
git add gradlew* gradle/wrapper
git commit -m "Add Gradle Wrapper"
git push
Then you can access the Gradle Wrapper like this from your Jenkinsfile:
node {
sh './gradlew tasks'
}
You should probably create a friendly method to wrap the call:
def gradle(command) {
sh "./gradlew ${command}"
}
node {
gradle 'tasks'
}
Gradle plugin
Alternatively, you can manage the Gradle installation within Jenkins using the Gradle plugin.
Go to Manage Jenkins > Manage Plugins and install Gradle plugin (without restart)
Go to Manage Jenkins > Configure System
At Gradle, click Add Gradle
- Name: gradle
- Install automatically (from gradle.org): checked
- Version: 2.10
You can access this installation from your Jenkinsfile via the tool
method, for example:
node {
def GRADLE_HOME = tool name: 'gradle', type: 'hudson.plugins.gradle.GradleInstallation'
sh "${GRADLE_HOME}/bin/gradle tasks"
}
Note the tool method returns GRADLE_HOME, not the actual location of the Gradle executable (GRADLE_HOME/bin/gradle)
Wrapped in a friendly method:
def gradle(command) {
sh "${tool name: 'gradle', type: 'hudson.plugins.gradle.GradleInstallation'}/bin/gradle ${command}"
}
Multiple versions of Gradle
The Gradle plugin allows you to add multiple Gradle versions using different names, so your build script can target a specific version using the name
parameter of the tool
method:
def gradle(command) {
sh "${tool name: 'gradle-2.10', type: 'hudson.plugins.gradle.GradleInstallation'}/bin/gradle ${command}"
}
Implementing Git Flow
Git Flow is a branching workflow for Git which defines the following branches:
- feature/* for feature branches; merge back into develop
- develop for ongoing development work
- release/* to prepare production releases; merge back into develop and tag master
- master for production-ready releases
- hotfix/* to patch master quickly; merge back into develop and tag master
Depending on the branch name we may want to run different build steps. A clean way to do this is to create separate methods for each type of branch and use a simple if/else to call the correct one:
def buildFeatureBranch() {
echo "Feature branch"
}
def buildDevelopBranch() {
echo "Develop branch"
}
def buildReleaseBranch() {
echo "Release branch"
}
def buildMasterBranch() {
echo "Master branch"
}
def buildHotfixBranch() {
echo "Hotfix branch"
}
node {
checkout scm
def name = env.BRANCH_NAME
if (name.startsWith('feature/')) {
buildFeatureBranch()
} else if (name == 'develop') {
buildDevelopBranch()
} else if (name.startsWith('release/')) {
buildReleaseBranch()
} else if (name == 'master') {
buildMasterBranch()
} else if (name.startsWith('hotfix/')) {
buildHotfixBranch()
} else {
error "Don't know what to do with this branch: ${name}"
}
}
Build steps
Each build step may result in a failed build, produce artifacts, etc. For example, running tests produces test reports, but also fails the build if one or more tests fail. The next sections contain some (opinionated) implementations of build steps. At a high level, your build for a specific branch may look like this:
def buildDevelopBranch() {
test()
build()
sonar()
javadoc()
deploy(env.JBOSS_TST)
}
All implementations invoke the Gradle Wrapper via the gradle
method:
void gradle(String command) {
sh "set +x && ./gradlew ${command}"
}
Gradle Test
This captures the test results even if the build fails.
/**
* Runs tests and archives the reports.
*/
void test() {
stage name: 'test', concurrency: 1
try {
gradle 'clean test'
} finally {
step $class: 'JUnitResultArchiver', allowEmptyResults: true, testResults: '**/build/test-results/TEST-*.xml'
}
}
Gradle Build
/**
* Builds and archives the WAR file.
*/
void build() {
stage name: 'build', concurrency: 1
gradle 'build'
archive 'build/libs/*.war'
}
SonarQube analysis
This will create a SonarQube project named after the Jenkins job name, followed by the branch name. This means any branch can be analysed without overwriting statistics of other branches.
/**
* Runs Sonar analysis.
*/
void sonar() {
def sanitize = {
it.replaceAll('[^A-Za-z0-9_:\\.\\-]', '')
}
String name = sanitize(env.JOB_NAME.split('/')[0])
String branch = sanitize(env.BRANCH_NAME.replaceAll('/','-'))
stage name: 'sonar', concurrency: 1
gradle "sonarqube -Dsonar.projectKey=${name}:${branch} -Dsonar.projectName=${name} -Dsonar.branch=${branch}"
}
Note this uses the
String.replaceAll
method which must be whitelisted in Script Approval
Gradle build configuration
This step requires the Sonar plugin in your build.gradle
file, including the URL of SonarQube:
plugins {
id 'org.sonarqube' version '1.2'
}
sonarqube {
properties {
// Override example: gradle sonarqube -Psonar=http://other.sonar.host
property 'sonar.host.url', project.properties.sonar ?: 'http://your.sonar.host'
}
}
You can also omit the sonarqube block and set sonar.host.url
in gradle.properties
:
systemProp.sonar.host.url=http://your.sonar.host
Javadoc
The standard implementation uses the javadoc
task of Gradle’s java
plugin. The results are captured by the Jenkins Javadoc Plugin and can be accessed via the Javadoc link on the job’s detail page.
/**
* Generates and archives Javadoc.
*/
void javadoc() {
gradle 'javadoc'
step $class: 'JavadocArchiver', javadocDir: 'build/docs/javadoc', keepAll: false
}
Optional Gradle build configuration
If you’re doing a multi-project build and you wish to generate a single Javadoc for all subprojects, you can redefine the javadoc task in your build.gradle
:
Note this assumes the Gradle
java
plugin is applied
javadoc {
description 'Generates and merges Javadoc API documentation for all subprojects.'
source subprojects.collect { project -> project.sourceSets.main.allJava }
classpath = files(subprojects.collect { project ->
project.sourceSets.main.compileClasspath
})
}
subprojects {
javadoc.enabled = false
}
Deploying to Wildfly using the Gradle Cargo Plugin
The simple variant that builds the WAR and deploys it to a remote Wildfly:
/**
* Performs a deployment to a remote Wildfly server, undeploying the existing deployment first if required
*/
void deployWildfly(String hostname, String username, String password) {
stage name: 'deploy', concurrency: 1
gradle "war cargoRedeployRemote -PwildflyHostname=${hostname} -PwildflyUsername=${username} -PwildflyPassword=${password}"
}
Gradle build configuration
This configuration deploys to a remote Wildfly (or JBoss EAP 7) server. It needs to download two dependencies so you need to configure both repositories
and dependencies
.
plugins {
id 'java'
id 'war'
id 'com.bmuschko.cargo' version '2.2.2'
}
repositories {
jcenter()
}
cargo {
containerId = 'wildfly9x'
// port = 9999 // wildfly9x container uses port 9990 by default
deployable {
// you can also control the context via the <context-root/> attribute in WEB-INF/jboss-web.xml
context = rootProject.name
}
remote {
hostname = project.properties.wildflyHostname ?: '192.168.99.100'
username = project.properties.wildflyUsername ?: 'admin'
password = project.properties.wildflyPassword ?: 'admin'
}
}
dependencies {
cargo 'org.wildfly.core:wildfly-controller-client:2.0.10.Final'
cargo 'org.codehaus.cargo:cargo-ant:1.4.15'
}
Capturing Wildfly’s server.log
Sometimes the deployment task fails but the Jenkins console contains no hints about what went wrong. And even if the deployment was successful, you may want to look at Wildfly’s server.log. The following method archives the last 1,000 lines:
/**
* Archives the last 1,000 lines of the server.log for the given Wildfly server.
*/
void archiveServerLog(String hostname, String username, String password) {
def json = '{"operation":"read-attribute","address":[{"subsystem":"logging"},{"log-file":"server.log"}],"name":"stream"}'
def lines = 1000
sh "set +x && curl -s -S -L --digest http://${hostname}:9990/management?useStreamAsResponse --header 'Content-Type: application/json' -u ${username}:${password} -d '${json}' | tail -${lines} > ${hostname}.server.log"
archive "${hostname}.server.log"
}
Then you can do this, which always captures the log:
void deploy(String hostname, String username, String password) {
try {
deployWildfly(hostname, username, password)
} finally {
archiveServerLog(hostname, username, password)
}
}
Or this, which only captures it on error:
void deploy(String hostname, String username, String password) {
try {
deployWildfly(hostname, username, password)
} catch(Exception e) {
archiveServerLog(hostname, username, password)
throw e
}
}
Note the exception is re-thrown, otherwise Jenkins assumes the build succeeded.
Parallelized deployments
When you need to deploy to multiple servers, you can do that using the parallel method, which accepts a Map of String to Closure:
parallel 'task1': {}, 'task2': {}
For example:
parallel([
    'wildfly01t': { deploy('wildfly01t', 'user', 'password') },
    'wildfly02t': { deploy('wildfly02t', 'user', 'password') }
])
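If the list of servers grows, you could also build the map programmatically; a minimal sketch using the hostnames from the example above:

def deployments = [:]
for (host in ['wildfly01t', 'wildfly02t']) {
    def server = host // capture the loop variable for use inside the closure
    deployments[server] = { deploy(server, 'user', 'password') }
}
parallel deployments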
Publishing artifacts to Artifactory
Artifactory is software that hosts (binary) artifacts. It is often used as a Maven repository. The following examples assume that a Maven repository called wars-release-local
has been created.
Using the Gradle Maven Plugin
void publish(String repository, String username, String password) {
stage name: 'publish', concurrency: 1
gradle "uploadArchives -Prepository=${repository} -PrepositoryUsername=${username} -PrepositoryPassword=${password}"
}
Gradle build configuration
plugins {
id 'maven'
}
configurations {
artifactory
}
dependencies {
artifactory group: 'org.apache.maven.wagon', name: 'wagon-http', version: '2.2'
}
uploadArchives {
repositories.mavenDeployer {
configuration = configurations.artifactory
repository(url: "${project.properties.repository}/wars-release-local") {
authentication(
userName: project.properties.repositoryUsername,
password: project.properties.repositoryPassword
)
}
}
}
Using the Gradle Artifactory Plugin
void publish(String repository, String username, String password) {
stage name: 'publish', concurrency: 1
gradle "artifactoryPublish -Prepository=${repository} -PrepositoryUsername=${username} -PrepositoryPassword=${password}"
}
Gradle build configuration
For more details see the official documentation.
plugins {
id 'com.jfrog.artifactory' version '4.0.0'
}
artifactory {
contextUrl = project.properties.repository
publish {
repository {
repoKey = 'wars-release-local'
username = project.properties.repositoryUsername
password = project.properties.repositoryPassword
}
}
}
Accessing credentials using the Jenkins Credentials Binding Plugin
Storing usernames and passwords in version control is really bad. Storing them in Jenkins environment variables is not much better. Jenkins has a “Credentials” database which makes it easy to re-use them and offers a tiny bit of additional protection. The withCredentials
step allows for a “safe” way to limit the exposure of these usernames and passwords in a build script.
Note you first need to install the Credentials Binding Plugin in Jenkins
withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'some-credentials-id',
usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD']]) {
// You can now use the USERNAME and PASSWORD environment variables
}
Note the ID associated with a credential can be found by going to Credentials > Global credentials and clicking on the credential. Next, click Update and then Advanced…; the ID field contains the credentials ID.
To avoid echoing the password to the console you should prefix shell scripts with set +x
to disable xtrace:
sh '''
set +x
./gradlew uploadArchives -Dusername=$USERNAME -Dpassword=$PASSWORD
'''
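Putting it together, you could combine withCredentials with the deploy method from earlier; a sketch assuming a credentials entry with the (hypothetical) ID wildfly-tst-credentials:

node {
    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'wildfly-tst-credentials',
                      usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD']]) {
        // USERNAME and PASSWORD are only exposed inside this block
        deploy('wildfly01t', env.USERNAME, env.PASSWORD)
    }
}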