How to Use Jenkins Pipeline

Nov 6, 2025 - 10:11

Jenkins Pipeline is a powerful, code-driven approach to defining continuous integration and continuous delivery (CI/CD) workflows. Unlike traditional Jenkins jobs that rely on GUI-based configuration, Jenkins Pipeline allows teams to define their entire build, test, and deployment processes as code, stored in version control alongside the application itself. This paradigm shift enables greater consistency, repeatability, and collaboration across development, operations, and QA teams. By leveraging a domain-specific language (DSL) based on Groovy, Jenkins Pipeline offers flexibility, scalability, and auditability that traditional job configurations simply cannot match.

As organizations increasingly adopt DevOps practices, the need for reliable, automated, and transparent CI/CD pipelines has never been greater. Jenkins Pipeline addresses this need by providing a unified, version-controlled mechanism to orchestrate complex workflows across multiple environments. Whether you're deploying a simple web application or managing microservices across hybrid cloud infrastructures, Jenkins Pipeline empowers teams to automate every stage of the software delivery lifecycle with precision and control.

This tutorial provides a comprehensive, step-by-step guide to using Jenkins Pipeline, from initial setup to advanced best practices. You'll learn how to write, test, and maintain production-grade pipelines, integrate with essential tools, and apply real-world patterns that leading engineering teams use daily. By the end of this guide, you'll have the knowledge and confidence to implement Jenkins Pipeline in your own environment, regardless of your prior experience.

Step-by-Step Guide

Prerequisites and Environment Setup

Before diving into Jenkins Pipeline, ensure your environment is properly configured. First, install Jenkins on a server or container that meets the minimum system requirements. Jenkins recommends at least 2 CPU cores and 4 GB of RAM for production use, though development environments can run on lower specifications. You can install Jenkins via package managers (like apt or yum), Docker, or using the official WAR file.
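For example, a quick way to stand up a local instance for experimentation is the official Docker image (the volume name below is an example; ports 8080 and 50000 are the image defaults for the web UI and agent connections):

```groovy
// Run as a shell command on the Docker host:
// docker run -d --name jenkins \
//   -p 8080:8080 -p 50000:50000 \
//   -v jenkins_home:/var/jenkins_home \
//   jenkins/jenkins:lts
```

The named volume keeps `JENKINS_HOME` (jobs, plugins, credentials) intact across container restarts.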

Once Jenkins is installed, access the web interface through your browser at http://your-server:8080. Complete the initial setup by unlocking Jenkins using the admin password found in /var/lib/jenkins/secrets/initialAdminPassword (Linux) or the corresponding location on your OS. Install the recommended plugins during setup, especially Pipeline, Git, Blue Ocean, and Pipeline Utility Steps. These plugins provide the core functionality needed to write, visualize, and manage pipelines.

Next, configure a Jenkins user with appropriate permissions. For security, avoid running Jenkins as root. Instead, create a dedicated system user and assign it ownership of the Jenkins home directory. Ensure that the Jenkins user has read/write access to your source code repositories and deployment targets. If you're using Git, generate an SSH key pair and add the public key to your Git hosting service (GitHub, GitLab, Bitbucket). Then, in Jenkins, navigate to Manage Jenkins > Credentials > System > Global credentials and add the private key as an SSH Username with private key credential. Note the credential ID; it will be referenced in your pipeline scripts.
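A minimal sketch of the key-generation step (the key file name and comment are examples; point `-f` wherever you keep CI keys):

```shell
# Generate a dedicated ed25519 key pair for Jenkins, non-interactively
ssh-keygen -t ed25519 -C "jenkins-ci" -f ./jenkins_ci_key -N ""

# Print the public key so it can be pasted into GitHub/GitLab/Bitbucket
cat ./jenkins_ci_key.pub
```

The private key (`jenkins_ci_key`) is what you paste into the Jenkins credential; the `.pub` file goes to the Git host.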

Creating Your First Pipeline

To create a new pipeline, click New Item on the Jenkins dashboard. Enter a meaningful name (e.g., my-app-ci-cd) and select Pipeline. Click OK. On the configuration page, you'll see several options under the Pipeline section. For this guide, we'll begin with the Pipeline script option, which allows you to write the entire pipeline directly in the Jenkins UI.

Copy and paste the following basic pipeline script into the script box:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Deploy') {
            steps {
                sh 'echo "Deploying to staging..."'
            }
        }
    }
}

This is a minimal but functional pipeline. Let's break it down:

  • pipeline: The root block that defines the entire workflow.
  • agent any: Tells Jenkins to run this pipeline on any available agent (node). You can specify labels like agent { label 'linux' } to target specific machines.
  • stages: A container for all the major phases of your pipeline.
  • stage: Each stage represents a logical step, such as checkout, build, test, or deploy.
  • steps: The actual commands executed within each stage.

Click Save, then click Build Now. Jenkins will execute the pipeline. If your project is a Maven-based Java application, you'll see the build succeed (assuming Maven is installed on the agent). If not, you'll get an error; don't worry, we'll fix that in the next section.

Using Jenkinsfile and Version Control

While writing pipelines directly in the Jenkins UI is useful for testing, it's not suitable for production. The industry standard is to store your pipeline definition in a file called Jenkinsfile at the root of your source code repository. This enables version control, code reviews, and collaboration.

Create a file named Jenkinsfile in your project's root directory and paste the same pipeline script into it. Commit and push this file to your Git repository. Now, return to your Jenkins job configuration. Under Pipeline, change the definition from Pipeline script to Pipeline script from SCM. Select Git as the source code management system, enter your repository URL, and choose the credential you configured earlier. Set the Script Path to Jenkinsfile.

Save the configuration and trigger a new build. Jenkins will now clone your repository, locate the Jenkinsfile, and execute the pipeline defined within it. This approach ensures that your pipeline evolves alongside your codebase. Any changes to the pipeline are tracked, reviewed, and audited just like application code.

Understanding Declarative vs. Scripted Pipeline Syntax

Jenkins Pipeline supports two syntax styles: Declarative and Scripted. The example above uses Declarative Pipeline, which is recommended for most use cases due to its structured, readable format and built-in error handling.

Declarative Pipeline enforces a strict structure with predefined sections like pipeline, agent, stages, steps, and post. It's ideal for standard CI/CD workflows and integrates seamlessly with the Jenkins Blue Ocean UI for visual pipeline rendering.

Scripted Pipeline, on the other hand, uses a more flexible, Groovy-based syntax. It's written inside a node block and allows full access to Groovy's programming features. Here's an equivalent Scripted Pipeline:

node {
    stage('Checkout') {
        checkout scm
    }
    stage('Build') {
        sh 'mvn clean package'
    }
    stage('Test') {
        sh 'mvn test'
    }
    stage('Deploy') {
        sh 'echo "Deploying to staging..."'
    }
}

While Scripted Pipeline offers more power and flexibility, it lacks the built-in structure and error recovery features of Declarative Pipeline. For beginners and most enterprise teams, Declarative is the clear choice. Use Scripted only if you need complex logic, dynamic stages, or advanced Groovy features.
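As one illustration of that flexibility, Scripted Pipeline can generate stages dynamically at runtime, something Declarative's fixed structure does not allow. A sketch (the target list is hypothetical):

```groovy
node {
    // Stage list computed at runtime; Declarative cannot do this
    def targets = ['dev', 'qa', 'staging']
    targets.each { t ->
        stage("Deploy to ${t}") {
            echo "Deploying to ${t}..."
        }
    }
}
```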

Working with Agents and Labels

Jenkins can distribute work across multiple machines called agents (formerly slaves). To scale your CI/CD infrastructure, configure multiple agents with different capabilities, e.g., one for Linux builds, another for Windows testing, and a third for Docker-based deployments.

To label an agent, go to Manage Jenkins > Nodes, select an agent, and under Labels, enter a space-separated list like linux docker maven. Then, in your pipeline, specify the agent using a label:

pipeline {
    agent { label 'linux && maven' }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
    }
}

This ensures the pipeline runs only on agents that have both the linux and maven labels. You can also use multiple agents in a single pipeline:

pipeline {
    agent none
    stages {
        stage('Build on Linux') {
            agent { label 'linux' }
            steps {
                sh 'mvn clean package'
            }
        }
        stage('Test on Windows') {
            agent { label 'windows' }
            steps {
                bat 'mvn test'
            }
        }
    }
}

Using agent none at the top level allows you to define agent requirements per stage, giving you fine-grained control over where each step runs.

Integrating with External Tools

Jenkins Pipeline integrates seamlessly with a wide range of tools. Here are common integrations:

Git and GitHub

Use the checkout step to clone your repository:

steps {
    checkout([$class: 'GitSCM',
        branches: [[name: '*/main']],
        doGenerateSubmoduleConfigurations: false,
        extensions: [],
        userRemoteConfigs: [[url: 'https://github.com/your-org/your-repo.git',
                             credentialsId: 'github-ssh-key']]])
}

Alternatively, use the shorthand checkout scm if your pipeline is configured to pull from SCM.

Docker

To build and push Docker images, install the Docker Pipeline plugin. Then use:

stage('Build Docker Image') {
    steps {
        script {
            docker.build("my-app:${env.BUILD_ID}")
        }
    }
}

To push to a registry:

stage('Push to Registry') {
    steps {
        script {
            docker.withRegistry('https://registry.hub.docker.com', 'docker-hub-credentials') {
                docker.image("my-app:${env.BUILD_ID}").push()
            }
        }
    }
}

Artifactory

Use the Artifactory plugin to upload build artifacts:

stage('Upload to Artifactory') {
    steps {
        script {
            def server = Artifactory.newServer url: 'https://your-artifactory.com', credentialsId: 'artifactory-creds'
            def uploadSpec = """{
                "files": [
                    {
                        "pattern": "target/*.jar",
                        "target": "my-repo/local/"
                    }
                ]
            }"""
            // Upload first; the returned build info is then published
            def buildInfo = server.upload spec: uploadSpec
            server.publishBuildInfo buildInfo
        }
    }
}

Slack Notifications

Install the Slack Notification plugin and configure a webhook. Then send messages:

stage('Notify Slack') {
    steps {
        slackSend color: 'good', message: "Build ${env.JOB_NAME} #${env.BUILD_NUMBER} succeeded! ${env.BUILD_URL}"
    }
}

These integrations make Jenkins Pipeline a true orchestration engine capable of managing end-to-end workflows across your toolchain.

Handling Failures and Recovery

Robust pipelines must handle failures gracefully. Jenkins provides the post section to define actions that run after the pipeline completes, regardless of success or failure.

post {
    always {
        echo 'Cleaning up workspace...'
        cleanWs()
    }
    success {
        slackSend color: 'good', message: "Build succeeded: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
    }
    failure {
        slackSend color: 'danger', message: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
        error 'Pipeline failed. Check logs for details.'
    }
    unstable {
        echo 'Tests failed; build is marked as unstable.'
    }
}

The always block runs in all cases, ideal for cleanup tasks like deleting temporary files or archiving logs. The failure and success blocks allow you to send notifications, trigger rollbacks, or archive artifacts conditionally.

You can also use try/catch blocks within stages for fine-grained error handling:

stage('Run Integration Tests') {
    steps {
        script {
            try {
                sh 'mvn verify'
            } catch (Exception e) {
                currentBuild.result = 'UNSTABLE'
                echo "Integration tests failed: ${e.message}"
            }
        }
    }
}

This approach lets you mark a build as unstable (yellow) instead of failed (red), allowing subsequent stages to continue, useful for reporting test coverage even when tests fail.

Parameterizing Pipelines

Dynamic pipelines accept input parameters to customize behavior without changing code. Add parameters to your pipeline like this:

parameters {
    choice(name: 'ENV', choices: ['staging', 'production'], description: 'Target environment')
    string(name: 'TAG', defaultValue: 'latest', description: 'Docker image tag')
}

These parameters appear as form fields when you click Build with Parameters. Access them in your script using params.ENV or params.TAG:

stage('Deploy') {
    steps {
        sh "deploy.sh --env ${params.ENV} --tag ${params.TAG}"
    }
}

Parameterization is essential for reusable pipelines that serve multiple environments or configurations.

Best Practices

Keep Pipelines Idempotent and Repeatable

A reliable pipeline should produce the same outcome every time it runs, given the same inputs. Avoid hardcoding paths, credentials, or environment-specific values. Use environment variables, credentials stores, and configuration files instead. Always clean up temporary files and artifacts before starting a new build. Use the cleanWs() step to wipe the workspace before checkout.
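A minimal sketch of both practices together; the credential ID is a placeholder, and `credentials()` in the environment block resolves it from the Jenkins credentials store rather than hardcoding the value:

```groovy
pipeline {
    agent any
    environment {
        // Pulled from the credentials store, not hardcoded in the Jenkinsfile
        DEPLOY_TARGET = credentials('deploy-target-url')
    }
    stages {
        stage('Checkout') {
            steps {
                cleanWs()       // start every run from an empty workspace
                checkout scm
            }
        }
    }
}
```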

Use Meaningful Stage Names

Clear, descriptive stage names improve readability and troubleshooting. Instead of stage('Step 1'), use stage('Run Unit Tests'). This helps engineers quickly identify where a failure occurred, especially in complex pipelines with dozens of stages.

Break Down Large Pipelines

As pipelines grow, they become harder to maintain. Use the load step to import shared Groovy libraries:

pipeline {
    agent any
    stages {
        stage('Setup') {
            steps {
                script {
                    def ciLib = load 'vars/ci-library.groovy'
                    ciLib.setup()
                }
            }
        }
        stage('Build') {
            steps {
                script {
                    def ciLib = load 'vars/ci-library.groovy'
                    ciLib.build()
                }
            }
        }
    }
}

Define reusable functions in vars/ci-library.groovy and version them alongside your code. This promotes DRY principles and reduces duplication.
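A minimal sketch of what vars/ci-library.groovy might contain (the function bodies are examples); note that a script consumed by the `load` step must end with `return this` so the caller receives the script object:

```groovy
// vars/ci-library.groovy (hypothetical helper functions)
def setup() {
    echo 'Preparing build environment...'
}

def build() {
    sh 'mvn clean package'
}

return this  // required: `load` hands this object back to the caller
```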

Implement Security Best Practices

Never store secrets in your Jenkinsfile. Use Jenkins Credentials Binding to inject sensitive data securely:

steps {
    withCredentials([string(credentialsId: 'aws-access-key', variable: 'AWS_ACCESS_KEY')]) {
        sh 'aws s3 cp target/app.jar s3://my-bucket/'
    }
}

Also, restrict access to Jenkins jobs using role-based authentication. Avoid granting Admin permissions to developers. Use the Role Strategy plugin to assign granular permissions like Build or Configure based on team roles.

Enable Pipeline Validation and Linting

Before committing your Jenkinsfile, validate it. In Jenkins, open any pipeline job and click Pipeline Syntax to access the snippet generator, which helps you construct and check individual steps. For whole-file checks, Declarative Pipelines can be linted against a running Jenkins instance, either with the CLI command declarative-linter or via the pipeline-model-converter/validate HTTP endpoint (authentication may be required):

stage('Validate Jenkinsfile') {
    steps {
        sh 'curl -s -X POST -F "jenkinsfile=<Jenkinsfile" "$JENKINS_URL/pipeline-model-converter/validate"'
    }
}

Integrate this into your pull request workflow to catch errors early.

Monitor and Log Everything

Enable detailed logging and integrate with centralized monitoring tools like ELK Stack or Datadog. Use echo and println liberally to trace pipeline execution. Avoid silent failures. Log build artifacts, environment variables, and timestamps. This makes debugging far easier when things go wrong.
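For example, with the Timestamper plugin installed, every console line can be prefixed with a timestamp, and key context can be echoed up front. A minimal sketch:

```groovy
pipeline {
    agent any
    options { timestamps() }   // Timestamper plugin: timestamp every log line
    stages {
        stage('Trace Context') {
            steps {
                echo "Node: ${env.NODE_NAME}, Workspace: ${env.WORKSPACE}"
                echo "Build URL: ${env.BUILD_URL}"
            }
        }
    }
}
```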

Version Control Your Pipeline

Your Jenkinsfile is code. Treat it as such. Require code reviews for all changes. Use branching strategies (e.g., Git Flow) to test pipeline changes in feature branches before merging to main. Never edit pipelines directly in the Jenkins UI for production jobs; always use SCM.

Use Blue Ocean for Visualization

Install the Blue Ocean plugin to get a modern, intuitive UI for viewing pipelines. Blue Ocean renders your pipeline as a visual timeline, highlights failures, and provides one-click access to logs. It's especially helpful for non-technical stakeholders who need to understand CI/CD progress.

Tools and Resources

Essential Jenkins Plugins

These plugins significantly enhance Jenkins Pipeline capabilities:

  • Blue Ocean: Modern UI for visualizing and debugging pipelines.
  • Pipeline Utility Steps: Provides useful functions like readJSON, writeJSON, findFiles, and readYaml.
  • Docker Pipeline: Enables building, tagging, and pushing Docker images directly from pipelines.
  • Git: Core plugin for source code checkout and integration.
  • Artifactory: Integrates with JFrog Artifactory for artifact management.
  • Slack Notification: Sends real-time build status updates to Slack channels.
  • Role Strategy Plugin: Enables fine-grained access control for teams and roles.
  • Parameterized Trigger: Allows triggering downstream pipelines with custom parameters.
  • EnvInject: Loads environment variables from files or scripts.

External Tools and Services

Complement your Jenkins Pipeline with these tools:

  • GitHub Actions / GitLab CI: Consider using them for simpler projects; Jenkins excels in complex, hybrid environments.
  • Docker: Containerize your build environments to ensure consistency across agents.
  • Ansible / Terraform: Use them in deployment stages to provision infrastructure.
  • SonarQube: Integrate static code analysis into your pipeline for quality gates.
  • Prometheus + Grafana: Monitor pipeline performance metrics like build duration and failure rates.
  • Alertmanager: Trigger alerts via email or Slack when pipelines fail repeatedly.


Real Examples

Example 1: Java Spring Boot Application

Here's a complete, production-ready pipeline for a Spring Boot microservice:

pipeline {
    agent any
    parameters {
        choice(name: 'ENV', choices: ['dev', 'staging', 'prod'], description: 'Deployment environment')
        string(name: 'IMAGE_TAG', defaultValue: 'latest', description: 'Docker image tag')
    }
    environment {
        DOCKER_REGISTRY = 'docker.io/your-org'
        APP_NAME = 'my-spring-app'
    }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Lint & Analyze') {
            steps {
                sh 'mvn compile'
                sh 'mvn checkstyle:checkstyle'
                sh 'mvn spotbugs:check'
            }
        }
        stage('Build') {
            steps {
                sh 'mvn clean package -DskipTests'
            }
        }
        stage('Unit Tests') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    docker.build("${env.DOCKER_REGISTRY}/${env.APP_NAME}:${params.IMAGE_TAG}")
                }
            }
        }
        stage('Push to Registry') {
            steps {
                script {
                    docker.withRegistry('https://registry.hub.docker.com', 'docker-hub-creds') {
                        docker.image("${env.DOCKER_REGISTRY}/${env.APP_NAME}:${params.IMAGE_TAG}").push()
                    }
                }
            }
        }
        stage('Deploy to Staging') {
            when {
                environment name: 'ENV', value: 'staging'
            }
            steps {
                // Double quotes so Groovy interpolates the registry, app name, and tag
                sh "kubectl set image deployment/my-app my-app=${env.DOCKER_REGISTRY}/${env.APP_NAME}:${params.IMAGE_TAG} --namespace=staging"
            }
        }
        stage('Run Integration Tests') {
            when {
                environment name: 'ENV', value: 'staging'
            }
            steps {
                sh 'curl -f http://my-app.staging.example.com/actuator/health'
            }
        }
        stage('Notify Slack') {
            steps {
                slackSend color: 'good', message: "${env.JOB_NAME} #${env.BUILD_NUMBER} deployed to ${params.ENV} with tag ${params.IMAGE_TAG}"
            }
        }
    }
    post {
        always {
            // Archive before cleaning, or the artifacts are wiped with the workspace
            archiveArtifacts artifacts: 'target/*.jar', allowEmptyArchive: true
            cleanWs()
        }
        failure {
            slackSend color: 'danger', message: "${env.JOB_NAME} #${env.BUILD_NUMBER} failed. Check logs."
        }
    }
}

This pipeline includes:

  • Parameterized environment selection
  • Code linting and static analysis
  • Docker build and push
  • Conditional deployment based on environment
  • Integration test validation
  • Artifact archiving and Slack notifications

Example 2: Multi-Branch Pipeline for Feature Development

Use the Multibranch Pipeline job type to automatically create pipelines for every Git branch. This is ideal for teams practicing feature branching.

Configure a Multibranch Pipeline job to point to your repository. Jenkins will automatically detect branches with a Jenkinsfile and create individual pipelines for each.

Use a conditional stage to skip deployment on feature branches:

stage('Deploy to Production') {
    when {
        branch 'main'
        environment name: 'CI', value: 'true'
    }
    steps {
        sh './deploy-prod.sh'
    }
}

Now, every pull request triggers a build and test on its branch, but only merges to main trigger production deployment.

Example 3: CI/CD for Node.js Application with Cypress

pipeline {
    agent { docker { image 'node:18-alpine' } }
    stages {
        stage('Install Dependencies') {
            steps {
                sh 'npm ci'
            }
        }
        stage('Run Linter') {
            steps {
                sh 'npm run lint'
            }
        }
        stage('Run Unit Tests') {
            steps {
                sh 'npm test'
            }
        }
        stage('Build') {
            steps {
                sh 'npm run build'
            }
        }
        stage('Run E2E Tests') {
            steps {
                sh 'npx cypress run --headless'
            }
        }
        stage('Deploy to S3') {
            steps {
                withCredentials([string(credentialsId: 'aws-creds', variable: 'AWS_CREDENTIALS')]) {
                    sh 'aws s3 sync build/ s3://my-website-bucket/ --delete'
                }
            }
        }
    }
    post {
        always {
            publishHTML(target: [
                reportDir: 'cypress/reports/html',
                reportFiles: 'index.html',
                reportName: 'Cypress Test Report'
            ])
        }
    }
}

This example demonstrates:

  • Using Docker containers for consistent environments
  • Running end-to-end tests with Cypress
  • Generating and publishing HTML test reports
  • Deploying static assets to S3

FAQs

What is the difference between Jenkins Pipeline and Freestyle Jobs?

Jenkins Pipeline defines workflows as code in a Jenkinsfile, stored in version control. Freestyle jobs are configured through the Jenkins UI and are not version-controlled. Pipelines are more scalable, reusable, and auditable. Freestyle jobs are simpler for one-off tasks but lack the structure and automation benefits of pipelines.

Can Jenkins Pipeline run on multiple agents simultaneously?

Yes. Using agent none at the pipeline level and defining agents per stage allows different stages to run on different machines. This is essential for parallel testing across platforms (e.g., Linux, Windows, macOS).
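A sketch of cross-platform parallelism using the declarative parallel block (the agent labels are examples):

```groovy
pipeline {
    agent none
    stages {
        stage('Test Matrix') {
            parallel {
                stage('Linux') {
                    agent { label 'linux' }
                    steps { sh 'mvn test' }
                }
                stage('Windows') {
                    agent { label 'windows' }
                    steps { bat 'mvn test' }
                }
            }
        }
    }
}
```

Both branches run concurrently on their respective agents; the Test Matrix stage fails if either branch fails.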

How do I pass variables between stages in a Jenkins Pipeline?

Use the script block to assign values to variables in the environment or script scope. For example:

def version = '1.0.0'

stage('Build') {
    steps {
        script {
            version = sh(script: 'git describe --tags', returnStdout: true).trim()
        }
    }
}

stage('Deploy') {
    steps {
        sh "deploy --version ${version}"
    }
}

How do I handle secrets securely in Jenkins Pipeline?

Never hardcode secrets. Use Jenkins Credentials Binding with withCredentials to inject secrets as environment variables. Store credentials in the Jenkins Credentials Store using types like Username and password, Secret text, or SSH private key.
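Other credential types bind the same way. For instance, a username/password pair might be injected like this (the credential ID, variable names, and URL are placeholders):

```groovy
withCredentials([usernamePassword(credentialsId: 'nexus-creds',
                                  usernameVariable: 'NEXUS_USER',
                                  passwordVariable: 'NEXUS_PASS')]) {
    // Single-quoted Groovy string: the shell expands the variables,
    // so the secret never appears interpolated in the pipeline log
    sh 'curl -u "$NEXUS_USER:$NEXUS_PASS" https://nexus.example.com/health'
}
```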

Can I trigger a Jenkins Pipeline from a GitHub pull request?

Yes. Install the GitHub Plugin and configure a webhook in your GitHub repository. Then, use the GitHub Pull Request Builder plugin or configure your Multibranch Pipeline to trigger on PR events. This enables automated testing for every pull request.

What happens if a stage fails in a Jenkins Pipeline?

By default, the pipeline stops execution. You can override this behavior using catchError or by setting currentBuild.result to 'UNSTABLE' to allow subsequent stages to run. Use the post section to handle cleanup and notifications regardless of outcome.
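A sketch using catchError to let the run continue while still flagging the stage:

```groovy
stage('Flaky Integration Tests') {
    steps {
        // Mark the build UNSTABLE and the stage as failed,
        // but allow subsequent stages to run
        catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
            sh 'mvn verify'
        }
    }
}
```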

Is Jenkins Pipeline suitable for serverless or cloud-native applications?

Absolutely. Jenkins Pipeline integrates with Kubernetes, AWS Lambda, Azure Functions, and Google Cloud Run. You can use Docker containers as agents, deploy Helm charts, and trigger serverless functionsall within a single pipeline.

How do I debug a failing Jenkins Pipeline?

Use the Blue Ocean UI for visual debugging. Check the console output for error messages. Add echo statements to log variable values. Use the Pipeline Syntax tool to validate steps. Run the pipeline locally using the Jenkins Pipeline Unit Testing framework if possible.

Conclusion

Jenkins Pipeline transforms CI/CD from a series of manual, GUI-driven tasks into a streamlined, automated, and version-controlled process. By writing your build, test, and deployment logic as code, you gain unprecedented control, transparency, and scalability. Whether you're deploying a simple static site or managing a fleet of microservices across hybrid clouds, Jenkins Pipeline provides the foundation for reliable, repeatable software delivery.

This guide has walked you through every critical aspect, from setting up your first pipeline to integrating with Docker, Kubernetes, and external tools. You've learned how to structure pipelines for maintainability, handle failures gracefully, and apply industry best practices that top engineering teams rely on daily.

The key to success lies not just in mastering syntax, but in cultivating a culture of automation, collaboration, and continuous improvement. Start small: convert one manual job into a Jenkinsfile. Then expand. Add tests. Add notifications. Add deployments. Iterate. Over time, your pipeline will evolve into a powerful engine that accelerates delivery, reduces errors, and empowers your entire team.

Jenkins Pipeline is more than a tool; it's a mindset. Embrace it, refine it, and let it become the backbone of your DevOps journey.