Continuous Integration and Continuous Deployment (CI/CD) have become essential practices in modern software development. They enable teams to automate testing, building, and deployment processes, resulting in faster releases, fewer bugs, and more reliable software. Bitbucket Pipelines provides a powerful, integrated CI/CD solution that works seamlessly with your Bitbucket repositories.
In this comprehensive tutorial, we’ll walk through everything you need to know to set up and configure CI/CD pipelines using Bitbucket Pipelines, from basic configurations to advanced deployment strategies.
What is Bitbucket Pipelines?
Bitbucket Pipelines is Atlassian’s integrated CI/CD solution that runs builds and deployments directly within your Bitbucket repositories. It uses Docker containers to provide isolated, reproducible build environments, and it’s configured through a simple YAML file in your repository.
Key Benefits
- Integrated: Works directly with your Bitbucket repository
- Docker-based: Uses Docker containers for consistent build environments
- Flexible: Supports any language or framework
- Cost-effective: Free tier available with reasonable limits
- Easy to configure: Simple YAML-based configuration
Prerequisites
Before we begin, make sure you have:
- A Bitbucket account (free tier works fine)
- A repository with your project code
- Basic knowledge of Git and YAML syntax
- Understanding of your project’s build and deployment process
Step 1: Enable Bitbucket Pipelines
The first step is to enable Pipelines for your repository:
- Navigate to your Bitbucket repository
- Click on the Pipelines tab in the left sidebar
- Click the Enable Pipelines button
- You’ll see a prompt to create your first pipeline configuration
Step 2: Understanding the Pipeline Configuration
Bitbucket Pipelines uses a file named bitbucket-pipelines.yml in the root of your repository to define your CI/CD workflow. This YAML file specifies:
- Which Docker image to use
- What steps to run
- When to run them (on which branches)
- What commands to execute
Basic Structure
image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        name: Build and Test
        script:
          - echo "Building application..."
          - echo "Running tests..."
Step 3: Creating Your First Pipeline
Let’s create a practical example. We’ll build a pipeline for a Node.js application:
Example 1: Node.js Application
image: node:18

pipelines:
  default:
    - step:
        name: Install Dependencies
        caches:
          - node
        script:
          - npm install
    - step:
        name: Run Tests
        caches:
          - node
        script:
          # each step runs in a fresh container, so dependencies are reinstalled (the node cache keeps this fast)
          - npm install
          - npm test
    - step:
        name: Build Application
        caches:
          - node
        script:
          - npm install
          - npm run build
        artifacts:
          - dist/**
Key Concepts:
- image: Specifies the Docker image (Node.js 18 in this case)
- caches: Caches dependencies to speed up builds
- script: Commands to execute
- artifacts: Files to save for later steps
Example 2: Python Application
image: python:3.11

pipelines:
  default:
    - step:
        name: Setup and Test
        caches:
          - pip
        script:
          - pip install -r requirements.txt
          - pytest
    - step:
        name: Build Package
        script:
          - python setup.py sdist bdist_wheel
        artifacts:
          - dist/**
Example 3: Docker Application
image: docker:latest

pipelines:
  default:
    - step:
        name: Build Docker Image
        services:
          - docker
        script:
          - docker build -t myapp:$BITBUCKET_COMMIT .
          - docker tag myapp:$BITBUCKET_COMMIT myapp:latest
Step 4: Branch-Specific Pipelines
You can configure different pipelines for different branches. This is useful for separating development, staging, and production deployments:
image: node:18

pipelines:
  default:
    - step:
        name: Build and Test
        script:
          - npm install
          - npm test
          - npm run build
  branches:
    develop:
      - step:
          name: Deploy to Staging
          script:
            - npm install
            - npm run build
            - ./deploy-staging.sh
    master:
      - step:
          name: Build
          script:
            - npm install
            - npm test
            - npm run build
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - ./deploy-production.sh
          after-script:
            - echo "Deployment completed"
Step 5: Advanced Configuration
Using Environment Variables
Store sensitive information like API keys and passwords in Bitbucket’s repository settings:
- Go to Repository Settings → Pipelines → Repository variables
- Add your variables (mark them as Secured so they're encrypted and masked in logs)
- Use them in your pipeline:
image: node:18

pipelines:
  default:
    - step:
        script:
          - echo $API_KEY
          - echo $DATABASE_URL
Parallel Steps
Run multiple steps in parallel to speed up your pipeline:
image: node:18

pipelines:
  default:
    - parallel:
        - step:
            name: Unit Tests
            script:
              # each parallel step runs in its own container, so each installs its own dependencies
              - npm install
              - npm test
        - step:
            name: Linting
            script:
              - npm install
              - npm run lint
        - step:
            name: Type Checking
            script:
              - npm install
              - npm run type-check
Conditional Steps
Run a step only when certain conditions are met, for example when particular files have changed:
image: node:18

pipelines:
  default:
    - step:
        name: Build
        script:
          - npm install
          - npm run build
    - step:
        name: Deploy
        condition:
          changesets:
            includePaths:
              - src/**
        script:
          - ./deploy.sh
Step 6: Deployment Examples
Deploying to AWS S3
image: node:18

pipelines:
  branches:
    master:
      - step:
          name: Build and Deploy
          script:
            - npm install
            - npm run build
            - apt-get update && apt-get install -y awscli
            - aws s3 sync dist/ s3://my-bucket-name --delete
Deploying to Heroku
image: node:18

pipelines:
  branches:
    master:
      - step:
          name: Deploy to Heroku
          script:
            # HEROKU_API_KEY is stored as a secured repository variable
            - git push https://heroku:$HEROKU_API_KEY@git.heroku.com/my-app.git HEAD:master
Deploying to AWS ECS
image: docker:latest

pipelines:
  branches:
    master:
      - step:
          name: Build and Push to ECR
          services:
            - docker
          script:
            - docker build -t $ECR_REPO:$BITBUCKET_COMMIT .
            - docker tag $ECR_REPO:$BITBUCKET_COMMIT $ECR_REPO:latest
            # docker:latest is Alpine-based and ships without pip, so install the AWS CLI via apk
            - apk add --no-cache aws-cli
            - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin $ECR_REPO
            - docker push $ECR_REPO:$BITBUCKET_COMMIT
            - docker push $ECR_REPO:latest
      - step:
          name: Deploy to ECS
          script:
            - apk add --no-cache aws-cli
            - aws ecs update-service --cluster my-cluster --service my-service --force-new-deployment --region us-east-1
Deploying via SSH
image: atlassian/default-image:latest

pipelines:
  branches:
    master:
      - step:
          name: Deploy via SSH
          script:
            - apt-get update && apt-get install -y openssh-client
            - mkdir -p ~/.ssh
            - echo "$SSH_PRIVATE_KEY" > ~/.ssh/id_rsa
            - chmod 600 ~/.ssh/id_rsa
            - ssh-keyscan -H $SERVER_HOST >> ~/.ssh/known_hosts
            - ssh $SERVER_USER@$SERVER_HOST "cd /var/www/app && git pull && npm install && pm2 restart app"
Step 7: Best Practices
1. Use Caching
Cache dependencies to speed up builds:
- step:
    caches:
      - node
      - pip
    script:
      - npm install
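The node and pip caches used above are predefined by Pipelines. If your project relies on a tool without a predefined cache, you can declare your own under definitions. The sketch below assumes a Yarn project and Yarn's default cache directory; run yarn cache dir in your build image to confirm the actual path.

definitions:
  caches:
    yarn: ~/.cache/yarn   # custom cache name mapped to the directory to persist (assumed path)

pipelines:
  default:
    - step:
        caches:
          - yarn
        script:
          - yarn install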
2. Use Artifacts
Pass files between steps:
- step:
    name: Build
    script:
      - npm run build
    artifacts:
      - dist/**
- step:
    name: Deploy
    script:
      - ls -la dist/
3. Set Timeouts
Cap how long a step may run (max-time is measured in minutes):
- step:
    max-time: 10
    script:
      - npm test
4. Use Specific Docker Images
Use specific versions instead of latest:
image: node:18.17.0 # Instead of node:latest
5. Organize with Step Names
Use descriptive step names:
- step:
    name: Install Dependencies and Run Tests
    script:
      - npm install
      - npm test
6. Handle Failures Gracefully
- step:
    script:
      - npm test || echo "Tests failed but continuing..."
      - npm run build
Step 8: Monitoring and Debugging
Viewing Pipeline Logs
- Go to the Pipelines tab in your repository
- Click on a specific pipeline run
- Click on individual steps to see detailed logs
Common Issues and Solutions
Issue: Pipeline times out
- Solution: Increase the step's max-time or the pipeline-wide limit (see the sketch after this list), or optimize slow build steps
Issue: Dependencies not found
- Solution: Check cache configuration and ensure dependencies are installed
Issue: Permission denied
- Solution: Check file permissions and SSH key configuration
Issue: Environment variables not working
- Solution: Verify variables are set in repository settings and spelled correctly
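For the timeout issue above, besides the per-step max-time shown in Step 7, you can raise the limit for every step at once with a top-level options block. Here is a minimal sketch; the 30-minute value is arbitrary:

options:
  max-time: 30   # default limit in minutes applied to every step

pipelines:
  default:
    - step:
        max-time: 60   # a per-step max-time still takes precedence over the global value
        script:
          - npm test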
Step 9: Complete Example: Full-Stack Application
Here’s a complete example for a full-stack application with frontend and backend:
image: node:18

definitions:
  steps:
    - step: &build-frontend
        name: Build Frontend
        caches:
          - node
        script:
          - cd frontend
          - npm install
          - npm run build
        artifacts:
          - frontend/dist/**
    - step: &test-backend
        name: Test Backend
        caches:
          - node
        script:
          - cd backend
          - npm install
          - npm test
    - step: &deploy-staging
        name: Deploy to Staging
        deployment: staging
        script:
          - ./scripts/deploy-staging.sh
    - step: &deploy-production
        name: Deploy to Production
        deployment: production
        script:
          - ./scripts/deploy-production.sh

pipelines:
  default:
    - step: *build-frontend
    - step: *test-backend
  branches:
    develop:
      - step: *build-frontend
      - step: *test-backend
      - step: *deploy-staging
    master:
      - step: *build-frontend
      - step: *test-backend
      - step: *deploy-production
Step 10: Security Considerations
1. Never Commit Secrets
Always use repository variables for sensitive data:
# ❌ Bad
script:
  - echo "password123"

# ✅ Good
script:
  - echo $SECURE_PASSWORD
2. Use Deployment Environments
Configure deployment environments for better security:
- step:
    deployment: production
    script:
      - ./deploy.sh
3. Limit Access
Use branch permissions to control who can trigger deployments:
- Go to Repository Settings → Branch permissions
- Set restrictions on production branches
Troubleshooting Tips
- Test locally first: Run your build commands in the same Docker image on your own machine before committing pipeline changes
- Check logs carefully: Pipeline logs provide detailed error messages
- Start simple: Begin with a basic pipeline and add complexity gradually
- Use the pipeline editor: Bitbucket provides a visual editor for creating pipelines
- Check Docker image tags: Ensure you’re using valid Docker image tags
Next Steps
Now that you have a working CI/CD pipeline, consider:
- Adding notifications: Configure Slack or email notifications for pipeline results (a Slack example is sketched after this list)
- Code quality checks: Add linting, code coverage, and security scanning
- Multi-stage deployments: Implement blue-green or canary deployments
- Performance monitoring: Track build times and optimize slow steps
- Integration testing: Add end-to-end tests to your pipeline
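For notifications, one starting point is Atlassian's slack-notify pipe. The sketch below is only an illustration: the pipe version, the SLACK_WEBHOOK repository variable, and the message text are assumptions to adapt; check the pipe's listing for the current version and options.

- step:
    name: Notify Slack
    script:
      - pipe: atlassian/slack-notify:2.2.0   # version shown is illustrative
        variables:
          WEBHOOK_URL: $SLACK_WEBHOOK        # assumed secured repository variable holding your Slack webhook URL
          MESSAGE: 'Pipeline on $BITBUCKET_BRANCH finished'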
Conclusion
Bitbucket Pipelines provides a powerful and flexible way to implement CI/CD for your projects. By following this tutorial, you’ve learned how to:
- Enable and configure Bitbucket Pipelines
- Create pipelines for different types of applications
- Deploy to various platforms
- Implement best practices for security and performance
Remember, CI/CD is an iterative process. Start with a simple pipeline and gradually add more features as your needs grow. The key is to automate repetitive tasks and catch issues early in the development process.
Happy building! 🚀