Job Role: DevOps Engineer (SonarQube)
Client: Domestic (Delhi based client)
Experience required: A minimum of 2-3 years of experience in Linux administration.
Project duration: 1 year. Renewal: not known.
Client Onsite: Noida; Okhla Phase 2, New Delhi.
Employment Type: Full-time / payroll (project-based clause) / contractual (1 year). Notice period: 45 days.
Job Overview
We are looking for a DevOps Engineer with 2-3 years of experience in building, managing, and automating DevOps pipelines and deployments on self-managed infrastructure.
This role demands hands-on experience with the following stack:
- CloudBees Jenkins
- SonarQube
- GitHub Enterprise
The candidate should be comfortable working in Linux environments, automating tasks with scripts, and configuring the DevOps ecosystem at an infrastructure and pipeline level.
Key Responsibilities & Expected Configuration Knowledge
SonarQube:
- Configure SonarQube for Java/Maven (or .NET) projects
- Generate and analyze reports on code smells, vulnerabilities, bugs
- Enforce quality gates in Jenkins using the SonarScanner CLI or plugin (see the pipeline sketch after this list)
- Set up project-level and global rulesets
- Manage access control and authentication
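For illustration, here is a minimal declarative-pipeline sketch of quality gate enforcement. It assumes the SonarQube Scanner plugin is installed and a server entry named 'sonar-server' (an illustrative name) is configured in Jenkins, with a webhook from SonarQube back to Jenkins:

    // Minimal sketch: run analysis, then fail the build if the quality gate is red.
    // 'sonar-server' is a placeholder for the server name configured in Manage Jenkins.
    pipeline {
        agent any
        stages {
            stage('Analysis') {
                steps {
                    withSonarQubeEnv('sonar-server') {
                        sh 'mvn clean verify sonar:sonar'  // Maven project; a .NET project would use the SonarScanner for MSBuild instead
                    }
                }
            }
            stage('Quality Gate') {
                steps {
                    timeout(time: 10, unit: 'MINUTES') {
                        // Waits for SonarQube's webhook callback with the gate result
                        waitForQualityGate abortPipeline: true
                    }
                }
            }
        }
    }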
Jenkins / CloudBees Jenkins:
- Design and write Jenkinsfile for pipeline-as-code (declarative or scripted)
- Create multi-branch pipelines
- Configure build triggers (SCM/webhook/cron), post-build actions, and shared libraries (an example pipeline follows this list)
- Install and configure Jenkins plugins (e.g., Git, SonarQube Scanner, Artifactory)
- Set up Jenkins agents (static or dynamic)
- Store build artifacts and test results
- Monitor and troubleshoot builds via console output and logs
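As a sketch of the trigger, artifact, and test-reporting expectations above (the agent label, polling schedule, and paths are illustrative placeholders):

    // Illustrative declarative pipeline: SCM trigger, test reporting, artifact archiving.
    pipeline {
        agent { label 'linux' }        // placeholder agent label
        triggers {
            pollSCM('H/15 * * * *')    // poll every ~15 min; a GitHub webhook is preferable where allowed
        }
        stages {
            stage('Build') {
                steps {
                    sh 'mvn -B clean package'
                }
            }
        }
        post {
            always {
                junit 'target/surefire-reports/*.xml'   // publish test results
                archiveArtifacts artifacts: 'target/*.jar', allowEmptyArchive: true
            }
        }
    }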
GitHub Enterprise:
- Manage repositories, create branches, handle pull requests
- Configure branch protection rules and merge checks
- Implement webhook triggers to integrate with Jenkins (a sample API call follows this list)
- Resolve merge conflicts and apply GitFlow or trunk-based workflows
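One way to wire up the Jenkins webhook is via the GitHub Enterprise REST API. In this sketch, github.example.com, ORG/REPO, jenkins.example.com, and $GH_TOKEN are all placeholders to substitute:

    # Sketch: register a push/PR webhook pointing at the Jenkins GitHub plugin endpoint.
    curl -X POST \
      -H "Authorization: token $GH_TOKEN" \
      -H "Accept: application/vnd.github+json" \
      "https://github.example.com/api/v3/repos/ORG/REPO/hooks" \
      -d '{
            "name": "web",
            "active": true,
            "events": ["push", "pull_request"],
            "config": { "url": "https://jenkins.example.com/github-webhook/", "content_type": "json" }
          }'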
Linux & Scripting:
- Navigate and manage Linux file systems
- Write Bash, Python, or PowerShell scripts for automation
- Configure log rotation and cleanup for Jenkins, SonarQube, Artifactory (a sample logrotate config follows this list)
- Set up reverse proxies (Nginx/Apache) if needed
- Review and troubleshoot logs in /var/log, /opt/jenkins, or containers
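For the log rotation item above, a minimal logrotate stanza; the path and retention are illustrative and should be adjusted to where each service actually writes its logs:

    # Sketch: /etc/logrotate.d/jenkins -- rotate the Jenkins log weekly, keep 8 archives.
    /var/log/jenkins/jenkins.log {
        weekly
        rotate 8
        compress
        delaycompress
        missingok
        notifempty
        copytruncate    # avoids restarting Jenkins to reopen the log file
    }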
Tools & Technologies (Hands-on Expectation):
- CI/CD: Jenkins (CloudBees), GitHub Webhooks
- SCM: GitHub Enterprise
- Quality: SonarQube
- Scripting: Bash, Python, PowerShell
- OS: Linux (Ubuntu/CentOS), Windows (for .NET, if applicable)
Minimum Requirements:
- 2-3 years total experience
- 2+ years hands-on with any of the following: SonarQube, GitHub Enterprise
- Clear understanding of DevOps workflows, not just tool usage
- Must be able to explain what they have configured and automated in each tool
Preferred Skills (Nice to Have):
- Awareness of DevSecOps practices
- Experience with monitoring tools (Grafana, Prometheus, Nagios)
- Experience integrating .NET Core apps (IIS or Kestrel hosting)
Candidate Submission Instructions:
To apply, candidates must:
- Include a detailed CV that lists the DevOps tools they have used
- Clearly explain what configurations/implementations they carried out themselves during the project (not their team)
- Provide either an extended CV or a separate email/document that describes their tool-by-tool hands-on experience
Applications without actual hands-on configuration details will not be shortlisted.