
Senior Software Engineer @ Optum

 Senior Software Engineer

Job Description

Primary Responsibilities:

  • Manage the hardware and software for Kafka and its ecosystem components
  • Work with application teams to gather future requirements, plan the growth of the infrastructure, and expand capacity as needed
  • Implement Disaster Recovery (DR) for Kafka
  • Build automation capabilities using tools such as Terraform, Ansible, Git, and Jenkins
  • Implement Kafka security (Kerberos, ACLs, SSL, SASL, SCRAM, etc.)
  • Research and implement new capabilities for Enterprise Messaging Services
  • Design, build, assemble, and configure application or technical architecture components using business requirements
  • Establish best practice standards for configuring Source and Sink connectors
  • Demonstrate a product mindset with the ability to set forward-thinking direction
  • Be able to synthesize large amounts of complex data into meaningful conclusions and present recommendations
  • Be able to maintain a positive attitude while working under high demands and short deadlines that may require working after hours
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
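The ACL work mentioned above (consumer/producer access management) is typically driven through Kafka's stock `kafka-acls` CLI. As a minimal sketch of that task, the helper below assembles the command that grants a consumer read access to a topic; the broker address, principal, topic, and group names are illustrative placeholders, not values from this posting.

```python
# Sketch: assemble a kafka-acls command granting a consumer read access.
# All names (broker, principal, topic, group) are illustrative placeholders.

def build_consumer_acl_cmd(bootstrap, principal, topic, group):
    """Return the argv list for kafka-acls.sh that allows `principal`
    to read `topic` as a member of consumer group `group`."""
    return [
        "kafka-acls.sh",
        "--bootstrap-server", bootstrap,
        "--add",
        "--allow-principal", f"User:{principal}",
        "--operation", "Read",
        "--topic", topic,
        "--group", group,
    ]

cmd = build_consumer_acl_cmd("broker1:9092", "analytics-app",
                             "orders", "orders-readers")
print(" ".join(cmd))
```

In practice the same binding is often managed declaratively (e.g. via Terraform providers) rather than by invoking the CLI directly, but the flags above map one-to-one onto the declarative fields.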

Required Qualifications:

  • Undergraduate degree or equivalent experience
  • Hands-on experience with Kafka clusters hosted in GCP and on an on-prem Kubernetes (K8s) platform
  • Hands-on experience with AI technologies
  • Hands-on experience in standing up and administering a Kafka platform from scratch, including backup and mirroring of Kafka cluster brokers, broker sizing, topic sizing, hardware sizing, performance monitoring, broker security, topic security, and consumer/producer access management (ACLs)
  • Experience with Kafka clusters, Kubernetes, and Terraform or Helm charts
  • Experience with Linux (RHEL) /Unix
  • Experience in building Kafka pipelines using Terraform, Ansible, CloudFormation templates, shell scripts, etc.
  • Experience in implementing security and permission-based authorization on Kafka clusters
  • Experience as a system administrator setting up the Kafka platform, including provisioning, access lists, and Kerberos and SSL configurations
  • Experience in setting standards to automate deployments using Kubernetes, Docker, and Jenkins
  • Experience with open-source and Confluent Kafka: ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center
  • Experience with Kafka MirrorMaker or Confluent Replicator
  • Experience in high-availability cluster setup and maintenance
  • Experience in setting up Prometheus, Grafana, or ELK monitoring tools
  • Knowledge of Kafka API (development experience)
  • Knowledge of best practices related to security, performance, and disaster recovery
  • Thorough understanding of Kafka producer/consumer/topic technologies and the ability to drive their implementation
  • Understanding of or experience with programming languages such as Python
  • Understanding of upstream and downstream data implications
  • Proven ability to concentrate on a wide range of loosely defined complex situations, which require creativity and originality, where guidance and counsel may be unavailable
  • Proven excellent communications and interpersonal skills
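The broker, topic, and hardware sizing called out above usually starts as back-of-envelope arithmetic before any benchmarking. As a minimal sketch, the helpers below estimate a partition count from target throughput and a cluster disk budget from retention and replication; all throughput and retention figures in the example are illustrative assumptions, not benchmarks from this posting.

```python
import math

# Sketch: back-of-envelope Kafka topic and disk sizing.
# All numbers below are illustrative assumptions, not measured benchmarks.

def estimate_partitions(target_mb_s, producer_mb_s_per_partition,
                        consumer_mb_s_per_partition):
    """Partitions needed so both producers and consumers keep up
    with the target aggregate throughput."""
    return max(
        math.ceil(target_mb_s / producer_mb_s_per_partition),
        math.ceil(target_mb_s / consumer_mb_s_per_partition),
    )

def estimate_disk_gb(target_mb_s, retention_hours, replication_factor):
    """Total cluster disk (GB) to hold `retention_hours` of data
    at the target ingest rate, across all replicas."""
    gb_per_hour = target_mb_s * 3600 / 1024
    return gb_per_hour * retention_hours * replication_factor

# Example: 100 MB/s ingest, 10 MB/s per producer partition,
# 20 MB/s per consumer partition, 24 h retention, RF = 3.
partitions = estimate_partitions(100, 10, 20)
disk_gb = estimate_disk_gb(100, 24, 3)
```

Real sizing also has to account for headroom for broker failure, compaction overhead, and consumer lag; estimates like these are a starting point, not a capacity plan.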

Preferred Qualifications:

  • Experience in setting up Prometheus, Grafana, or ELK monitoring tools
  • Experience as Linux (RHEL) /Unix administrator
  • Experience in PostgreSQL, SQL Server, NoSQL (HBase), Oracle, and GCP/Azure Cloud
  • Experience with any RDBMS, NoSQL technologies, and GCP/Azure Cloud
  • Understanding of or experience with programming languages such as Python

Job Classification

Industry: Retail
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Platform Engineer
Employment Type: Full time

Contact Details:

Company: Optum
Location(s): Noida, Gurugram



Keyskills:   software engineer kubernetes rest python confluence oracle ai sql server sql docker ansible build automation git automation postgresql gcp grafana kafka linux jenkins terraform programming azure


₹ Not Disclosed
