Hiring For Senior Kafka Developer - Remote - Contract @ Vaco

Job Description

Kafka Developer (7+ Years Experience)

Position Overview


We are seeking a highly skilled Kafka Developer with 7+ years of experience in designing, developing, and deploying real-time data streaming solutions. The ideal candidate will have strong expertise in Apache Kafka, distributed systems, and event-driven architecture, along with proficiency in Java/Scala/Python.


Key Responsibilities


• Design, develop, and optimize Kafka-based real-time data pipelines and event-driven solutions.
• Implement and maintain Kafka producers, consumers, and stream processing applications (an illustrative sketch follows this list).
• Build and configure Kafka Connect connectors, Schema Registry, and KSQL/Kafka Streams for data integration.
• Manage Kafka clusters, both on-premises and in the cloud (AWS MSK, Confluent Cloud, Azure Event Hubs).
• Ensure high availability, scalability, and reliability of the Kafka infrastructure.
• Troubleshoot and resolve issues related to Kafka performance, lag, replication, offsets, and throughput.
• Implement security best practices (SSL/TLS, SASL, Kerberos, RBAC).
• Collaborate with cross-functional teams (data engineers, architects, DevOps, business stakeholders).
• Write unit and integration tests; ensure code quality and performance tuning.
• Document solutions and best practices, and share knowledge within the team.
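
By way of illustration, a minimal Java producer sketch of the kind this role involves, assuming a local broker at localhost:9092 and a hypothetical "orders" topic:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            props.put("acks", "all");                // wait for the full ISR to acknowledge
            props.put("enable.idempotence", "true"); // guard against duplicates on retry

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // "orders" and the key/value below are hypothetical, for illustration only
                producer.send(new ProducerRecord<>("orders", "order-123", "{\"status\":\"created\"}"));
                producer.flush();
            }
        }
    }

The acks=all and idempotence settings reflect the reliability concerns called out above (replication, offsets, and throughput).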


Required Skills & Experience


• 7+ years of software development experience, with at least 4 years in Kafka development.
• Strong hands-on experience with the Apache Kafka APIs (Producer, Consumer, Streams, Connect); see the Streams sketch after this list.
• Proficiency in Java or Scala (Python/Go is a plus).
• Solid understanding of event-driven and microservices architecture.
• Experience with serialization formats (Avro, Protobuf, JSON).
• Strong knowledge of distributed systems concepts (partitioning, replication, consensus).
• Experience with the Confluent Platform and its ecosystem tools (Schema Registry, REST Proxy, Control Center).
• Exposure to cloud-based Kafka (AWS MSK, Confluent Cloud, Azure Event Hubs, GCP Pub/Sub).
• Hands-on experience with CI/CD, Docker, Kubernetes, and monitoring tools (Prometheus, Grafana, Splunk).
• Strong problem-solving skills with the ability to troubleshoot latency, lag, and cluster performance issues.
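
As a sketch of the Kafka Streams API mentioned above, a minimal counting topology, again assuming a local broker and hypothetical "orders" / "order-counts" topics:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Produced;

    public class OrderCountApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-count-app");   // illustrative application id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // count events per key from the hypothetical "orders" topic into "order-counts"
            KStream<String, String> orders = builder.stream("orders");
            orders.groupByKey()
                  .count()
                  .toStream()
                  .to("order-counts", Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }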

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Software Development - Other
Employment Type: Contract

Contact Details:

Company: Vaco
Location(s): Noida, Gurugram



Keyskills: Apache ZooKeeper, Kafka, Kafka Cluster, Kafka Streams, Kafka Architecture


Salary: ₹ 20-35 Lacs P.A.


Vaco

We are taking deliberate action to nurture an inclusive culture that is grounded in our company's purpose, to refresh the world and make a difference. We act with a growth mindset, take an expansive approach to what's possible, and believe in continuous learning to improve our business and ourselves...