Role & responsibilities
Develop and Maintain Kafka Solutions: Design, implement, and manage Kafka-based data pipelines to ensure efficient data flow and processing.
Optimize Performance: Monitor and optimize Kafka clusters for high throughput and low latency.
Integration: Integrate Kafka with various systems and tools, ensuring seamless data flow.
Troubleshooting: Identify and resolve issues related to Kafka and data processing.
Documentation: Create and maintain documentation for Kafka configurations and processes.
Security and Compliance: Ensure data security and compliance with industry standards.
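As a rough illustration of the performance-tuning work described above, a producer configuration biased toward throughput might look like the following sketch (broker addresses are placeholders and every value is illustrative, not a recommendation):

```properties
# Illustrative Kafka producer settings for a throughput-oriented pipeline
bootstrap.servers=broker1:9092,broker2:9092   # placeholder broker addresses
acks=all                       # durability: wait for all in-sync replicas
enable.idempotence=true        # avoid duplicate records on retry
compression.type=lz4           # trade a little CPU for smaller network payloads
linger.ms=20                   # let records batch for up to 20 ms before sending
batch.size=65536               # target batch size of 64 KiB per partition
```

Latency-sensitive pipelines would typically shrink or drop `linger.ms`; the right trade-off depends on the workload being monitored.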
Skills Required:
Technical Proficiency: Strong understanding of Apache Kafka architecture, components, and ecosystem tools such as Kafka Connect and Kafka Streams.
Programming Skills: Proficiency in Java, Scala, or Python.
Distributed Systems: Experience with distributed messaging systems and real-time data processing.
Microservices Architecture: Understanding of microservices and event-driven systems.
Data Serialization: Knowledge of data serialization formats like Avro, Protobuf, or JSON.
Cloud Platforms: Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their managed Kafka services.
CI/CD: Experience with CI/CD pipelines and version control tools like Git.
Monitoring Tools: Knowledge of monitoring and logging tools such as Prometheus and Grafana.
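Of the serialization formats listed above, JSON can be sketched with the Python standard library alone. The event fields below are made-up examples; with a real client library these two functions are the kind of thing passed as a value serializer/deserializer, since Kafka itself only transports bytes:

```python
import json

def serialize(event: dict) -> bytes:
    """Encode an event dict to UTF-8 JSON bytes, as a Kafka message value."""
    return json.dumps(event).encode("utf-8")

def deserialize(payload: bytes) -> dict:
    """Decode UTF-8 JSON bytes from a Kafka message value back to a dict."""
    return json.loads(payload.decode("utf-8"))

# Hypothetical order event; a round trip preserves the original structure.
event = {"order_id": 42, "status": "shipped"}
payload = serialize(event)
assert deserialize(payload) == event
```

Avro and Protobuf fill the same role but add an explicit schema, which is why they are usually preferred for contracts between producing and consuming teams.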
Preferred candidate profile
Key skills: Apache Storm, Kafka Cluster, Apache Kafka Streams, Apache ZooKeeper, Confluent Kafka, Brokers
Virtusa is a leading worldwide provider of information technology (IT) consulting and outsourcing services. We help accelerate business outcomes for Global 2000 businesses in banking and financial services, insurance, healthcare, telecommunications and media.