Terraform for Infrastructure as Code (good to have)
Cloud Functions and event-driven architecture
Roles & Responsibilities:
Develop and maintain scalable, cloud-native data processing pipelines on GCP.
Work extensively with BigQuery, Dataflow, Pub/Sub, and Cloud Storage, using Airflow for orchestration.
Automate infrastructure using Terraform and follow agile development practices.
Implement data solutions for enterprise-scale data lakes and data warehouses.
Write clean, efficient, and production-ready code using Python and SQL.
Handle data quality issues, including data duplication, and debug code failures.
Collaborate with cross-functional teams to build resilient and reliable data platforms.
Additional Screening Points:
Hands-on experience debugging and analyzing production issues
Exposure to identifying and resolving data duplication or consistency issues
Should have worked on end-to-end data pipeline creation and monitoring
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time