Job Description:
Role: GCP Data Engineer
Experience: 5 to 10 years
Location: Remote
Overview:
We're hiring a Senior GCP Data Engineer to lead the design, build, and optimization of cloud-native data pipelines for enterprise data warehouse modernization. You'll work on end-to-end implementations including orchestration, CI/CD, real-time processing, and cost optimization.

Key Responsibilities:
- Architect and develop pipelines using GCP-native services.
- Lead migration from on-prem systems to BigQuery.
- Drive query performance and cost optimization via partitioning and clustering (see the first sketch below).
- Design modular Composer DAGs (see the second sketch below).
- Mentor engineers and ensure code quality.
- Contribute to validation frameworks and monitoring.

Required Skills:
- 5+ years in data engineering; 3+ years in GCP.
- Expert in Python, SQL, BigQuery, Dataflow, and Composer.
- Familiar with Terraform, Cloud Build, and Git.

Preferred Qualifications:
- GCP Professional Data Engineer certification.
- Bachelor's or Master's degree in a related field.
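For illustration of the partitioning and clustering work named above, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical and not part of this posting:

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and column names, for illustration only.
client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("region", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)

# Partition by day on the event timestamp so queries filtering on
# event_ts scan only the relevant partitions (lower cost, faster scans).
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)

# Cluster within each partition on the most common filter columns,
# so blocks are co-located and pruned on customer_id/region predicates.
table.clustering_fields = ["customer_id", "region"]

client.create_table(table)
```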
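And a minimal sketch of a modular Cloud Composer (Airflow) DAG, assuming Airflow 2.x with the Google provider installed; the DAG id, task names, tables, and SQL are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical daily load DAG; all names are illustrative only.
with DAG(
    dag_id="daily_events_load",
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load raw events into a staging table, replacing yesterday's copy.
    stage = BigQueryInsertJobOperator(
        task_id="stage_events",
        configuration={
            "query": {
                "query": "SELECT * FROM `my-project.raw.events`",
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "staging",
                    "tableId": "events",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    # Simple validation gate: fail the DAG run if staging came back empty.
    validate = BigQueryInsertJobOperator(
        task_id="validate_events",
        configuration={
            "query": {
                "query": (
                    "ASSERT (SELECT COUNT(*) "
                    "FROM `my-project.staging.events`) > 0 "
                    "AS 'staging must not be empty'"
                ),
                "useLegacySql": False,
            }
        },
    )

    stage >> validate  # run validation only after the staging load succeeds
```

Keeping each step as its own operator, as above, is what makes a DAG modular: tasks can be retried, monitored, and reordered independently.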
Apply Now: sh***************y@ce***************g.com

Keyskills: BigQuery, GCP Data Engineer, CI/CD, SQL, Python, GCP Dataflow
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse o...