We are looking for an experienced GCP Data Engineer with over 5 years of hands-on experience. The ideal candidate will work with Google Cloud Platform (GCP), BigQuery, Python, DBT, Terraform, and Git.
Design, develop, and maintain scalable data pipelines on GCP.
Implement and optimize data storage solutions with BigQuery for large-scale processing.
Develop, test, and deploy data transformation workflows using Python.
Collaborate with data scientists, analysts, and stakeholders to meet data requirements.
Ensure data quality and integrity throughout the data lifecycle.
Implement CI/CD pipelines for data applications and manage infrastructure using Terraform.
Utilize DBT for data modelling, testing, and documentation.
Use Git for version control and code collaboration.
Monitor and troubleshoot data pipelines to ensure reliability and performance.
Stay updated with industry trends and best practices in data engineering and GCP services.
Day-to-day responsibilities
As part of an Agile product team, day-to-day you will:
Take part in our daily stand-ups.
Contribute to ceremonies like steering, story writing, collaborative design and retrospectives.
Develop new features and improve code quality by pair programming with other team members.
Take part in the support and monitoring of our services.
Interact with various stakeholders as required to deliver quality products.
About You
Over 5 years of experience in data/software engineering on a cloud platform (AWS/GCP/Azure) using tools such as DBT and programming languages such as Python, Scala or Java.
You have strong SQL and data problem-solving skills.
Experience with data modelling and transformation tools like DBT.
You have a solid understanding of modern data engineering practices.
You factor in non-functional aspects of data pipeline development, including quality checks, cost-effectiveness, sensitive data handling, usage monitoring, and observability of data pipelines and data quality.
You promote working in a cross-functional, collaborative team where there is collective code ownership.
You understand how your team's work can impact interdependent teams, and you design accordingly.
You are comfortable making large-scale refactorings of a codebase.
You can facilitate and guide technical discussions to a workable outcome.
You enjoy mentoring team members and act as a role model on the team.
You understand distributed systems concepts and are familiar with the pros and cons of common data architectures, including data meshes.
Good to Have:
Expertise in GCP & BigQuery and large-scale data processing.
Strong Python programming skills.
Familiarity with infrastructure-as-code tools like Terraform.
Proficiency with Git for version control.
Strong problem-solving skills and attention to detail.
Excellent communication and teamwork abilities.
GCP Certification (e.g., Professional Data Engineer).
Experience with other GCP services (e.g., Cloud Storage, Cloud Composer, Dataflow).
Knowledge of data governance and security best practices.
FLEXI Careers India, an AVTAR enterprise, works extensively in the area of Diversity & Inclusion advocacy. With a keen focus on gender diversity to create sustainable careers for women, FLEXI assists discerning organizations in making their workplaces more welcoming for women.