About the Role:
We are looking for a skilled and passionate GCP Data Engineer to design, develop, and optimize high-performance, scalable data pipelines. If you love working with Big Data, Python, and cutting-edge GCP tools, this role is for you!
What You'll Do:
Design, develop & maintain scalable ETL pipelines in GCP
Work with large datasets using Spark & optimize transformations
Develop automation & scripts in Python
Write & optimize complex SQL queries
Implement workflows in Apache Airflow
Leverage BigQuery for data warehousing & Dataproc for distributed processing
Collaborate with data analysts & data scientists to meet their data needs
Ensure data quality, integrity & security
What We're Looking For:
5+ years in Data Engineering
3+ years hands-on GCP experience
Strong Python & Spark skills
Solid SQL expertise
Experience with BigQuery, Dataproc, Airflow
Strong knowledge of data modeling & ETL/ELT best practices

Key Skills: Data Engineering, GCP, BigQuery, Python, SQL, Airflow, PySpark, Dataflow, Dataproc, Google Cloud Platform
About Adastra:
Adastra is a global leader in data and analytics consulting, empowering organizations to become digital frontrunners since 2000. Our expertise spans Artificial Intelligence, Cloud, Digital Transformation, and Data Governance, helping businesses make smarter, data-driven decisions.