Description:
Role summary
We are seeking a seasoned GCP Architect with 8-12 years of experience to lead the design, strategy, and implementation of our enterprise data platforms on Google Cloud. You will be responsible for defining the technical vision and architectural roadmap for our data ecosystem, with a strong emphasis on BigQuery and scalable analytics solutions. You will provide technical leadership to data engineering teams and collaborate with senior stakeholders across analytics, product, and engineering to drive our data-driven decision-making capabilities.
________________________________________
Key responsibilities
Architect and design highly scalable, resilient, and secure data platforms on GCP, defining the end-to-end vision for batch and streaming data processing.
Lead the strategy and optimization of our BigQuery environment, including logical data modeling, performance tuning, cost management, and governance.
Define and enforce best practices for data modeling (star/snowflake), data warehousing, and the creation of curated data layers (raw → refined → curated).
Oversee the integration of diverse data sources, including Cloud Storage, Cloud SQL, APIs, SaaS platforms, and event streams like Pub/Sub.
Establish the strategy for data orchestration and scheduling using tools like Cloud Composer (Airflow), Dataform, and Workflows.
Champion and implement robust data quality frameworks, testing methodologies, and automated data validation processes to ensure data integrity.
Drive cost and performance optimization strategies across the data platform, focusing on BigQuery slot management, query efficiency, and optimal resource allocation.
Design and implement comprehensive security and data governance frameworks using IAM, service accounts, DLP, row/column-level security, and policy tags.
Architect and implement solutions for monitoring, observability, and alerting for all data pipelines and platforms using Cloud Logging/Monitoring.
Lead CI/CD and Git-based workflows for infrastructure-as-code (IaC) and pipeline development, ensuring comprehensive documentation and runbooks.
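The cost-optimization responsibility above rests on BigQuery's bytes-scanned billing model for on-demand queries. A minimal sketch of the arithmetic, assuming the $6.25/TiB on-demand list price (pricing varies by region and changes over time, so treat the rate as an illustrative assumption):

```python
def estimate_on_demand_cost(bytes_scanned: int, price_per_tib: float = 6.25) -> float:
    """Estimate BigQuery on-demand query cost in USD.

    price_per_tib is an assumed list price; check current GCP pricing
    for your region before relying on it.
    """
    TIB = 1024 ** 4  # BigQuery bills per tebibyte scanned
    return (bytes_scanned / TIB) * price_per_tib

# A query scanning 2 TiB at the assumed rate:
print(estimate_on_demand_cost(2 * 1024 ** 4))  # 12.5
```

Partitioning and clustering reduce `bytes_scanned` directly, which is why they dominate on-demand cost tuning; under slot reservations the levers shift toward slot utilization instead.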
________________________________________
Required qualifications
8-12 years of progressive experience in data engineering and architecture, with at least 5 years in a senior or architectural role on the Google Cloud Platform.
Deep, authoritative expertise in BigQuery, including advanced query optimization, dataset design for large-scale enterprises, and cost management.
Exceptional SQL skills and strong proficiency in Python (or Java/Scala) for defining data strategy and guiding pipeline development.
Proven experience architecting solutions with at least one major orchestration tool (e.g., Composer/Airflow, Dataform).
Extensive experience designing and implementing GCP storage and data ingestion patterns (Cloud Storage, BigQuery load jobs, etc.).
Strong understanding of modern data formats (Parquet/Avro/JSON/CSV) and managing schema evolution in a complex environment.
Demonstrated experience leading Git and CI/CD best practices for data and infrastructure code.
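The schema-evolution requirement above typically means handling additive changes safely while rejecting type conflicts, mirroring how BigQuery permits column additions but not incompatible type changes. A pure-Python sketch of that policy (the function and schema representation here are illustrative, not any library's API):

```python
def evolve_schema(current: dict, incoming: dict) -> dict:
    """Additively merge an incoming record's schema into the current one.

    Existing field types are authoritative; new fields are appended.
    Raises TypeError on a type conflict, analogous to BigQuery rejecting
    an incompatible schema update. Illustrative sketch only.
    """
    merged = dict(current)
    for name, ftype in incoming.items():
        if name in merged and merged[name] != ftype:
            raise TypeError(f"type conflict on {name!r}: {merged[name]} vs {ftype}")
        merged.setdefault(name, ftype)
    return merged

current = {"id": "INT64", "name": "STRING"}
incoming = {"id": "INT64", "email": "STRING"}
print(evolve_schema(current, incoming))
# {'id': 'INT64', 'name': 'STRING', 'email': 'STRING'}
```

In practice the same rule set would be enforced in the pipeline layer (e.g., before a load job), so that drift in upstream Parquet/Avro/JSON sources fails fast rather than corrupting curated tables.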
________________________________________
Preferred qualifications
Hands-on experience architecting streaming data pipelines using Pub/Sub and Dataflow (Apache Beam).
Strong familiarity with Dataproc/Spark, Kafka, or CDC tools (e.g., Datastream).
Expertise in data governance and cataloging tools (Dataplex, Data Catalog), with a focus on data lineage and metadata management.
Proficiency with Terraform/IaC and experience designing secure environments with VPC-SC and advanced network security patterns.
Google Professional Cloud Architect or Google Professional Data Engineer certification is highly desirable.
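The streaming qualification above centers on windowed aggregation, the core pattern Dataflow (Apache Beam) applies to Pub/Sub streams. A conceptual pure-Python sketch of tumbling-window counting (no Beam API is used; event shapes and names are illustrative):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs: int = 60) -> dict:
    """Group (timestamp_secs, key) events into fixed, non-overlapping windows.

    A toy illustration of the tumbling-window aggregation that
    Dataflow/Beam performs at scale, ignoring watermarks and late data.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "view"), (75, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

Real streaming architectures add the pieces this sketch omits, such as watermarks, allowed lateness, and triggers, which is where Beam experience becomes essential.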
________________________________________
Typical tech stack
BigQuery, Cloud Storage, Cloud Composer (Airflow) / Dataform, Dataflow, Pub/Sub, Cloud Logging/Monitoring, IAM, Terraform, Looker/Looker Studio.

Key skills: Cloud Architecture, Design Architecture, Architectural Design, Architecting