GCP - Google BigQuery: strong experience in data engineering or analytics, deep SQL expertise, and hands-on experience with Google BigQuery in production environments.
Strong understanding of BigQuery architecture, partitioning, clustering, and performance tuning.
Experience with GCP data services such as Cloud Storage, Dataflow, Composer (Airflow), and Pub/Sub.
Proficiency in data modeling techniques (star/snowflake schema, denormalization, etc.).
Familiarity with scripting languages such as Python or Java for orchestration and transformation.
Experience with CI/CD tools and version control (e.g., Git, Cloud Build).
Solid understanding of data security and access control within GCP.
Key Responsibilities
Design, develop, and maintain scalable data pipelines using BigQuery and GCP-native tools.
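The partitioning and clustering skills listed in the requirements above can be illustrated with a short sketch. The dataset, table, and column names (`analytics.events`, `event_date`, `customer_id`) are hypothetical, not from this posting; this is one plausible DDL pattern, not the only one.

```python
def events_table_ddl(dataset: str = "analytics", table: str = "events") -> str:
    """Build DDL for a date-partitioned, clustered BigQuery table.

    Partitioning on event_date lets BigQuery prune whole partitions
    when queries filter on that column; clustering on customer_id
    co-locates related rows so selective filters scan fewer blocks.
    All identifiers here are illustrative assumptions.
    """
    return f"""
CREATE TABLE IF NOT EXISTS {dataset}.{table} (
  event_date  DATE NOT NULL,
  customer_id STRING,
  event_name  STRING,
  payload     JSON
)
PARTITION BY event_date
CLUSTER BY customer_id
OPTIONS (partition_expiration_days = 90);
""".strip()

print(events_table_ddl())
```

Partitioning by ingestion or event date plus clustering on a high-selectivity column is a common starting point for the performance tuning this role describes.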
Optimize complex SQL queries and BigQuery jobs for performance and cost efficiency.
Collaborate with business analysts, data scientists, and engineers to deliver actionable insights from large datasets.
Build and manage data warehouses and data marts using BigQuery.
Integrate BigQuery with other GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions.
Implement best practices for data modeling, data governance, and security within BigQuery.
Monitor and troubleshoot data workflows and optimize storage and query performance.
Participate in architecture discussions and contribute to the overall data platform strategy.
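The cost-efficiency responsibility above often comes down to limiting bytes scanned: BigQuery bills queries by data read, so filtering on the partition column and avoiding `SELECT *` directly reduces cost. A minimal sketch of a crude check for this pattern, assuming a hypothetical `analytics.events` table partitioned on `event_date` (this mimics the spirit of BigQuery's `require_partition_filter` table option, not its actual enforcement):

```python
def has_partition_filter(sql: str, partition_col: str = "event_date") -> bool:
    """Crude lint: does the query filter on the partition column?

    Queries that never mention the partition column in a WHERE clause
    will scan every partition of a partitioned table, which is the
    main driver of unnecessary BigQuery cost.
    """
    lowered = sql.lower()
    return "where" in lowered and partition_col.lower() in lowered

good = "SELECT customer_id FROM analytics.events WHERE event_date = '2024-01-01'"
bad = "SELECT * FROM analytics.events"

print(has_partition_filter(good), has_partition_filter(bad))  # True False
```

In practice, setting `require_partition_filter` on the table, or dry-running jobs to inspect estimated bytes processed before execution, gives the same protection server-side.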
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data Warehousing
Role: Data Warehouse Developer
Employment Type: Full time