Onix is Hiring GCP Data Warehousing Engineer @ Datametica

Job Description

We are seeking a highly skilled GCP Data Warehouse Engineer to join our data team. You will be responsible for designing, developing, and maintaining scalable and efficient data warehouse solutions on Google Cloud Platform (GCP). Your work will support analytics, reporting, and data science initiatives across the company.

Key Responsibilities:

  • Design, build, and maintain data warehouse solutions using BigQuery.
  • Develop robust and scalable ETL/ELT pipelines using Dataflow, Cloud Composer, or Cloud Functions.
  • Implement data modeling strategies (star schema, snowflake, etc.) to support reporting and analytics.
  • Ensure data quality, integrity, and security across all pipelines and storage.
  • Optimize BigQuery queries for performance and cost-efficiency (partitioning, clustering, materialized views); a sketch of these techniques follows this list.
  • Collaborate with data scientists, analysts, and other engineers to deliver high-quality datasets and insights.
  • Monitor pipeline performance and troubleshoot issues using Cloud Monitoring, Logging, and alerting tools.
  • Automate deployment and infrastructure using Terraform, Cloud Build, and CI/CD pipelines.
  • Stay up to date with GCP's evolving services and suggest improvements to our data infrastructure.
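
For illustration only, the sketch below shows one way the BigQuery optimization techniques named above (partitioning, clustering, materialized views) can look in practice, using the google-cloud-bigquery Python client. The project id (my-project), dataset (analytics), and table/view names are hypothetical placeholders, not details from this role.

    # A minimal sketch (not this team's actual setup): create a partitioned,
    # clustered events table and a materialized view for a common rollup.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project id

    # Partitioning by event date and clustering on frequent filter columns lets
    # BigQuery prune data at query time, reducing both latency and cost.
    table_ddl = """
    CREATE TABLE IF NOT EXISTS analytics.events (
        event_ts   TIMESTAMP,
        user_id    STRING,
        event_name STRING
    )
    PARTITION BY DATE(event_ts)
    CLUSTER BY user_id, event_name
    """

    # A materialized view pre-computes a frequent aggregate so dashboards do
    # not rescan the base table on every refresh.
    mview_ddl = """
    CREATE MATERIALIZED VIEW IF NOT EXISTS analytics.daily_event_counts AS
    SELECT DATE(event_ts) AS event_date, event_name, COUNT(*) AS events
    FROM analytics.events
    GROUP BY event_date, event_name
    """

    for ddl in (table_ddl, mview_ddl):
        client.query(ddl).result()  # submit each DDL statement and wait

Queries that filter on the partitioning date and the clustered columns then scan only the relevant slices of the table, which is where most of the cost savings come from.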

Required Skills & Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience).
  • 3+ years of experience in data engineering or data warehousing roles.
  • Hands-on experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
  • Proficiency in SQL and Python (or Java/Scala).
  • Strong understanding of data modeling, data warehousing concepts, and distributed systems.
  • Experience with Cloud Composer (Airflow), version control (Git), and agile development; a minimal DAG sketch follows this list.
  • Familiarity with IAM, VPC Service Controls, and other GCP security best practices.
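
As a rough sketch of the Cloud Composer (Airflow) experience listed above, the example below defines a single-task DAG that runs a daily BigQuery MERGE as an ELT step. The DAG id, the SQL, and the my-project/analytics/staging names are assumptions made purely for the example.

    # A minimal Cloud Composer (Airflow) DAG sketch: one daily BigQuery MERGE
    # that upserts staged rows into a warehouse dimension table. All names are
    # illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    MERGE_SQL = """
    MERGE `my-project.analytics.dim_customers` AS t
    USING `my-project.staging.customers` AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET t.email = s.email
    WHEN NOT MATCHED THEN
      INSERT (customer_id, email) VALUES (s.customer_id, s.email)
    """

    with DAG(
        dag_id="daily_customer_elt",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",  # Composer triggers the MERGE once per day
        catchup=False,
    ) as dag:
        merge_customers = BigQueryInsertJobOperator(
            task_id="merge_customers",
            configuration={"query": {"query": MERGE_SQL, "useLegacySql": False}},
        )

In Cloud Composer a file like this is typically dropped into the environment's dags/ bucket; the operator submits the job to BigQuery, so the ELT logic itself stays in SQL.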

Preferred Qualifications:

  • Google Cloud Professional Data Engineer certification.
  • Experience with Looker, Dataform, or similar BI/data modeling tools.
  • Experience working with real-time data pipelines or streaming data; a streaming pipeline sketch follows this list.
  • Knowledge of DevOps practices and infrastructure-as-code.
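
As a hedged illustration of the streaming-pipeline experience mentioned above, here is a minimal Apache Beam pipeline (runnable on Dataflow) that reads JSON messages from a Pub/Sub subscription and appends them to a BigQuery table. The subscription path and table name are placeholders, not details from this posting.

    # A minimal Apache Beam streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_event(message: bytes) -> dict:
        # Pub/Sub delivers raw bytes; decode into a dict whose keys match the
        # destination table's columns.
        return json.loads(message.decode("utf-8"))

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteEvents" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

The same code runs on Dataflow by passing the DataflowRunner along with the usual project, region, and temp location options.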

Why Join Us?

  • Work on cutting-edge cloud data architecture at scale.
  • Join a collaborative and fast-paced engineering culture.
  • Competitive salary, flexible work options, and career growth opportunities.
  • Access to learning resources, GCP credits, and certifications.

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: Datametica
Location(s): Hyderabad



Key Skills: GCP, Data Warehousing, SQL, DW, BigQuery, Dataflow, Dataproc

 This posting appears to be old and may have expired.

₹ Not Disclosed

Similar positions

Hiring For AXIOM developer resources in Mumbai

  • Clover Infotech
  • 4 - 7 years
  • Mumbai
  • 4 days ago
₹ 5-15 Lacs P.A.

Principal Applied AI Engineer

  • Zycus Infotech
  • 6 - 11 years
  • Pune
  • 4 days ago
₹ Not Disclosed

Python + DevOps Engineer

  • TekPillar
  • 4 - 8 years
  • Pune
  • 4 days ago
₹ -15 Lacs P.A.

Data Engineer

  • Tata Consultancy
  • 5 - 10 years
  • Bengaluru
  • 4 days ago
₹ Not Disclosed

Datametica

About Onix: Onix is a Google Cloud Premier Partner serving over 1,400 customers, including several of the world's largest corporations, enabling them to effectively leverage Google Cloud Platform across industries and use cases. Onix specializes in Google Cloud Solutions such as Workload Migration...