
GCP + BigQuery + Python / PySpark + ETL Professional @ Virtusa


Job Description


Responsibilities:
  • Design and develop robust ETL pipelines using Python, PySpark, and GCP services.
  • Build and optimize data models and queries in BigQuery for analytics and reporting.
  • Ingest, transform, and load structured and semi-structured data from various sources.
  • Collaborate with data analysts, scientists, and business teams to understand data requirements.
  • Ensure data quality, integrity, and security across cloud-based data platforms.
  • Monitor and troubleshoot data workflows and performance issues.
  • Automate data validation and transformation processes using scripting and orchestration tools.

Required Skills & Qualifications:
  • Hands-on experience with Google Cloud Platform (GCP), especially BigQuery.
  • Strong programming skills in Python and/or PySpark.
  • Experience in designing and implementing ETL workflows and data pipelines.
  • Proficiency in SQL and data modeling for analytics.
  • Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer.
  • Understanding of data governance, security, and compliance in cloud environments.
  • Experience with version control (Git) and agile development practices.
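As a rough illustration of the "automate data validation and transformation" responsibility above, the sketch below validates newline-delimited JSON records before they would be loaded into a warehouse table such as BigQuery. The field names and rules (`id`, `event_ts`, `amount`) are hypothetical placeholders, not part of the actual role; a real pipeline would typically run such checks inside PySpark or an orchestrator like Composer.

```python
import json

# Hypothetical schema for illustration only.
REQUIRED_FIELDS = {"id", "event_ts", "amount"}


def validate(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is valid."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
    if "amount" in record:
        try:
            float(record["amount"])
        except (TypeError, ValueError):
            errors.append("amount is not numeric")
    return errors


def transform(record: dict) -> dict:
    """Normalize a valid record for loading (here: cast amount to float)."""
    out = dict(record)
    out["amount"] = float(out["amount"])
    return out


def run_pipeline(raw_lines):
    """Split newline-delimited JSON into loadable rows and rejected rows."""
    good, bad = [], []
    for line in raw_lines:
        record = json.loads(line)
        errors = validate(record)
        if errors:
            bad.append({"record": record, "errors": errors})
        else:
            good.append(transform(record))
    return good, bad
```

Routing rejects to a side output rather than failing the whole batch is a common choice in ETL pipelines, since it preserves data quality reporting without blocking the load.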

Job Classification

Industry: Banking
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Platform Engineer
Employment Type: Full time

Contact Details:

Company: Virtusa
Location(s): Hyderabad



Keyskills: python, pyspark, sql, data modeling, bigquery, data validation, version control, microsoft azure, cloud platform, microservices, spring boot, etl pipelines, git, java, gcp, spark, cloud storage, data governance, agile, etl, aws, process transformation

This job posting appears to be aged and may have expired.

₹ Not Disclosed

Similar positions

Amazon Connect - Developer / Senior Developer

  • Cognizant
  • 8 - 12 years
  • Chennai
  • 4 days ago
₹ Not Disclosed

OpenStack SE Ops Professional

  • Capgemini
  • 3 - 7 years
  • Noida, Gurugram
  • 4 days ago
₹ Not Disclosed

MREF (TRIRIGA) Professional

  • Capgemini
  • 3 - 6 years
  • Bengaluru
  • 4 days ago
₹ Not Disclosed

Power Platform Professional

  • Capgemini
  • 6 - 10 years
  • Hyderabad
  • 8 days ago
₹ Not Disclosed
