Job Description:
Role: GCP Data Engineer
Experience: 8+ Years
Location: Chennai, Bengaluru, Hyderabad, Noida
Notice Period: Immediate to 15 days (Serving notice period preferred)
Must Have Skills: SQL, Python, Google Cloud (GCP), BigQuery, Dataflow (Apache Beam), Cloud Composer (Apache Airflow), Cloud Data Fusion, DBT
Nice to Have: SQL Server/SSIS

Role Description:
* Building reusable data pipelines at scale, working with structured and unstructured data, performing feature engineering for machine learning, and curating data to provide real-time contextualised insights that power our customers' journeys.
* Using industry-leading toolkits, as well as evaluating exciting new technologies, to design and build scalable real-time data applications.
* Spanning the full data lifecycle with a mix of modern and traditional data platforms (Kafka, GCP, SQL Server), you'll build capabilities with horizon-expanding exposure to a host of wider technologies and careers in data.
* Helping adopt engineering best practices such as Test-Driven Development, code reviews, and Continuous Integration/Continuous Delivery for data pipelines.
* Mentoring other engineers to deliver high-quality, data-led solutions for our bank's customers.

Interview Questions Asked:
* Explain your project.
* Have you worked on any data modelling? Medallion data modelling?
* Normalised vs. denormalised databases
* BigQuery: partitioning and clustering
* Pub/Sub and Dataflow
* Cloud Data Fusion
* The DBT tool
* SSIS and SQL Server
* Cloud Composer (Airflow)
* Dataflow (Apache Beam)
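For the BigQuery partitioning and clustering interview topic, a minimal DDL sketch may help candidates prepare. The dataset, table, and column names below are hypothetical, chosen only for illustration:

```sql
-- Hypothetical example: a table partitioned by event date and
-- clustered by customer_id, so queries that filter on date and
-- customer scan fewer bytes.
CREATE TABLE my_dataset.customer_events (
  event_ts    TIMESTAMP,
  customer_id STRING,
  event_type  STRING
)
PARTITION BY DATE(event_ts)          -- one partition per calendar day
CLUSTER BY customer_id, event_type   -- co-locates rows for selective filters
OPTIONS (partition_expiration_days = 365);
```

Partitioning prunes whole date ranges at query time, while clustering sorts data within each partition so filters on the clustered columns read fewer blocks; candidates are often asked to contrast the two.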
Apply Now:sh***************y@ce***************g.com

Keyskills: BigQuery, Dataflow, SQL Server, Google Cloud Platform, Python, Kafka, GCP Data Engineer, SQL
MNC