Location: Chennai, Bangalore, Hyderabad, Noida
Job Description
GCP Data Engineer: SQL, Python, Google Cloud (GCP), BigQuery, Dataflow (Apache Beam), Cloud Composer (Apache Airflow), Cloud Data Fusion, DBT
Nice to have: SQL Server/SSIS
Role Description
* Building reusable data pipelines at scale, working with structured and unstructured data, feature engineering for machine learning, and curating data to provide real-time contextualised insights that power our customers' journeys.
* Using industry-leading toolkits, as well as evaluating exciting new technologies, to design and build scalable real-time data applications.
* Spanning the full data lifecycle and using a mix of modern and traditional data platforms (Kafka, GCP, SQL Server), you'll build capabilities with horizon-expanding exposure to a host of wider technologies and careers in data.
* Helping to adopt engineering best practices such as Test-Driven Development, code reviews, and Continuous Integration/Continuous Delivery for data pipelines.
* Mentoring other engineers to deliver high-quality, data-led solutions for our bank's customers.

Skills: SQL, Python, Google Cloud (GCP), BigQuery, Dataflow (Apache Beam), Cloud Composer (Apache Airflow), Cloud Data Fusion, DBT

Questions asked:
* Explain your project.
* Have you worked on any data modelling? Medallion data modelling.
* Normalised and denormalised databases.
* BigQuery: partitioning and clustering.
* Pub/Sub and Dataflow.
* Data Fusion.
* DBT tool?
* SSIS and SQL Server?
* GCP Composer (Airflow).
* Dataflow (Apache Beam).

Keyskills: Google Cloud (GCP), Dataflow (Apache Beam), Cloud Composer (Apache Airflow), BigQuery, Cloud Data Fusion, DBT, SQL, Python
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse o...