5-8 years of experience
Design, develop, and maintain scalable data pipelines and ETL processes.
Implement data solutions using Snowflake for data warehousing and analytics.
Write efficient, reusable, and maintainable Python code for data processing.
Collaborate with data scientists, analysts, and other stakeholders to understand data needs and deliver solutions.
Optimize and troubleshoot data workflows to ensure data quality and reliability.
Develop and maintain documentation related to data processes and pipelines.
Strong experience in understanding and writing complex SQL queries (see the illustrative sketch below).
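
The following is a minimal, illustrative sketch (not part of the role description) of the kind of Python-based load step the responsibilities above refer to: read rows from a CSV file, apply a light transformation, and insert them into a Snowflake staging table using the snowflake-connector-python package. The connection placeholders, the orders.csv file, and the STG_ORDERS table are hypothetical examples only.

    # Illustrative ETL sketch: CSV -> light transform -> Snowflake staging table.
    # Placeholders (<account>, <user>, etc.) and STG_ORDERS are assumptions.
    import csv

    import snowflake.connector  # pip install snowflake-connector-python


    def load_orders(csv_path: str) -> int:
        """Read orders from a CSV file and bulk-insert them into Snowflake."""
        with open(csv_path, newline="") as fh:
            reader = csv.DictReader(fh)
            # Transform: normalise currency codes and cast amounts to float.
            rows = [
                (r["order_id"], r["currency"].upper(), float(r["amount"]))
                for r in reader
            ]

        conn = snowflake.connector.connect(
            account="<account>", user="<user>", password="<password>",
            warehouse="<warehouse>", database="<database>", schema="<schema>",
        )
        try:
            cur = conn.cursor()
            # Parameter binding avoids SQL injection and lets the connector batch rows.
            cur.executemany(
                "INSERT INTO STG_ORDERS (ORDER_ID, CURRENCY, AMOUNT) VALUES (%s, %s, %s)",
                rows,
            )
            conn.commit()
            return len(rows)
        finally:
            conn.close()


    if __name__ == "__main__":
        print(f"Loaded {load_orders('orders.csv')} rows")

In practice a pipeline like this would typically be orchestrated by a scheduler such as Airflow and validated with data-quality checks, in line with the responsibilities listed above.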

Keyskills: data warehousing, ETL, Snowflake, Python, SQL queries, Hive, Amazon Redshift, PySpark, SQL, data modeling, Spark, ETL tools, Hadoop, big data, Oracle, data analysis, data processing, Airflow, data engineering, data quality, Tableau, AWS, Informatica, Unix, ETL processes
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization...