Design and develop ETL pipelines in Matillion and Snowflake, ensuring high performance and reliability for data transfer and transformation.
Work with stakeholders to understand data requirements and develop ETL processes to meet those requirements.
Troubleshoot issues related to data loading, transformation, and processing.
Optimize existing ETL processes and implement best practices for data processing.
Monitor and maintain the data warehouse environment, ensuring uptime and data integrity.
Assist in creating and maintaining technical documentation.
Work with stakeholders to answer queries related to the data warehouse, ETL, and reporting.
Stay up-to-date with the latest technologies and trends in the data engineering space.
6 to 10 years of experience with ETL and data warehouse tools such as Matillion and Snowflake.
Proficiency in SQL and Python.
Experience with Apache Airflow and/or other workflow management tools.
Experience with BI tools such as Tableau and Power BI.
Strong problem-solving and analytical skills.
Excellent written and verbal communication skills.
Ability to work independently and in a team environment.
Location: Delhi, Haryana, Kochi, Bhubaneswar, Mysore, Kolkata
Key skills: Snowflake, Python, data processing, Airflow, Power BI, ETL, data engineering, SQL