Position: Snowflake Data Engineer
Work Location: Hyderabad (WFO)
Experience: 3+ years
About the opportunity:
We are seeking a highly skilled and experienced Snowflake Developer with a strong background in SQL and Python and a minimum of 3 years of hands-on experience with Snowflake. The ideal candidate will be Snowflake certified, with a proven track record in data warehousing, data modeling, and implementing ETL/ELT pipelines using industry-standard tools.
Primary Roles & Responsibilities
Design, develop, and optimize data pipelines and ETL/ELT processes in Snowflake.
Develop and optimize complex SQL queries for data extraction, transformation, and reporting.
Write robust Python scripts for automation, orchestration, and data transformations.
Migrate data from legacy systems to Snowflake and integrate various data sources.
Implement Snowflake best practices for performance tuning, security, and cost management.
Collaborate with cross-functional teams to implement end-to-end data warehouse solutions.
Required Skills
Minimum 3 years of hands-on experience with Snowflake.
Strong expertise in SQL development and optimization.
Proficient in Python for scripting and data engineering tasks.
Experience in Data Warehouse architecture and Data Modeling (Star/Snowflake Schema).
Hands-on experience with ETL/ELT tools such as Informatica, Matillion, dbt, Talend, or equivalent.
Experience with cloud platforms (AWS, Azure, or GCP) and associated services.
Solid understanding of performance tuning, data governance, and security concepts.
Excellent problem-solving and communication skills.

Keyskills: ETL Pipelines, Snowflake, Data Ingestion, Data Pipeline, Data Warehousing, SQL, Python, Data Mapping
Kasmo is a fast-growing, end-to-end IT service provider delivering innovative solutions across various domains worldwide, helping clients achieve business transformation by leveraging its industry insight and technology expertise.