Design, develop, test, deploy and maintain large-scale data pipelines using Azure Data Factory (ADF) to integrate various data sources into a centralized repository.
Collaborate with cross-functional teams to gather requirements for data processing needs and design solutions that meet business objectives.
Develop complex SQL queries to extract insights from large datasets stored in Azure Blob Storage or other cloud storage platforms (a short illustrative sketch follows this list).
Troubleshoot issues related to ADF pipeline failures, data quality problems, and performance optimization.
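As an illustration of the SQL-on-cloud-storage work described above, the following is a minimal sketch, assuming a PySpark environment (for example Azure Databricks) with access to an Azure Blob Storage container; the storage account, path, view, and column names are hypothetical placeholders, not details from this posting.

    # Minimal sketch: run an aggregate SQL query over Parquet files in Blob Storage.
    # All names below (storage account, container, columns) are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("blob-sql-example").getOrCreate()

    # Hypothetical wasbs:// path to files landed by an ADF copy activity.
    orders_path = "wasbs://raw@examplestorageacct.blob.core.windows.net/sales/orders/"

    # Read the dataset and expose it to Spark SQL as a temporary view.
    orders = spark.read.parquet(orders_path)
    orders.createOrReplaceTempView("orders")

    # Example of the kind of aggregate query the role calls for:
    # monthly revenue per region, restricted to completed orders.
    monthly_revenue = spark.sql("""
        SELECT region,
               date_trunc('month', order_date) AS order_month,
               SUM(amount)                     AS revenue
        FROM orders
        WHERE status = 'COMPLETED'
        GROUP BY region, date_trunc('month', order_date)
        ORDER BY order_month, region
    """)

    monthly_revenue.show(truncate=False)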
Job Requirements:
5-9 years of experience in designing and developing large-scale data pipelines using ADF or similar tools such as Informatica PowerCenter or Talend.
Strong understanding of Azure services such as Azure Databricks, Azure Blob Storage, and Azure App Services.
Proficiency in writing complex SQL queries for extracting insights from large datasets.
Experience working on Agile projects using the Scrum methodology.
Job Classification
Industry: Internet
Functional Area / Department: Data Science & Analytics
Role Category: Data Science & Analytics - Other
Role: Data Science & Analytics - Other
Employment Type: Full time