Should be able to understand how data is stored in the source (schema/file share) and document it.
Should be able to estimate work packages using a WBS or another estimation technique and document them.
Should be able to create new ADF instances, install Integration Runtimes (IR) on cloud and on-premises machines, and connect them.
Should be able to develop new, enhance existing, and test and deploy pipelines, linked services, datasets, etc. as per business needs, and document them.
Should be able to monitor existing flows, troubleshoot issues, and maintain a knowledge base.
Good to have hands-on Azure development experience (MR, Spark, etc.).
Should have worked with ADLS, Blob Storage, IR, VNets, subnets, SQL DB, service principals, etc.
Should be able to connect ADF to the source and document it.
Should be able to propose and implement ADF best practices and document them.
Should have good communication skills.
Should be able to write PowerShell and UNIX shell scripts.
Good to have knowledge of an ETL tool.
Good to have an understanding of a ticketing tool.
Should be able to monitor ADF pipelines and Integration Runtimes (IR) and take action on failures or other issues.
Good to have an understanding of networking, firewalls, Windows, and UNIX.
Should be willing to work afternoon/night shifts and provide on-call support on weekends and weekdays.
Employment Type: Full time
Industry: Engineering / Construction
Role Category: Application Programming / Maintenance
Functional Area: Not Applicable
Role/Responsibilities: Azure Data Factory
Contact Details:
Company Name: Talent Corner Location(s): Bengaluru