1. 3+ years of relevant experience in PySpark and Azure Databricks.
2. Proficiency in integrating, transforming, and consolidating data from various structured and unstructured data sources.
3. Good experience in SQL or other native query languages.
4. Strong experience in implementing Databricks notebooks using Python.
5. Good experience in Azure Data Factory, ADLS, Azure Storage services, serverless architecture, and Azure Functions.
6. Experience in SSIS/ETL transformation processes.
7. Experience in Azure DevOps and CI/CD deployments.
8. DP-203 (Microsoft Azure Data Engineer Associate) certification.
Essential Skills:
1. Create Databricks notebooks to process data.
2. Integrate and consolidate data from multiple sources and load it to ADLS (a minimal illustrative sketch follows this list).
3. Understand customer needs.
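For illustration only, here is a minimal PySpark sketch of the kind of notebook work described above: reading two hypothetical sources, consolidating them on an assumed shared key, and loading the result to a hypothetical ADLS container. The paths, storage account, container names, and column names are assumptions for the sketch, not details from this posting.

# Minimal illustrative sketch of a Databricks notebook cell.
# All paths, storage account names, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks the `spark` session is pre-created; this builder only makes
# the sketch self-contained if run outside a notebook.
spark = SparkSession.builder.appName("consolidate-to-adls").getOrCreate()

# Hypothetical ADLS Gen2 locations:
# abfss://<container>@<storage-account>.dfs.core.windows.net/<path>
orders_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/*.csv"
events_path = "abfss://raw@examplestorage.dfs.core.windows.net/events/*.json"
target_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_consolidated"

# Structured source: CSV files with a header row.
orders = spark.read.option("header", "true").csv(orders_path)

# Semi-structured source: JSON event files.
events = spark.read.json(events_path)

# Consolidate the two sources on an assumed shared key (`order_id`)
# and stamp each row with an ingestion timestamp.
consolidated = (
    orders.join(events, on="order_id", how="left")
          .withColumn("ingested_at", F.current_timestamp())
)

# Load the consolidated result to ADLS as a Delta table
# (Parquet would work the same way via .format("parquet")).
consolidated.write.format("delta").mode("overwrite").save(target_path)

In practice, the source formats, join keys, and target layout would follow from the customer requirements noted above.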
Location - Pune, MH / Bangalore, KA / Hyderabad, TS / Chennai, TN.

Keyskills: Azure Data Factory, PySpark, serverless architecture, CI/CD, Azure Databricks, ETL, Azure DevOps, SQL, Python
The Company is a specialist banking and financial industry IT solutions provider with a decade of successful operations. We help create new business models with our software products, solutions, and specialized services. We are an ISO 9001 and CMM Level 5 company having a global footprint with opera...