Data engineering architect to lead the design and development of a modern Delta Lakehouse architecture on Azure. The role will focus on building scalable, modular data platforms that support complex use cases such as customer 360 (integrating data from multiple domains to create a unified view of the customer) and real-time insights; a minimal sketch of the customer 360 idea follows below.
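For illustration of the customer 360 use case, a minimal PySpark/Delta sketch of the kind of domain join such a platform would run; the table names (crm.customers, sales.orders, gold.customer_360) and columns are hypothetical placeholders, not part of the role description.

```python
# Minimal sketch: combining two hypothetical Delta tables into a unified customer view.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("customer-360-sketch")
    .getOrCreate()
)

# Hypothetical source tables; real domain tables and schemas will differ.
crm = spark.table("crm.customers")      # customer profile domain
orders = spark.table("sales.orders")    # transactional domain

# Summarize the transactional domain per customer, then join it onto the
# profile data to produce one unified record per customer.
order_summary = (
    orders.groupBy("customer_id")
          .agg(
              F.count("*").alias("order_count"),
              F.sum("order_total").alias("lifetime_value"),
          )
)

customer_360 = crm.join(order_summary, "customer_id", "left")

# Persist the unified view back to the lakehouse as a Delta table.
customer_360.write.format("delta").mode("overwrite").saveAsTable("gold.customer_360")
```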
Mandatory technical skills - Databricks, Azure, PySpark/Scala/SQL, Kafka, Python, ADF
Good to have - Knowledge of open-source frameworks - Apache Kafka, NiFi, Airflow, Apache Flink, dbt, Iceberg
Experience with data cataloging, metadata management, data quality, lineage, versioning, monitoring, and DevOps practices
Data governance, privacy, and regulatory compliance (GDPR, data privacy, PII, PHI, etc.)

Keyskills: PySpark, Delta Lake, Databricks