Skills Required:
4 to 8 years of total IT experience with 2+ years in big data engineering and Microsoft Azure
Experience in implementing Data Lake solutions with technologies such as Azure Data Factory (ADF), PySpark, Databricks, ADLS, and Azure SQL Database.
A comprehensive foundation with working knowledge of the Azure full stack, Event Hubs, and Stream Analytics.
A passion for writing high-quality code that is modular, scalable, and free of bugs, backed by strong debugging skills in SQL, Python, or Scala/Java.
Enthusiasm for collaborating with stakeholders across the organization and taking complete ownership of deliverables.
Experience with big data technologies such as Hadoop, Spark, Databricks, Airflow, and Kafka.
A solid understanding of data formats such as Delta Lake, Avro, Parquet, JSON, and CSV.
Good knowledge of designing and building REST APIs, with hands-on experience on Data Lake/Lakehouse projects.
Experience supporting BI and Data Science teams in consuming data in a secure and governed manner.
Certifications such as Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE) are a valuable addition.
Key Skills: Hadoop, Azure, Airflow, Kafka, Spark, Microsoft, Databricks
KUKULKAN is a recruitment and staffing solutions consultancy based in Madurai, Tamil Nadu. We are experts in this field with 13 years of experience, focusing on Southern India.