4-5 years of experience in Data Engineering projects (medium-to-small Data Lake / Data Warehousing preferred).
3+ years of experience in Python (Must Have), Kafka programming (Preferred), and PySpark (Nice to Have).
2+ years of working experience with AWS services such as S3 (Must Have), Lambda (Must Have), Redshift (Preferred), Glue (Nice to Have), and DataBrew (Nice to Have).
Must Have:
3+ years of experience performing data analysis, data ingestion, and data integration.
3+ years of experience in ETL (Extraction, Transformation & Loading) or ELT and in architecting data systems.
3+ years of experience with schema design, data modelling, and SQL queries.
Strong database experience (Oracle / PostgreSQL / SQL Server / Redshift).
Design and develop scalable data warehouse solutions and ETL/ELT pipelines in an AWS cloud environment.
Experience in data workflow management.
Knowledge of CI/CD (Must Have) and Terraform (Nice to Have).
Preferred / Nice To Have:
Knowledge of advanced statistics and experience with statistical data analysis tools (scikit-learn, Pandas, R) is an added advantage.
Educational background:
BE / B Tech / MS in Computer Science, Information Technology, or an equivalent degree in a related discipline.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: IT & Information Security
Role Category: IT & Information Security - Other
Role: IT & Information Security - Other
Employment Type: Full time