We are seeking a skilled Data Engineer with 2-4 years of experience to design, build, and maintain scalable data pipelines and infrastructure. You will work with modern data technologies to enable data-driven decision making across the organisation.
Key Responsibilities
- Design and implement ETL/ELT pipelines using Apache Spark and orchestration tools (Airflow/Dagster).
- Build and optimise data models on Snowflake and cloud platforms.
- Collaborate with analytics teams to deliver reliable data for reporting and ML initiatives.
- Monitor pipeline performance, troubleshoot data quality issues, and implement testing frameworks.
- Contribute to data architecture decisions and work with cross-functional teams to deliver high-quality data solutions.
Required Skills & Experience

Technical Environment

Key skills: DevOps, Airflow, data lake architectures, Azure, dimensional modeling, Grafana, Apache Spark, SQL, DataDog, performance optimisation, data modeling, data warehouse concepts, GCP, Snowflake, Dagster, cost management, ETL, AWS, Python