Airflow Data Engineer on the AWS Platform
Job Title: Apache Airflow Data Engineer
Role: as per TCS role
Must have:
- 4-8 years of experience with AWS, Apache Airflow (on the Astronomer platform), Python, PySpark, and SQL
- Strong hands-on knowledge of SQL and the data warehousing life cycle (an absolute requirement)
- Experience creating data pipelines and orchestrating them with Apache Airflow
- Significant experience with data migrations and with the development of operational data stores, enterprise data warehouses, data lakes, and data marts

Good to have:
- Experience with cloud ETL and ELT in a tool such as DBT, Glue, EMR, or Matillion, or any other ELT tool
- Excellent communication skills to liaise with business and IT stakeholders
- Expertise in planning project execution and estimating effort
- Exposure to Agile ways of working
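To illustrate the kind of SQL and data-mart work the requirements above describe, here is a minimal, hedged sketch of an ELT-style transform: a fact table is aggregated into a summary data mart. It uses the Python standard library's `sqlite3` purely as a stand-in for a warehouse such as Redshift; the `sales_fact` and `sales_mart` names and columns are illustrative assumptions, not part of the role description.

```python
import sqlite3

# Illustrative only: sqlite3 stands in for a warehouse such as Redshift.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A small fact table (names and data are hypothetical).
cur.execute("CREATE TABLE sales_fact (region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales_fact VALUES (?, ?)",
    [("EMEA", 100.0), ("EMEA", 50.0), ("APAC", 75.0)],
)

# ELT-style transform: summarize the fact table into a data mart.
cur.execute(
    """
    CREATE TABLE sales_mart AS
    SELECT region, SUM(amount) AS total_amount
    FROM sales_fact
    GROUP BY region
    """
)

rows = dict(cur.execute("SELECT region, total_amount FROM sales_mart"))
print(rows)  # {'APAC': 75.0, 'EMEA': 150.0}
conn.close()
```

In practice each step (extract, load, transform) would be a task in an Airflow DAG, with the SQL executed against the warehouse rather than an in-memory database.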
Entity: The candidate for this position will be offered TAIC or TCSL as the entity.
Skills: Data warehousing, PySpark, GitHub, AWS data platform, Glue, EMR, Redshift, Databricks, data marts, DBT/Glue/EMR or Matillion, data engineering, data modelling, data consumption
Keyskills: Airflow, PySpark, ETL, AWS, Python, Hive, AWS Glue, EMR, ETL tools, Lambda, Athena, Redshift