Develop a scalable data collection, storage, and distribution platform to house data from vendors, research providers, exchanges, prime brokers (PBs), and web scraping. Make data available to systematic and fundamental PMs and to enterprise functions: Operations, Risk, Trading, and Compliance. Develop internal data products and analytics.
Web scraping using scripts, APIs, and tools
Help build and maintain a greenfield data platform running on Snowflake and AWS
Understand the existing pipelines and enhance them to meet new requirements
Onboard new data providers
Deliver data migration projects
Must have
10+ years of experience as a Data Engineer
SQL
Python
Linux
Containerization (Docker, Kubernetes)
Good communication skills
AWS
Strong DevOps skills (Kubernetes, Docker, Jenkins)
Willingness to work in an EU time zone
Nice to have
Market data projects / capital markets experience
Snowflake (a big plus)
Airflow
Languages
English: B2 Upper Intermediate
Location - Pune, Bangalore, Hyderabad, Chennai, Noida

Keyskills: Data Engineering, Airflow, Docker, Linux, Snowflake, AWS, Kubernetes, Python