Job Description
Experience: 4 to 7 years.
Experience with ETL tools (e.g., DataStage), including implementation experience on large data warehouses.
Proficiency in programming languages such as Python.
Experience with data warehousing solutions (e.g., Snowflake, Redshift) and big data technologies (e.g., Hadoop, Spark).
Strong knowledge of SQL and database management systems.
Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and data pipeline orchestration tools (e.g., Airflow; see the sketch after this list).
Proven ability to lead and develop high-performing teams, with excellent communication and interpersonal skills.
Strong analytical and problem-solving abilities, with a focus on delivering actionable insights.
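For illustration only, here is a minimal sketch of the kind of Airflow pipeline orchestration referenced above, assuming Airflow 2.4+; the DAG id, task names, and extract/transform/load callables are hypothetical placeholders, not part of the posting.

# Minimal Airflow DAG sketch (assumes Airflow 2.4+); dag_id, task names, and
# the extract/transform/load callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull a batch of records from a source system.
    pass


def transform():
    # Placeholder: apply business rules to the extracted batch.
    pass


def load():
    # Placeholder: write the transformed batch to the warehouse.
    pass


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps strictly in sequence.
    extract_task >> transform_task >> load_task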
Responsibilities
Design, develop, and maintain advanced data pipelines and ETL processes using niche technologies.
Collaborate with cross-functional teams to understand complex data requirements and deliver tailored solutions.
Ensure data quality and integrity by implementing robust data validation and monitoring processes (see the validation sketch after this list).
Optimize data systems for performance, scalability, and reliability.
Develop comprehensive documentation for data engineering processes and systems.
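As a hedged illustration of the data validation responsibility above, the sketch below shows simple batch-level checks in pandas; the column names (member_id, claim_date, claim_amount) and the rules themselves are assumptions for the example, not requirements from the posting.

import pandas as pd


def validate_batch(df: pd.DataFrame) -> list[str]:
    """Return human-readable validation failures for one batch of records.

    Column names and rules here are hypothetical examples.
    """
    failures = []
    if df["member_id"].isnull().any():
        failures.append("member_id contains nulls")
    if df.duplicated(subset=["member_id", "claim_date"]).any():
        failures.append("duplicate (member_id, claim_date) rows")
    if (df["claim_amount"] < 0).any():
        failures.append("negative claim_amount values")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "member_id": [1, 1, None],
            "claim_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
            "claim_amount": [100.0, 100.0, -5.0],
        }
    )
    # In a pipeline, failures would be logged or routed to an alerting task.
    print(validate_batch(sample))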
Keyskills: ETL, SQL, Python, Azure, DataStage, Snowflake, Ab Initio, Informatica, Teradata, AWS
About: OptumInsight India Pvt Ltd, a UnitedHealth Group company, is a leading health services and innovation company dedicated to helping make the health system work better for everyone. With more than 115,000 people worldwide, Optum combines technology, data and expertise to improve the delivery, ...