
Data Engineer @ Optum


 Data Engineer

Job Description


Data Engineer Consultant APC383 - 27 (Individual Contributor)

Position Overview:
OHBI is seeking a highly skilled and experienced Data Engineer to join our team. The ideal candidate will have a strong background in programming, data management, and cloud infrastructure, with a focus on designing and implementing efficient data solutions. This role requires 3 to 6 years of experience and a deep understanding of Azure services and infrastructure, ETL/ELT solutions, Snowflake, and artificial intelligence technologies. The candidate should have some knowledge of AI tools and applications to enhance workflows, automate tasks, and extract insights from data.

Key Responsibilities:

  • AI Integration and Development: Design, develop, and implement artificial intelligence systems, including the use of AI to create new software and AI-driven solutions. Apply AI-powered tools for data analysis, task automation, and decision-making.
  • Azure Infrastructure Management: Own and maintain all aspects of Azure infrastructure, recommending modifications to enhance reliability, availability, and scalability.
  • Security Management: Manage security aspects of Azure infrastructure, including network, firewall, private endpoints, encryption, PIM, and permissions management using Azure RBAC and Databricks roles.
  • Technical Troubleshooting: Diagnose and troubleshoot technical issues in a timely manner, identifying root causes and providing effective solutions.
  • Infrastructure as Code: Create and maintain Azure Infrastructure as Code using Terraform and GitHub Actions.
  • CI/CD Pipelines: Configure and maintain CI/CD pipelines using GitHub Actions for various Azure services such as ADF, Databricks, Storage, and Key Vault.
  • Programming Expertise: Utilize your expertise in programming languages such as Python to develop and maintain data engineering solutions.
  • Real-Time Data Streaming: Use Kafka for real-time data streaming and integration, ensuring efficient data flow and processing.
  • Data Management: Use Snowflake for data wrangling and management, optimizing data structures for analysis (a minimal ingestion sketch follows this list).
  • DBT Utilization: Build and maintain data marts and views using DBT, ensuring data is structured for optimal analysis.
  • ETL/ELT Solutions: Design ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks, leveraging methodologies to acquire data from various structured or semi-structured source systems.
  • Communication: Explain technical solutions and issues clearly in both technical and non-technical terms, ensuring understanding among the Engineering Lead (Delivery Owner) and key stakeholders (business leadership).
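
As a rough illustration of how the Kafka and Snowflake responsibilities above fit together, the following is a minimal sketch of a consume-and-load loop. It assumes the kafka-python and snowflake-connector-python packages; the broker, topic, table, and credential names are hypothetical placeholders, not part of any actual OHBI pipeline.

import os

from kafka import KafkaConsumer      # kafka-python client (assumed dependency)
import snowflake.connector           # snowflake-connector-python (assumed dependency)

# Hypothetical topic and broker names, used purely for illustration.
consumer = KafkaConsumer(
    "member-events",
    bootstrap_servers="broker.example.internal:9092",
    group_id="ohbi-snowflake-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: v.decode("utf-8"),
)

# Snowflake credentials are read from the environment rather than hard-coded.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="RAW",
    schema="EVENTS",
)

BATCH_SIZE = 500
batch = []
try:
    for message in consumer:
        batch.append((message.value,))
        if len(batch) >= BATCH_SIZE:
            # Land raw message payloads in a staging table; downstream dbt
            # models can parse and reshape them into marts and views.
            conn.cursor().executemany(
                "INSERT INTO raw_member_events (payload) VALUES (%s)", batch
            )
            batch.clear()
finally:
    conn.close()
    consumer.close()

A real implementation would add error handling, offset management, and schema validation; this sketch only shows the shape of the flow.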

Qualifications:

  • 3 to 6 years of experience designing ETL/ELT solutions using tools such as Azure Data Factory, Azure Databricks, Snowflake (including Snowflake tasks and streams), Microsoft Fabric, Iceberg, etc.
  • Knowledgeable in programming languages such as Python.
  • 1-2 years of experience with LLMs, machine learning, data science, and programming languages.
  • 1-2 years of experience using AI tools for data analysis, automation, and insight extraction from large datasets.
  • Experience with Kafka for real-time data streaming and integration.
  • Proficiency in Snowflake for data wrangling and management.
  • Working understanding of dbt for building and maintaining data marts and views.
  • In-depth understanding of managing security aspects of Azure infrastructure (see the credential-handling sketch after this list).
  • Experience in creating and maintaining Azure Infrastructure as Code using Terraform and GitHub Actions.
  • Ability to configure, set up, and maintain GitHub for various code repositories.
  • Experience in creating and configuring CI/CD pipelines using GitHub Actions for various Azure services.
  • Strong problem-solving skills and ability to diagnose and troubleshoot technical issues.
  • Excellent communication skills
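
For the Azure security and secret-management items above, a common pattern is to keep database credentials out of source control and fetch them from Azure Key Vault at runtime. The snippet below is a minimal sketch assuming the azure-identity, azure-keyvault-secrets, and snowflake-connector-python packages; the vault URL, secret names, and Snowflake objects are hypothetical.

import os

from azure.identity import DefaultAzureCredential   # azure-identity (assumed dependency)
from azure.keyvault.secrets import SecretClient      # azure-keyvault-secrets (assumed dependency)
import snowflake.connector

# DefaultAzureCredential covers both local development (az login) and a
# managed identity on Azure compute, so no secrets live in the repository.
credential = DefaultAzureCredential()
vault = SecretClient(
    vault_url="https://ohbi-example-kv.vault.azure.net",  # hypothetical vault
    credential=credential,
)

# Hypothetical secret names; in practice these would be provisioned through
# Terraform and granted to the pipeline identity via Azure RBAC.
sf_user = vault.get_secret("snowflake-user").value
sf_password = vault.get_secret("snowflake-password").value

conn = snowflake.connector.connect(
    account=os.environ.get("SNOWFLAKE_ACCOUNT", "example-account"),
    user=sf_user,
    password=sf_password,
    warehouse="ANALYTICS_WH",
    database="MARTS",
)
print(conn.cursor().execute("SELECT CURRENT_ROLE()").fetchone())
conn.close()

The same idea extends to the GitHub Actions pipelines mentioned above, where a workflow can authenticate to Azure and read secrets from Key Vault rather than storing them as plain repository variables.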

Job Classification

Industry: Software Product
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Database Developer / Engineer
Employment Type: Full time

Contact Details:

Company: Optum
Location(s): Hyderabad



Key skills: Snowflake, ETL, Databricks, GitHub, AI/ML


Salary: ₹ 18-27.5 Lacs P.A.

Similar positions

SAP Data and AI Architect

  • Capgemini
  • 20 - 25 years
  • Mumbai
  • 3 days ago
₹ Not Disclosed

Job Opening | Data Engineer - Python and PySpark | HCLTech

  • HCLTech
  • 5 - 10 years
  • Hyderabad
  • 3 days ago
₹ .75-8.75 Lacs P.A.

Data Engineer II

  • Amazon
  • 1 - 6 years
  • Hyderabad
  • 3 days ago
₹ Not Disclosed

SAP Basis & S/4HANA Engineer for Sovereign Cloud India

  • SAP Servers Tech
  • 4 - 9 years
  • Bengaluru
  • 3 days ago
₹ Not Disclosed

Optum

About: OptumInsight India Pvt Ltd, a UnitedHealth Group company, is a leading health services and innovation company dedicated to helping make the health system work better for everyone. With more than 115,000 people worldwide, Optum combines technology, data and expertise to improve the delivery, ...