
Data Engineering Consultant @ Optum



Job Description

Job ID: 2262134

Description - Internal

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Functions may include database architecture, engineering, design, optimization, security, and administration, as well as data modeling, big data development, Extract, Transform, and Load (ETL) development, storage engineering, data warehousing, data provisioning and other similar roles. Responsibilities may include Platform-as-a-Service and cloud solutions with a focus on data stores and associated ecosystems. Duties may include management of design services, providing sizing and configuration assistance, ensuring strict data quality, and performing needs assessments. Analyzes current business practices, processes and procedures, and identifies future business opportunities for leveraging data storage and retrieval system capabilities. Manages relationships with software and hardware vendors to understand the potential architectural impact of different vendor strategies and data acquisition. May design schemas, write SQL or other data markup scripting, and helps support the development of analytics and applications that build on top of the data. Selects, develops and evaluates personnel to ensure the efficient operation of the function. Work is generally self-directed and not prescribed.
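
Where the role calls for designing schemas and writing SQL, as described above, the short sketch below shows one hedged way to generate DDL programmatically; the table and column definitions are hypothetical examples, not Optum schemas.

# A minimal sketch of programmatic DDL generation for a hypothetical
# claims table; column names and types are illustrative only.
columns = {
    "member_id": "VARCHAR(32) NOT NULL",
    "claim_amount": "NUMBER(12, 2)",
    "service_date": "DATE",
}

ddl = (
    "CREATE TABLE IF NOT EXISTS claims (\n    "
    + ",\n    ".join(f"{name} {ctype}" for name, ctype in columns.items())
    + "\n);"
)
print(ddl)  # emit the generated CREATE TABLE statement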

Primary Responsibilities:

  • Understand the Caredata architecture/domain and begin contributing to new and existing business requests
  • Apply experience in data integration, data warehousing, and cloud technologies
  • Develop efficient, high-performing ETL solutions
  • Design, develop, test, and performance-tune complex ETL jobs
  • Develop and maintain scalable data pipelines in response to customer requirements
  • Perform the data analysis required to troubleshoot and resolve data-related issues
  • Design, develop, implement, and run data solutions that improve data efficiency, reliability, and quality
  • Participate in design review sessions and peer ETL reviews
  • Assess and profile source and target data (data structure, quality, completeness, schema, nulls, etc.) and requested business use cases
  • Summarize testing and validation results; communicate findings and recommend the best course of action to remediate issues
  • Devise solutions using existing or available resources, based on knowledge of the organization and the level of execution effort
  • Participate in an agile work environment, attend daily scrums, and complete sprint deliverables on time
  • Support practices, policies, and operating procedures, and ensure alignment with departmental objectives and strategy
  • Ensure the code meets the desired quality checks using Sonar
  • Ensure the cloud infrastructure remains intact and resolve any issues encountered
  • Schedule the deployed pipelines using Airflow, following a proper dependency hierarchy of jobs (see the sketch after this list)
  • Promote code/applications from non-prod to higher environments (stage/prod) for go-live
  • Build and maintain pipelines and automation through GitOps (GitHub Actions, Jenkins, etc.)
  • Identify solutions to non-standard, complex requests and problems, and create solutions using available technologies
  • Build solid relationships with IT operational leaders to ensure connectivity to the business
  • Support a work environment in which people can perform to the best of their abilities; hold yourself accountable for technical abilities, productive results, and leadership attributes
  • Work with less structured, more complex issues
  • Serve as a resource to others
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
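
As a minimal illustration of the Airflow scheduling responsibility above, the sketch below wires a simple extract -> profile -> transform -> load dependency chain. The DAG id, task callables, and schedule are hypothetical placeholders, not details of the actual Caredata platform.

# A minimal sketch of an Airflow DAG enforcing a job dependency
# hierarchy (extract -> profile -> transform -> load). The dag_id,
# callables, and schedule are hypothetical; assumes Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source data")          # placeholder for the real extract step


def profile():
    print("check nulls/completeness")  # placeholder data-quality checks


def transform():
    print("apply business rules")      # placeholder transformation logic


def load():
    print("write to warehouse")        # placeholder load step


with DAG(
    dag_id="caredata_etl_example",     # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_profile = PythonOperator(task_id="profile", python_callable=profile)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Downstream tasks run only after their upstream dependency succeeds.
    t_extract >> t_profile >> t_transform >> t_load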

Qualifications - Internal

Required Qualifications:

  • Bachelor's or master's degree in computer science, information technology, or equivalent
  • 10+ years of experience in designing and developing ETL solutions
  • 10+ years of working knowledge in a Data Warehouse/BI environment
  • Solid DBMS experience: Snowflake, SQL, Hive
  • Good experience with job schedulers such as Airflow
  • Experience with Windows batch/PowerShell scripting and/or UNIX shell scripting, plus Python
  • Good experience working in a cloud environment, preferably Azure
  • Experience with Continuous Integration/Continuous Delivery (CI/CD) pipelines using Jenkins and GitHub Actions
  • Experience with contemporary SDLC methodologies such as Agile, Scrum
  • Expertise in big data frameworks such as Spark and a good understanding of Hadoop concepts
  • Hands-on experience with Rally
  • Solid ETL skills using big data tooling (Databricks, Spark, Scala, Python), Kafka, and Azure Cloud / AWS (see the sketch after this list)
  • Solid SQL skills including complex SQL constructs, DDL generation
  • Proven organizational, analytical, writing, problem-solving, and interpersonal skills
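
As a small illustration of the ETL skill set above, the PySpark sketch below reads raw data, runs a quick completeness check, applies transformations, and writes the result. All paths, column names, and table layouts are hypothetical examples.

# A minimal PySpark ETL sketch: extract -> profile -> transform -> load.
# All paths, column names, and table layouts are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_example").getOrCreate()

# Extract: read raw claims data (hypothetical path).
raw = spark.read.option("header", True).csv("/data/raw/claims.csv")

# Profile: a quick completeness check on a key column.
null_count = raw.filter(F.col("member_id").isNull()).count()
print(f"rows with null member_id: {null_count}")

# Transform: drop incomplete rows, standardize types, derive a partition key.
clean = (
    raw.dropna(subset=["member_id"])
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .withColumn("claim_year", F.year(F.to_date("service_date")))
)

# Load: write partitioned Parquet for downstream warehouse consumption.
clean.write.mode("overwrite").partitionBy("claim_year").parquet("/data/curated/claims")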

Preferred Qualifications:

  • Good understanding of US healthcare
  • Must Have Skill: ETL solutions
  • Data Warehouse/BI environment
  • Big data (Databricks, Spark, Scala, Python)
  • Azure Cloud / AWS
  • SQL

Notice Period: 30-90 days

10+ years of work experience.

Job Classification

Industry: Analytics / KPO / Research
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Data warehouse Architect / Consultant
Employment Type: Full time

Contact Details:

Company: Optum
Location(s): Noida, Gurugram



Key skills: Azure Cloud, Data Warehousing, ETL, Python, SQL, Big Data, AWS


Salary: ₹ 2.5-40 Lacs P.A.


Optum

About: OptumInsight India Pvt Ltd, a UnitedHealth Group company, is a leading health services and innovation company dedicated to helping make the health system work better for everyone. With more than 115,000 people worldwide, Optum combines technology, data and expertise to improve the delivery, ...