IMMEDIATE JOINERS OR CANDIDATES WITH A NOTICE PERIOD OF 30 DAYS OR LESS ARE REQUIRED.
DUTIES AND RESPONSIBILITIES:
Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers on platforms such as Azure, Salesforce, and AWS (see the illustrative sketch after this list)
Monitor active ETL jobs in production
Build out data lineage artifacts to ensure all current and future systems are properly documented
Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes
Assess current and future data transformation needs; recommend and develop with new data integration technologies, and train teams on their use
Identify efficiencies in shared data processes and batch schedules to eliminate redundancy and keep operations running smoothly
Assist the Data Quality Analyst in implementing checks and balances across all current and future batch jobs to ensure data quality throughout the environment
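For candidates unfamiliar with dbt's Python models, the following is a minimal, illustrative sketch of the kind of transformation job this role builds. It assumes a Snowflake (Snowpark) backend, and the model and column names (raw_orders, amount) are hypothetical:

```python
# models/stg_orders.py -- hypothetical dbt Python model on a Snowflake (Snowpark) backend
from snowflake.snowpark.functions import col


def model(dbt, session):
    # Materialize the result as a table in the warehouse
    dbt.config(materialized="table")

    # dbt.ref() resolves an upstream model to a Snowpark DataFrame;
    # "raw_orders" is an assumed upstream model name
    orders = dbt.ref("raw_orders")

    # Example transform: keep only positively priced orders, deduplicated
    return orders.filter(col("amount") > 0).drop_duplicates()
```

In a typical setup, Fivetran lands the raw source data that a model like this then transforms.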
QUALIFICATIONS:
Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering work
3-7 years of experience and strong proficiency in SQL query development
Experience developing ETL routines that manipulate and transfer large volumes of data and perform quality checks (a minimal sketch of such a check appears after this list)
Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory, Fivetran)
Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults
Experience working in the healthcare industry with PHI/PII
Creative, lateral, and critical thinker
Excellent communicator
Well-developed interpersonal skills
Good at prioritizing tasks and time management
Ability to describe, create, and implement new solutions
Experience with related or complementary open source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef)
Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau)
Experience with the Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume)
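As an illustration of the kind of SQL-backed quality check mentioned above, here is a minimal sketch using the Snowflake Snowpark Python API. The connection parameters, the ANALYTICS.STG_ORDERS table, and the ORDER_ID column are all hypothetical:

```python
# Minimal data-quality check sketch using Snowflake Snowpark (names are hypothetical)
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

# Placeholder credentials -- in practice these come from a secrets manager or environment
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "ANALYTICS",
    "schema": "PUBLIC",
}

session = Session.builder.configs(connection_parameters).create()

orders = session.table("ANALYTICS.STG_ORDERS")  # hypothetical staging table
total_rows = orders.count()
null_keys = orders.filter(col("ORDER_ID").is_null()).count()

# Fail loudly if the primary key is ever null
assert null_keys == 0, f"{null_keys} of {total_rows} rows are missing ORDER_ID"
```

A check like this would typically run after each batch load, alongside dbt's built-in tests.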