Design, develop, and optimize complex data pipelines and transformation processes using Snowflake, dbt, and AWS services.
Implement and manage data integration workflows using Fivetran to ensure timely and accurate data ingestion from various sources.
Develop and maintain scalable data models and schemas in Snowflake, ensuring they meet performance and business requirements.
Monitor and fine-tune the performance of data pipelines, queries, and data models to ensure optimal efficiency and cost-effectiveness.
Utilize Snowflake features, such as Time Travel, Zero-Copy Cloning, and Data Sharing, to enhance data management and performance.
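A minimal sketch of what the Time Travel and Zero-Copy Cloning features mentioned above look like in practice. The SQL is built as plain strings so the example runs without a Snowflake warehouse; the table names (`analytics.orders`) and offsets are hypothetical, not from this posting.

```python
# Query a table as it existed one hour ago (Snowflake Time Travel).
time_travel_sql = (
    "SELECT * FROM analytics.orders "
    "AT (OFFSET => -3600);"  # offset in seconds before now
)

# Create an instant, storage-free copy for development (Zero-Copy Cloning).
clone_sql = "CREATE TABLE analytics.orders_dev CLONE analytics.orders;"


def build_restore_sql(table: str, seconds_ago: int) -> str:
    """Return SQL that recreates `table` as it was `seconds_ago` seconds ago,
    combining Time Travel with a zero-copy clone (illustrative only)."""
    return (
        f"CREATE OR REPLACE TABLE {table}_restored "
        f"CLONE {table} AT (OFFSET => -{seconds_ago});"
    )
```

Combining the two features this way is a common pattern for recovering from an accidental bad write without restoring a full backup.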
Leverage AWS services, such as AWS Lambda, S3, and Glue, to build and manage serverless data processing workflows and data storage solutions.
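A hedged sketch of the serverless pattern described above: an AWS Lambda handler that applies a transformation to incoming records. The transformation logic (lower-casing keys, dropping nulls) is an assumption for illustration; a real deployment would fetch the object from S3 with boto3, while here the payload is passed inline so the sketch runs locally.

```python
import json
from typing import Any


def transform(record: dict[str, Any]) -> dict[str, Any]:
    """Example cleanup step: normalize keys to lowercase and drop null
    values. Purely illustrative of a pipeline transformation."""
    return {k.lower(): v for k, v in record.items() if v is not None}


def handler(event: dict, context: Any = None) -> dict:
    """Minimal Lambda entry point. `event["body"]` is assumed to carry a
    JSON array of records; a production handler would instead read the
    S3 object referenced in the event notification."""
    records = json.loads(event["body"])
    cleaned = [transform(r) for r in records]
    return {"statusCode": 200, "body": json.dumps(cleaned)}
```

Keeping `transform` as a pure function separate from the handler makes the pipeline logic unit-testable without any AWS infrastructure.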
Implement data security measures and ensure compliance with data privacy regulations and organizational policies.
Troubleshoot and resolve complex data issues, including data sync errors, performance bottlenecks, and integration challenges. Provide support for data-related incidents and ensure effective resolution of production issues.
Collaborate with data analysts and other stakeholders to understand data needs and deliver effective solutions.
Document data processes, models, and workflows, ensuring clear communication and knowledge sharing across teams. Independently assess situations, apply sound judgment and discretion, and make decisions on matters of significant impact without direct supervision.
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Data Warehouse Architect / Consultant
Employment Type: Full time