- Bachelor's or Master's degree in Computer Science or a related field, or equivalent experience
- Minimum of 8 years of experience in Data Engineering, primarily with GCP technologies
- Hands-on experience developing data engineering pipelines using Workflows
- Experience troubleshooting performance issues
- Minimum of 3 years of experience in data engineering with big data platforms and technologies, including Hadoop, Hive, Sqoop, Spark, Kafka, BigQuery, Terraform, Tekton, Astro, Astronomer, Dataflow, Dataproc, Data Fusion, etc.
- Design data pipelines and data robots; take a vision and bring it to life
- Master data engineer; teaches others; works closely with IT architects
- Experience programming in Java and Python is a plus
- Experience working in Agile/XP
- Strong analytical and problem-solving skills
- Strong oral and written communication skills
GCP skill set is mandatory (BigQuery, Python, SQL Server, Terraform, Tekton, Astro, Astronomer, DBT).
REQUIREMENTS MANAGEMENT: Identify, document, and communicate requirements, and design to them.
DESIGN: Work with Data Product Owners to design data stores.
IMPLEMENT DATA STORES: Implement data stores on GCP, using DBT for transformations, BigQuery for storage, and Astronomer for overall orchestration (see the orchestration sketch after this list).
TEST: Participate in testing and adopt test-driven development (see the test-first sketch below).
TUNE: Tune data stores (indexes, SQL queries) to improve performance (see the tuning sketch below).
L2 SUPPORT: Assist with customer inquiries and incidents/problems.
COLLABORATE: Work within/across teams.
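For illustration, a minimal sketch of the DBT-plus-Astronomer orchestration described above. This is an assumed setup, not the team's actual pipeline: the DAG name, schedule, and project/profile paths are hypothetical.

```python
# Minimal Airflow DAG of the kind typically deployed on Astronomer.
# DAG id, schedule, and dbt project/profile paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build/refresh the BigQuery data store via DBT models.
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /usr/local/airflow/dbt --profiles-dir /usr/local/airflow/dbt",
    )

    # Run DBT-defined data quality tests after the build.
    test_dbt = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /usr/local/airflow/dbt --profiles-dir /usr/local/airflow/dbt",
    )

    run_dbt >> test_dbt
```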
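A test-first sketch for the TEST responsibility, using pytest. The transformation function and its expected behavior are invented purely to show the test-driven workflow.

```python
# Hypothetical transformation and its tests; write the tests first, then the code.
import pytest


def normalize_order(row: dict) -> dict:
    """Toy transformation: trim customer ids and convert amounts to cents."""
    return {
        "customer_id": row["customer_id"].strip(),
        "amount_cents": round(float(row["amount"]) * 100),
    }


def test_normalize_order_trims_and_converts():
    row = {"customer_id": "  C-42 ", "amount": "12.50"}
    assert normalize_order(row) == {"customer_id": "C-42", "amount_cents": 1250}


def test_normalize_order_rejects_missing_amount():
    with pytest.raises(KeyError):
        normalize_order({"customer_id": "C-42"})
```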
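A tuning sketch for the TUNE responsibility. On the BigQuery side, tuning usually means partitioning and clustering rather than conventional indexes; the dataset and table names below are hypothetical.

```python
# Rebuild a table with partitioning and clustering to cut scan costs.
# Dataset/table/column names are assumptions for illustration.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

ddl = """
CREATE OR REPLACE TABLE analytics.orders_tuned
PARTITION BY DATE(order_ts)   -- prune scans to the dates a query touches
CLUSTER BY customer_id        -- co-locate rows commonly filtered or joined on
AS
SELECT * FROM analytics.orders_raw
"""
client.query(ddl).result()  # wait for the DDL job to finish
```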