The future is our choice. At Atos, the global leader in secure and decarbonized digital, our purpose is to help design the future of the information space. Together we bring the diversity of our people's skills and backgrounds to make the right choices with our clients, for our company and for our own futures.
Roles & Responsibilities:
Establish technical designs to meet Sanofi requirements aligned with architectural and data standards
Optimize ETL/data pipelines balancing performance, functionality, and operational requirements
Fine-tune and optimize queries using Snowflake platform features and database tuning techniques
Manage data ingestion, transformation, processing, and orchestration of pipelines
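To illustrate the ingest-transform-load responsibilities above, here is a minimal Python sketch; it uses the standard-library sqlite3 module as a stand-in for Snowflake, and all table and column names are hypothetical:

```python
import sqlite3

# Stand-in warehouse: SQLite replaces Snowflake for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL, status TEXT)")

# Ingest: hypothetical source records.
raw = [(1, 10.0, "ok"), (2, -5.0, "error"), (1, 7.5, "ok"), (3, 2.5, "ok")]
conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", raw)

# Transform: keep only valid rows, then aggregate per user.
conn.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    WHERE status = 'ok'
    GROUP BY user_id
""")

# Serve: downstream consumers read the derived table.
totals = dict(conn.execute("SELECT user_id, total FROM user_totals ORDER BY user_id"))
print(totals)  # {1: 17.5, 3: 2.5}
```

In a real pipeline each stage would be a separately scheduled task in an orchestrator, with the same filter-then-aggregate shape expressed in Snowflake SQL or Snowpark.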
Requirements:
Bachelor's degree in computer science, engineering, or a similar quantitative field
5+ years of relevant experience developing backend systems, integrations, data pipelines, and infrastructure
Strong expertise in Python, PySpark, and Snowpark
Proven experience with Snowflake and AWS cloud platforms
Experience with Informatica/IICS for data integration
Expertise in database optimization and performance improvement
Experience with data warehousing and writing efficient SQL queries
Understanding of data structures and algorithms
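The query-optimization and efficient-SQL requirements above come down to making the planner use the right access path. A minimal sketch, again using the standard-library sqlite3 module in place of Snowflake (the `orders` schema is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [(f"c{i % 100}", float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer = 'c7'"

# Before indexing: the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Add an index on the filter column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # e.g. a full scan of orders
print(plan_after[0][-1])   # e.g. a search using idx_orders_customer
```

Snowflake tunes differently (clustering keys, pruning, warehouse sizing rather than B-tree indexes), but the workflow is the same: inspect the plan, change the physical design, confirm the plan improved.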
Preferred:
Knowledge of DevOps best practices and associated tools:
Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
Infrastructure as code (Terraform)
CI/CD Pipelines (JFrog Artifactory)
Scripting and automation (Python, GitHub, GitHub Actions)
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Platform Engineer
Employment Type: Full time