Roles & Responsibilities:
Designing, developing and deploying cloud-based data platforms using Google Cloud Platform (GCP)
Integrating and processing large amounts of structured and unstructured data from various sources
Implementing and optimizing ETL processes and data pipelines
Developing and maintaining security and access controls
Collaborating with other teams to ensure the consistency and integrity of data
Troubleshooting and resolving data platform issues
Skills Requirements:
In-depth knowledge of GCP services and tools such as Google Cloud Storage, Google BigQuery, and Google Cloud Dataflow
Experience in building scalable and reliable data pipelines using GCP services, Apache Beam, and related big data technologies
Familiarity with cloud-based infrastructure and deployment, specifically on GCP
Strong knowledge of programming languages such as Python, Java, and SQL
Must have excellent communication skills and be able to convey complex technical information to non-technical stakeholders clearly and concisely.
Must understand the company's long-term vision and align with it.
Should be open to new ideas and be willing to learn and develop new skills.
Should also be able to work well under pressure and manage multiple tasks and priorities.
7-9 years of work experience in a relevant field
B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Keyskills: GCP, Spark, BigQuery, Python, AWS, Hive, Beam, Scala, Big Data Technologies, Apache Pig, SQL, Java, Apache, Data Science, Hadoop, ETL, Big Data, Product Engineering, HBase, Microsoft Azure, Google, Data Engineering, Google Analytics, Sqoop, ETL Process
If you're thinking scale, think bigger and don't stop there. At Walmart Global Tech India, we don't just innovate, we enable transformations across stores and different channels for the Walmart experience. A regular day at Walmart Global Tech India means using tech...