
Data Engineer @ Grid Dynamics

Job Description

We are seeking a skilled Data Engineer & Data Analyst with over 4 years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools.

Key Responsibilities:

  • Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, DBT, Composer (GCP), Control-M, Cron, Luigi, and similar tools.
  • Build and optimize data architectures including data lakes and data warehouses.
  • Integrate data from multiple sources ensuring data quality and consistency.
  • Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions.
  • Analyze complex datasets to identify trends, generate actionable insights, and support decision-making.
  • Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation.
  • Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB.
  • Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
  • Utilize cloud data services and warehouses like AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery.
  • Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools.
  • Ensure data governance, security, and compliance standards are met.
  • Participate in Agile and DevOps processes to enhance data engineering workflows.
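To illustrate the kind of pipeline work described above, here is a minimal, hypothetical extract-transform-load flow in plain Python. It is a sketch only: real pipelines at this level would be orchestrated as scheduled tasks by tools such as Airflow or Luigi, and all names and records here are invented for illustration.

```python
# Illustrative only: a minimal ETL flow of the kind that orchestrators
# like Airflow or Luigi schedule as separate tasks. Data is hypothetical.

def extract():
    # Stand-in for pulling raw order records from an upstream source.
    return [
        {"order_id": 1, "amount": "120.50", "region": "south"},
        {"order_id": 2, "amount": "80.00", "region": "north"},
    ]

def transform(rows):
    # Normalize types and enforce a simple data-quality rule (amount > 0).
    cleaned = []
    for row in rows:
        amount = float(row["amount"])
        if amount > 0:
            cleaned.append({**row, "amount": amount,
                            "region": row["region"].upper()})
    return cleaned

def load(rows, sink):
    # Append validated rows to a sink standing in for a warehouse table.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # number of rows that passed validation
```

In an orchestrator, each stage would typically become its own task with retries, logging, and dependency ordering handled by the scheduler.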

Required Qualifications:

  • 4+ years of professional experience in data engineering and data analysis roles.
  • Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB.
  • Hands-on experience with big data tools like Hadoop and Apache Spark.
  • Proficient in Python programming.
  • Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks.
  • Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, DBT, Composer (GCP), Control-M, Cron, and Luigi.
  • Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
  • Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory).
  • Understanding of data modeling concepts and data lake/data warehouse architectures.
  • Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows.
  • Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB.
  • Exposure to Agile and DevOps methodologies.
  • Experience with at least one cloud platform:
      • Google Cloud Platform (BigQuery, Dataflow, Composer, Cloud Storage, Pub/Sub)
      • Amazon Web Services (S3, Glue, Redshift, Lambda, Athena)
      • Microsoft Azure (Data Factory, Synapse Analytics, Blob Storage)
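As a small, hypothetical example of the SQL analysis proficiency the role calls for, the snippet below aggregates sales by region using Python's built-in sqlite3 module; the table and figures are invented for illustration, and a warehouse like BigQuery or Snowflake would run the same query shape at scale.

```python
import sqlite3

# Hypothetical dataset: aggregate sales by region with GROUP BY,
# using an in-memory SQLite database so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)
totals = dict(
    conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    ).fetchall()
)
print(totals)  # {'north': 150.0, 'south': 75.0}
conn.close()
```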

Preferred Skills:

  • Strong problem-solving and communication skills.
  • Ability to work independently and collaboratively in a team environment.
  • Experience with service development, REST APIs, and automation testing is a plus.
  • Familiarity with version control systems and workflow automation.

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: IT & Information Security
Role Category: IT Infrastructure Services
Role: IT Infrastructure Services - Other
Employment Type: Full time

Contact Details:

Company: Grid Dynamics
Location(s): Bengaluru



Key skills: SQL, PySpark, Azure, Data Engineering, Amazon Web Services, GCP, Hadoop, Spark, Google Cloud Platform, AWS, Python


Salary: Not Disclosed


Grid Dynamics

NextSphere is a full-service custom application development firm that helps customers grow and keep up in a constantly changing technology landscape. At NextSphere, we develop and support business applications for customers in a wide range of industries. We strive to work on projects where the NextSpher...