Data Engineer - Python/Consultant Specialist @ HSBC


Job Description

Some careers shine brighter than others.

If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:

  • Design, develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophecy (a minimal PySpark sketch follows this list).
  • Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration.
  • Develop and optimize complex SQL queries and Python-based data transformation logic.
  • Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes.
  • Automate deployment of data pipelines using CI/CD practices in Azure DevOps.
  • Ensure data quality, security, and compliance with best practices.
  • Monitor and troubleshoot performance issues in data pipelines.
  • Collaborate with cross-functional teams to define data requirements and strategies.

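The first responsibility above covers PySpark pipeline development on Azure Databricks. The snippet below is a minimal, illustrative sketch only: the storage path, column names, and target table are hypothetical and are not taken from this posting.

    # Minimal PySpark transformation sketch (hypothetical paths/columns, illustration only).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-transactions-pipeline").getOrCreate()

    # Read raw records from a hypothetical landing zone in Azure Data Lake Storage.
    raw = spark.read.parquet("abfss://landing@examplestorage.dfs.core.windows.net/transactions/")

    # A typical cleanse-and-aggregate step: deduplicate, filter, derive a date, aggregate.
    daily_totals = (
        raw.dropDuplicates(["transaction_id"])
           .filter(F.col("amount") > 0)
           .withColumn("txn_date", F.to_date("event_time"))
           .groupBy("txn_date", "account_id")
           .agg(F.sum("amount").alias("daily_total"))
    )

    # Persist the result as a Delta table, matching the Delta Lake requirement listed below.
    daily_totals.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_account_totals")

On Databricks the same logic would normally run inside a notebook or job where the SparkSession already exists; building it explicitly here just keeps the sketch self-contained.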

Requirements

To be successful in this role, you should meet the following requirements:

  • 5+ years of experience in data engineering, working with Azure Databricks, PySpark, and SQL.
  • Hands-on experience with Prophecy for data pipeline development.
  • Proficiency in Python for data processing and transformation.
  • Experience with Apache Airflow for workflow orchestration (a minimal DAG sketch follows this list).
  • Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes.
  • Familiarity with GitHub and Azure DevOps for version control and CI/CD automation.
  • Solid understanding of data modelling, warehousing, and performance optimization.
  • Ability to work in an agile environment and manage multiple priorities effectively.
  • Excellent problem-solving skills and attention to detail.
  • Experience with Delta Lake and Lakehouse architecture.
  • Hands-on experience with Terraform or Infrastructure as Code (IaC).
  • Understanding of machine learning workflows in a data engineering context.

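Among these requirements, Apache Airflow orchestration is the easiest to illustrate in a few lines. The DAG below is a hedged sketch assuming Airflow 2.4 or later; the DAG id, schedule, and bash tasks are hypothetical stand-ins for real ADF or Databricks triggers.

    # Minimal Airflow DAG sketch (hypothetical DAG id and tasks, illustration only).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="example_daily_transactions",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule" is the Airflow 2.4+ spelling of schedule_interval
        catchup=False,
    ) as dag:
        # Placeholders for real steps; in practice these would be provider operators
        # (e.g. a Databricks job run or an ADF pipeline trigger).
        ingest = BashOperator(task_id="ingest", bash_command="echo 'ingest raw data'")
        transform = BashOperator(task_id="transform", bash_command="echo 'run pyspark transform'")

        ingest >> transform
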
Job Classification

Industry: Banking & Financial Services
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: HSBC
Location(s): Hyderabad



Keyskills: Automation, Version control, Machine learning, Agile, Workflow, Data quality, Apache, Financial services, SQL, Python

Salary: Not Disclosed
