Data Engineering Manager / Technical Manager
8-13 years
Hyderabad
Full-Time
Tezo is a new-generation digital and AI solutions provider with a history of creating remarkable outcomes for our customers. We deliver exceptional experiences using cutting-edge analytics, data proficiency, technology, and digital excellence.
We are seeking a highly skilled Data Engineering Manager to lead technical teams in architecting and delivering cutting-edge data solutions across multiple cloud platforms. This role requires deep expertise in AWS, Azure, GCP, Snowflake, and Databricks, along with a strong background in data engineering, architecture, and analytics.
As a Technical Manager, you will drive end-to-end data solutioning, oversee data pipeline development, and ensure scalability, performance, and security while aligning solutions with business objectives.
Key Responsibilities:
- Solution Architecture: Design and implement modern, scalable, and high-performance data architectures across cloud platforms (AWS, Azure, GCP).
- Data Engineering & Integration: Develop, optimize, and manage ETL/ELT pipelines, data lakes, and real-time streaming solutions using Snowflake, Databricks, and cloud-native tools.
- Cloud Data Platforms: Deploy and manage data warehousing, analytics, and lakehouse solutions on AWS (Redshift, Glue, S3), Azure (Synapse, ADF, Data Lake), and GCP (BigQuery, Dataflow).
- AI/ML Integration: Collaborate with data scientists to integrate AI/ML models into data pipelines and optimize analytics workflows.
- Data Governance & Security: Implement data governance frameworks, compliance (GDPR, CCPA), role-based access controls, and security best practices across multi-cloud environments.
- Technical Leadership: Lead and mentor a team of data engineers, define best practices, and drive innovation in data engineering strategies.
- Performance Optimization: Ensure cost-efficient, high-performance data processing, leveraging Spark, dbt, and cloud-native tools.
- Cross-Cloud Integration: Design interoperable solutions that leverage multi-cloud capabilities for data movement, transformation, and analytics.
- Stakeholder Management: Collaborate with business leaders, data analysts, and engineering teams to deliver data-driven solutions aligned with business needs.
Required Skills & Qualifications:
Core Technical Skills:
Cloud Data Ecosystems:
- Hands-on expertise with AWS (Redshift, Glue, S3, Lambda), Azure (Synapse, Data Lake, ADF), and GCP (BigQuery, Dataflow, Pub/Sub).
- Experience in multi-cloud data strategy and interoperability.
Data Engineering & Pipelines:
- Strong experience in ETL/ELT development using Snowflake, Databricks, Apache Spark, dbt, and Airflow.
- Expertise in real-time and batch data processing (Kafka, Spark Streaming, Flink).
Databases & Warehousing:
- Strong knowledge of SQL and NoSQL databases (Snowflake, BigQuery, Redshift, DynamoDB, Cosmos DB, MongoDB).
- Deep experience in data lakehouse architecture using Databricks and Delta Lake.
Programming & Automation:
- Proficiency in Python, SQL, Scala, and Java for data transformations and automation.
- Experience with IaC (Terraform, CloudFormation, ARM templates) for provisioning cloud data infrastructure.
Data Governance & Security:
- Knowledge of RBAC, IAM, encryption, GDPR/CCPA compliance in cloud environments.
- Experience with data cataloging, lineage, and metadata management.
AI/ML for Data Pipelines:
- Hands-on experience integrating AI/ML models into production data pipelines.
Leadership & Soft Skills:
- Strong leadership and mentorship experience in managing high-performing engineering teams.
- Ability to translate business needs into technical requirements and deliver solutions.
- Excellent stakeholder communication and collaboration skills.
- Experience in agile methodologies and DevOps for data engineering.