Job Description
Dear Candidate,
Greetings of the day!!
I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn: https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or by email: ka*************m@te******o.net
Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions that advance its business partners' goals.
We are a leading full-scale software and mobile app development company, driven by the mantra "Client's Vision is our Mission."
We stand by that statement: to be a technologically advanced and well-loved organization providing high-quality, cost-efficient services built on long-term client relationships. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).
Job Title: Technical GCP Data Architect/Lead
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate
Job Summary
We are seeking a hands-on Technical GCP Data Architect/Lead with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.
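For context on the stack involved, the sketch below shows a minimal streaming pipeline of the kind described above: reading events from a Pub/Sub subscription with Dataflow (Apache Beam) and streaming them into BigQuery. All project, subscription, and table names are hypothetical placeholders, not references to any actual TechMango system.

```python
# Minimal illustrative sketch only -- all project, subscription, and table
# names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload (assumed to be JSON) into a row dict."""
    return json.loads(message.decode("utf-8"))


def run() -> None:
    # streaming=True tells Beam/Dataflow to run this as an unbounded pipeline.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub"
            )
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```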
Key Responsibilities
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Work closely with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets (see the sketch after this list).
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Cloud Logging, and Error Reporting (formerly the Stackdriver suite).
- Apply strong data modeling practices to analytical and streaming datasets.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps for CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.
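As a small illustration of the BigQuery partitioning and clustering responsibility above, the sketch below creates a time-partitioned, clustered table with the Python client library. The project, dataset, table, and schema are assumed placeholders, not an actual design.

```python
# Illustrative sketch only -- project, dataset, table, and schema are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)

# Partition by event time so queries prune to the days they touch,
# and cluster on the columns most often used in filters and joins.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["event_type", "user_id"]

client.create_table(table, exists_ok=True)
```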
Required Skills & Experience
- 10+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in:
  - Google Pub/Sub
  - Dataflow (Apache Beam)
  - BigQuery (including streaming inserts)
  - Cloud Composer (Airflow)
  - Cloud Storage (GCS)
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience using Terraform or equivalent for infrastructure as code.
- Familiarity with CI/CD pipelines for data workflows.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Technical Lead
Employment Type: Full Time
Contact Details:
Company: Techmango Technology
Location(s): Madurai
Keyskills:
GCP
Dataflow
Airflow
BigQuery
Hadoop
Apache Beam
Oracle Data Warehouse
Data Catalog
Teradata