Job Description
Position Description:
- 5+ years of experience developing on AWS with SQS, Glue, Python, and PySpark, including the DynamicFrame and DataFrame APIs
- Demonstrated competency with the following AWS services: ECS, EC2, ECR, S3, RDS, VPC, IAM, QuickSight, CloudFront, CloudFormation, CloudWatch, SQS, ElastiCache, Lambda, Glue ETL, DynamoDB, Neptune, Route 53, and Application Load Balancers
- Strong knowledge of infrastructure-as-code (IaC) development using Terraform
- Experienced with Git for source control
- Hands-on experience with Jenkins or GitLab CI, working as an administrator or developer as day-to-day scenarios require
- Hands-on experience with containerization tools such as Docker and Kubernetes
- Strong experience managing containers, pods, and clusters in robust environments
- Strong data engineering background, with the ability to implement ETL jobs based on Spark, AWS Lambda, or Glue
- Knowledge of operating system administration
- Comprehensive knowledge of contemporary development and operations processes and methodologies
- Strong understanding of how to secure AWS environments and meet compliance requirements
- AWS Disaster Recovery design and deployment across regions a plus
- Experience with multi-tier architectures: load balancers, caching, web servers, application servers, databases, and networking
- Great communication and interpersonal skills
Main location: Hyderabad/Chennai/Bangalore
Your future duties and responsibilities:
- Design, develop, and maintain scalable ETL pipelines using AWS Glue, Python, and PySpark
- Build data ingestion and transformation workflows leveraging DynamicFrame/DataFrame APIs
- Develop serverless and event-driven solutions using AWS Lambda and SQS
- Manage data storage solutions using S3, DynamoDB, and RDS with secure IAM governance
- Deploy and automate cloud infrastructure using Terraform (IaC) and CI/CD tools
- Containerize and orchestrate applications using Docker and Kubernetes for robust execution environments
- Monitor system performance, troubleshoot production issues, and optimize data workflows
- Collaborate with cross-functional teams to deliver high-quality cloud data solutions
- Apply AWS security best practices and compliance guidelines in production environments
- Support multi-tier architectures across application, data, and networking layers
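The serverless, event-driven responsibility above (Lambda consuming from SQS) can be sketched in Python. The record fields and the `process` helper are illustrative, but the event shape and the partial-batch-failure response follow AWS's standard SQS-to-Lambda integration:

```python
import json


def handler(event, context):
    """Lambda handler for an SQS-triggered function.

    Processes each SQS record in the batch and reports failed message
    IDs back via the partial-batch-response contract, so only failed
    messages become visible again for retry rather than the whole batch
    (requires ReportBatchItemFailures on the event source mapping).
    """
    failures = []
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
            process(payload)
        except Exception:
            # Report this messageId as failed so SQS redelivers it.
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}


def process(payload):
    # Hypothetical business logic; a real job would write to
    # S3/DynamoDB/RDS or hand off to a downstream workflow.
    if "id" not in payload:
        raise ValueError("record missing 'id'")
```

Returning an empty `batchItemFailures` list tells Lambda the whole batch succeeded.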
Required qualifications to be successful in this role:
Must Have:
- 5+ years of hands-on experience with AWS Cloud services such as Glue, SQS, Lambda, S3, DynamoDB, EC2, VPC, and IAM.
- Strong development experience in Python and PySpark for ETL workloads.
- Expertise in DynamicFrame and DataFrame-based data processing in Glue.
- Hands-on experience with Terraform for infrastructure automation.
- Proficiency with CI/CD and version control tools: Git, Jenkins, or GitLab CI.
- Strong fundamentals in Linux/OS administration and cloud networking.
- Good knowledge of Docker/Kubernetes and containerized deployments.
- Solid understanding of cloud security and operational best practices.
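The DynamicFrame/DataFrame expertise listed above usually means writing record-level transforms that Glue applies via `Map.apply` on a DynamicFrame (with `.toDF()` available when Spark DataFrame operations are needed). Since `awsglue` only runs inside a Glue environment, here is a plain-Python sketch of such a per-record transform; the field names are illustrative, not from a real schema:

```python
from datetime import datetime, timezone


def clean_record(rec: dict) -> dict:
    """Per-record transform of the kind passed to Glue's Map.apply.

    Normalizes a raw event dict: trims and lowercases the email field,
    casts amount to float (defaulting to 0.0 for missing/empty values),
    and stamps the processing time in UTC.
    """
    out = dict(rec)
    out["email"] = rec.get("email", "").strip().lower()
    out["amount"] = float(rec.get("amount", 0) or 0)
    out["processed_at"] = datetime.now(timezone.utc).isoformat()
    return out
```

In an actual Glue script this function would be applied with `Map.apply(frame=dyf, f=clean_record)`, where `dyf` is a DynamicFrame read from the Glue Data Catalog or S3.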
Good to Have:
- Experience with AWS Neptune, QuickSight, and CloudFront.
- Knowledge of AWS Disaster Recovery strategies and multi-region high availability.
- Familiarity with Step Functions, EventBridge, and microservices design patterns.
- Experience managing load balancing, caching, and distributed architectures.
- AWS Certifications (Developer, Solutions Architect, DevOps Engineer).
- Strong communication, problem-solving, and collaboration skills.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Platform Engineer
Employment Type: Full time
Contact Details:
Company: CGI
Location(s): Hyderabad
Keyskills:
Python
RDS
Kubernetes
IaC
DevOps Engineer
Microservices
Docker
Lambda
Git
SQS
IAM
Spark
EC2
DevOps
Linux
Jenkins
Software Engineer
CloudFormation
S3
Cloud Security
VPC
Python Developer
Compliance
GitLab
Terraform
AWS