Job Title: Data Engineer with PySpark
Location: Kolkata
Job Type: Hybrid (3 days per week from office)

Responsibilities
- 4 to 14 years of overall experience
- Minimum 4 years of experience building and deploying Big Data applications using PySpark
- 2+ years of experience with AWS Cloud, focused on data integration with Spark and AWS Glue/EMR
- In-depth understanding of Spark architecture and distributed systems
- Good exposure to Spark job optimization
- Expertise in handling complex, large-scale Big Data environments
- Able to design, develop, test, deploy, maintain, and improve data integration pipelines

Mandatory Skills
- 4+ years of experience in PySpark
- 2+ years of experience with AWS Glue/EMR
- Strong knowledge of SQL
- Excellent written and spoken communication skills, and strong time management

Nice-to-Have
- Any additional cloud skills
- Any ETL knowledge
Employment Category:
Employment Type: Full time
Industry: IT Services & Consulting
Role Category: Not Specified
Functional Area: Not Specified
Role/Responsibilities: Data Engineer with PySpark at Cognizant