Dear Candidate,
Greetings of the day!
Location- Bangalore, Hyderabad, Pune and Chennai
Experience- 3.5 years to 13 years
Job Description
Key skills- Spark, PySpark, or Scala (any Big Data skill is fine; all listed skills are good to have)
Desired Competencies (Technical/Behavioral Competency)
Must-Have
1. Minimum 3-12 years of experience in building and deploying Big Data applications using Spark SQL and Spark Streaming in Python;
2. Minimum 2 years of extensive experience in designing, building, and deploying Python-based applications;
3. Design and develop ETL integration patterns using Python on Spark; develop a framework for converting existing PowerCenter mappings to PySpark (Python and Spark) jobs.
Expertise in graph algorithms and advanced recursion techniques.
Hands-on experience in generating and parsing XML and JSON documents, and in handling REST API requests/responses.
Good-to-Have
Keyskills: Big Data, PySpark, Hive, Sqoop, Scala, Hadoop, Spark, HDFS