Desired Candidate Profile
At least 5 years of experience working in Big Data Technologies
At least 5 years of experience in Data warehousing
At least 4 years of experience in core Java and its ecosystem
Strong understanding of and hands-on experience with the Big Data stack (HDFS, Sqoop, Hive, Java, etc.)
Big Data solution design and architecture
Design, sizing and implementation of Big Data platforms based on Cloudera or Hortonworks
Deep understanding of the Cloudera and/or Hortonworks stack (Spark, installation and configuration, Navigator, Oozie, Ranger, etc.)
Experience extracting data from feeds into a Data Lake using Kafka and other open-source components
Understanding of data ingestion patterns and experience building pipelines.
Experience configuring Azure or AWS components and managing data flows. Knowledge of Google Cloud Platform is a plus.
Experience working on production-grade projects with terabyte- to petabyte-scale data sets.
Education:
UG: Any Graduate
PG: Any Postgraduate - Any Specialization
Contact Details:
Key Skills:
Hadoop
ETL
Cloudera
Hive
Spark
Core Java
Talend
HDFS
Sqoop
Oozie