Job Description
"Roles and Responsibilities:
-Experience in various BigData platforms (Kafka, GPU, ETL-Talend, ANZO)
-Exposure into Hive, Spark, Tensorflow and other components
-Hands on experience in High availability cluster setup on the above platforms
-Experience in Capacity planning and job performance tuning
-Kerberos setup and configurations in Cloud and On-Premise environments
-Good knowledge in networking concepts like dhcp, DNS, NFS, IT Security & Patch management
-Exposure on Monitoring tools like Splunk, ICINGA , Grafana etc
- Must have good experience in Linux server administration and Linux system management (Red Hat Satellite or similar)
- Good knowledge of at least one relational database, e.g. MySQL or Oracle
- Should know at least one of the following languages: Python, Perl, or Shell
- Automated deployment tools (xCAT, Ansible, Puppet, or similar)
Keywords: Kafka, GPU, ETL Talend, ANZO
Work Experience Required: 3 to 4 years
CTC Offered: 9 to 12 LPA
Job Location(s): Bangalore, Pune, Hyderabad
Qualification: BE/B.Tech/MCA or any other relevant degree
Employment Category:
Employment Type: Full time
Industry: IT - Software
Role Category: DBA / Datawarehousing
Functional Area: Not Applicable
Role/Responsibilities: Need immediate joiners - Big Data
Contact Details:
Company: Hucon Solutions India
Location(s): Hyderabad
Keyskills:
Puppet
GPU
Ansible
Kafka
ANZO
ETL Talend