Job Description
Description:
Mandatory skills:
Strong Linux skills.
Good knowledge of AWS and GCP cloud.
Good knowledge of Terraform, Java and shell scripting.
Good understanding of Kafka, Zookeeper, Hadoop, HBase, Spark and Hive.
Good understanding of Elasticsearch.
Knowledge of Aerospike would be an added advantage.
Should have worked on setting up and managing Big Data platforms on AWS or GCP cloud.
Good-to-have skills:
Knowledge of Druid, Airflow and Tableau.
Requirements:
AWS / Google Cloud.
Terraform.
Shell scripting.
Hadoop.
Apache Spark.
Hive.
Preferences:
DevOps.
Big Data.
GCP.
Job Classification
Industry: Courier / Logistics
Functional Area / Department: Engineering - Software & QA
Role Category: DevOps
Role: DevOps Engineer
Employment Type: Full time
Contact Details:
Company: Hitachi
Location(s): Bengaluru
Keyskills:
gcp
devops
kafka
shell scripting
hbase