Must Have: Hadoop, Linux, Deployment, Big Data, HDFS
Responsibilities:
- Installing Hadoop in a Linux environment.
- Experience with Databricks and on-premises applications is mandatory.
- Deploying and maintaining a Hadoop cluster.
- Performing health checks on the Hadoop cluster and monitoring that it is up and running at all times.
- Analysing storage data volumes and allocating space in HDFS.
- Managing resources in the cluster environment, including creating new nodes and removing unused ones.
- Configuring the NameNode to ensure high availability.
- Implementing and administering Hadoop infrastructure on an ongoing basis.
- Deploying required hardware and software in the Hadoop environment, as well as expanding existing environments.
- Installing and configuring software.
- Performing database backup and recovery operations.
- Checking database connectivity and security measures.
- Monitoring performance and fine-tuning on a regular basis.
- Managing and optimizing disk space for handling data.
- Installing patches and upgrading software as and when needed.
- Automating manual tasks to improve efficiency.
- Loading large volumes of data into the cluster.
- Creating Linux users for Hadoop and its ecosystem components; setting up Kerberos principals is also part of Hadoop administration.
- Performance tuning and running jobs in Hadoop clusters.
- Capacity planning
- Monitoring connectivity and security of Hadoop cluster
- Managing and reviewing log files in Hadoop.
- Managing and monitoring the HDFS file system.
- Maintaining HDFS and providing necessary support.
- Backup and recovery tasks.
- Communicating with other development, administration, and business teams, including infrastructure, application, network, database, and business intelligence teams. Effective communication plays a key role in ensuring high data quality and availability.
- Coordinating with application teams and installing operating system and Hadoop-related updates as and when required.
- Acting as the key contact for vendor escalations.
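The health-check, storage-analysis, and HDFS-monitoring duties above can be sketched as a short routine of standard HDFS CLI commands. This is a minimal illustration, not a prescribed runbook: the path `/data` is a placeholder, and the commands assume an `hdfs` client already configured against a live cluster.

```shell
#!/bin/sh
# Daily HDFS health-check sketch (assumes a configured hdfs client on PATH).

# Cluster capacity, per-DataNode usage, and dead/decommissioning nodes.
hdfs dfsadmin -report

# Confirm the NameNode is not stuck in safe mode.
hdfs dfsadmin -safemode get

# Space used under a top-level directory (placeholder path /data).
hdfs dfs -du -h /data

# File-system integrity: missing, corrupt, or under-replicated blocks.
hdfs fsck / -files -blocks -locations
```

In practice a routine like this would be scheduled (e.g. via cron) with its output reviewed alongside the cluster's log files.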
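Node removal and Kerberos principal setup, both listed among the duties above, typically look something like the following sketch. The hostname, realm, exclude-file path, and keytab path are all placeholders, and the Kerberos steps assume an MIT KDC administered locally via `kadmin.local`.

```shell
#!/bin/sh
# Decommissioning a DataNode (sketch; hostnames and paths are placeholders).
# 1. Add the node to the exclude file referenced by dfs.hosts.exclude.
echo "node05.example.com" >> /etc/hadoop/conf/dfs.exclude
# 2. Tell the NameNode to re-read its include/exclude lists.
hdfs dfsadmin -refreshNodes

# Creating a Kerberos service principal for a new node (MIT KDC assumed).
kadmin.local -q "addprinc -randkey dn/node05.example.com@EXAMPLE.COM"
kadmin.local -q "ktadd -k /etc/security/keytabs/dn.service.keytab dn/node05.example.com@EXAMPLE.COM"
```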
Certificate: Spark Developer Certification (HDPCD) and HDP Certified Administrator (HDPCA) certifications are mandatory for this Hadoop Administrator role.
Keyskills: Big Data Administration, Cloud Storage, Software Installation, Linux, Hadoop Cluster Management, Big Data, HDFS, Azure Databricks, Data Storage
J.P. Morgan Chase & Co. is an American multinational investment bank and financial services company headquartered in New York City. JPMorgan Chase is the largest bank in the United States and is ranked by S&P Global as the sixth largest bank in the world by total assets as of 2018.