• Hadoop Administrator (Contract)

Industry: IT
Experience Range: 3–5 Years

Job Description
About Us
We are a top-8 global IT services company with operations in 50+ countries. We offer an advanced portfolio of application, business process, cloud, and infrastructure services to businesses and governments worldwide. Our roots cross continents and cultures, dating back five decades. We've grown organically and decisively by acquiring some of the best IT services providers across the globe. This pedigree yields a characteristic special to NTT DATA: the opportunity of a global brand with the creative energy of a start-up.

As a Global IT Innovator, innovation is at the heart of what we do: innovation that makes an impact and improves business performance, and innovation that improves our clients' bottom lines. We're looking for innovators to join our team.

NTT DATA Singapore PTE Ltd is a wholly owned subsidiary of NTT DATA Corp, part of NTT Group, the world's 65th largest company according to Fortune magazine. The Singapore entity is an S10 government-registered supplier, forms the core of NTT DATA's APAC operations, and is positioned as a gateway to our global capabilities in Singapore and the APAC region.
Roles and Responsibilities

  • Oversees implementation and ongoing administration of Hadoop infrastructure and systems
  • Manages Big Data components/frameworks such as Hadoop, Spark, Hadoop Distributed File System (HDFS), Kafka, Elasticsearch, etc.
  • Analyzes latest Big Data analytic technologies and innovative applications in both business intelligence analysis and new offerings
  • Aligns with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
  • Performs POCs of new capabilities on the Hadoop platform
  • Handles performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Screens Hadoop cluster job performances and capacity planning
  • Monitors Hadoop cluster connectivity and security
  • Manages and reviews Hadoop log files
  • Handles HDFS and file system management, maintenance, and monitoring
  • Partners with infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
  • Collaborates with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
  • Assists Data Engineering and Data Science teams with troubleshooting of performance issues and general job failures
  • Monitors and manages server services: NameNode, DataNodes, JournalNodes, ResourceManager, NodeManagers, Spark UIs for job status, etc.
  • Assists with the creation of CI/CD pipelines in the Hadoop environment
  • Develops Linux shell and Python scripts to enhance platform automation


Recruiter Name Elaine Lee