Hadoop Administrator (Linux/Unix) - Jhb - Long Term Contract To Perm
Rainmaker Acquisiton Cc
Johannesburg, Gauteng
3d ago
source : findojobs-za
  • 8+ years of overall experience in systems management and administering Linux / Unix server hardware and operating systems.
  • Requires a Bachelor's degree in computer science or a related field and 4-10 years of experience in end-to-end Data Warehouse application development and deployment.
  • Well versed in installing and managing distributions of Hadoop (CDH3, CDH4, MapR, Hortonworks, etc.).
  • 3+ years of experience in architecting Hadoop / Big Data applications and environments.
  • Minimum 2-3 years of experience in deploying and administering a multi-petabyte Hadoop cluster and managing HDFS, Hue and all the related Hadoop tools.
  • Expertise and hands-on experience with the Hadoop ecosystem: MapReduce, HDFS, HBase and Hive / Pig.
  • Expertise in Python / Perl / shell scripting and hands-on programming skills.
  • Experience in Linux / Unix technologies and shell scripting along with Perl or Python. Experience in Java is a plus.
  • Advanced knowledge in performance troubleshooting and tuning of Hadoop clusters.
  • Deep understanding of and experience with Hadoop / Big Data concepts and technologies.
  • Good knowledge of Hadoop cluster connectivity and security.
  • Sound knowledge of relational databases (SQL). Experience with large SQL based systems like Teradata is a plus.
  • Experience in troubleshooting MapReduce job failures and issues with Hive, Pig, HBase, etc.
  • Hands-on experience with large-scale Big Data environment builds, capacity planning, performance tuning and monitoring.
  • Design, configure and manage backup and disaster recovery for Hadoop data.
  • Familiar with industry best practices and how to drive efficiencies while maintaining a robust service offering.
  • Development experience in Hive, Pig and HBase is desired. Hands-on development experience with Java programming is a plus.
  • Strong IT consulting experience in handling huge data volumes and architecting big data environments.
  • Excellent knowledge of Hadoop integration points with enterprise BI and EDW tools. Familiarity with installing and configuring monitoring tools for the Hadoop environment.