We are currently looking for an experienced Hadoop Administrator, preferably on the Hortonworks distribution platform, to become a part of the Database Operations team. This team provides database operations support for a distributed platform consisting of all major RDBMS technologies and a growing Big Data analytics platform. The candidate should be a self-learner, a good team player, and have a passion for Big Data technologies.
* Administer and maintain Hortonworks Hadoop clusters across all environments – Production, UAT, Development and DR.
* Install, upgrade, and configure the Hortonworks Hadoop platform.
* Proactively monitor cluster health and perform performance tuning activities.
* Perform capacity planning and expansion activities, working across Infrastructure and other Enterprise Services teams.
* Perform cluster maintenance, including patching, backup/recovery, user provisioning, automation of routine tasks, troubleshooting of failed jobs, and configuration and maintenance of security policies.
* Typically has 5–7 years of related IT experience.
* Bachelor’s Degree in Information Technology, Engineering, Computer Science, or a related field, or equivalent work experience.
* 3+ years of hands-on experience with enterprise-level Hadoop administration (Hortonworks preferred).
* Strong understanding of one or more major RDBMS technologies, such as Oracle, DB2, or MS SQL Server.
* Understanding of High Availability, Disaster Recovery, and storage technologies such as data replication and SRDF/non-SRDF.
* Strong Linux/Unix and networking experience.
* In-depth understanding of the Hadoop ecosystem, including HDFS, YARN, MapReduce, Hive, Pig, Spark, Sqoop, Solr, Kafka, Oozie, Knox, etc.
* Experience installing and configuring Hadoop clusters from scratch.
* Experience in upgrading Hadoop clusters using rolling upgrade/express upgrade methods.
* Experience in troubleshooting and analyzing Hadoop cluster services/component failures and job failures.
* Experience in setup, configuration and management of security for Hadoop clusters using Kerberos and integration with LDAP/AD at an Enterprise level.
* Experience with setting up Ranger policies for HDFS and Hive.
* Experience implementing and configuring backup and recovery procedures on Hadoop.
* Experience implementing and configuring High Availability and Disaster Recovery procedures.
* Experience managing Hadoop clusters with Ambari and developing custom tools/scripts to monitor cluster health.
* Experience in configuration and tuning of various components like HDFS, YARN, Hive, Spark.
* Experience documenting runbooks and other operational procedures.
* Any experience with NoSQL database platforms such as MongoDB or MarkLogic is a big plus.
Today, Freddie Mac makes home possible for one in four home borrowers and is one of the largest sources of financing for multifamily housing. Join our smart, creative and dedicated team and you’ll do important work for the housing finance system and make a difference in the lives of others. Freddie Mac is an equal opportunity and top diversity employer. EOE, M/F/D/V.