Hadoop Administration (Austin, TX, Charlotte, NC, Chicago, IL, Cincinnati, OH, Cupertino, CA, Foster City, CA, Houston, TX, Sunnyvale, CA, Tampa, FL)

SonSoft

Company Description

SonSoft, Inc. is a US-based corporation duly organized under the laws of the State of Georgia. SonSoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy and Information Technology Enabled Services.

Job Description

Requirements:-

  • At least 4 years of experience in implementation and administration of Hadoop infrastructure
  • At least 2 years of experience architecting, designing, implementing, and administering Hadoop infrastructure
  • At least 2 years of experience in Project life cycle activities on development and maintenance projects.
  • Should be able to provide consultancy to client and internal teams on which product or distribution is best for which situation or setup
  • Operational expertise in troubleshooting; understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking
  • Hands-on experience with Hadoop, MapReduce, HBase, Hive, Pig, and Mahout
  • Hadoop administration skills: experience working with Cloudera Manager or Ambari, and with monitoring tools such as Ganglia and Nagios
  • Experience using Hadoop schedulers: FIFO, Fair Scheduler, and Capacity Scheduler
  • Experience in job schedule management with Oozie or enterprise schedulers such as Control-M and Tivoli
  • Good knowledge of Linux (RHEL, CentOS, Ubuntu)
  • Experience setting up AD/LDAP/Kerberos authentication models
  • Experience with data encryption techniques

Responsibilities:-

  • Upgrades and Data Migrations
  • Hadoop ecosystem and cluster maintenance, including creation and removal of nodes
  • Perform administrative activities with Cloudera Manager/Ambari and tools such as Ganglia and Nagios
  • Setting up and maintaining Infrastructure and configuration for Hive, Pig and MapReduce
  • Monitor Hadoop Cluster Availability, Connectivity and Security
  • Setting up Linux users, groups, Kerberos principals and keys
  • Aligning with the Systems engineering team in maintaining hardware and software environments required for Hadoop
  • Software installation, configuration, patches and upgrades
  • Working with data delivery teams to set up Hadoop application development environments
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Screen Hadoop cluster job performance and perform capacity planning
  • Data modelling, Database backup and recovery
  • Manage and review Hadoop log files
  • File system management, disk space management and monitoring (Nagios, Splunk, etc.)
  • HDFS support and maintenance
  • Planning of Back-up, High Availability and Disaster Recovery Infrastructure
  • Diligently teaming with Infrastructure, Network, Database, Application and Business Intelligence teams to guarantee high data quality and availability
  • Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades
  • Implementation of Strategic Operating model in line with best practices
  • Point of Contact for Vendor escalations
  • Ability to work in a team within a diverse, multi-stakeholder environment
  • Analytical skills
  • Experience and desire to work in a Global delivery environment

Qualifications

Basic Qualifications:-

  • Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
  • At least 7 years of experience within the Information Technology industry.

Additional Information

Note:-
This is a full-time, permanent job opportunity.
Only US Citizens, Green Card holders, and GC-EAD, H4-EAD, and L2-EAD candidates may apply.
No OPT-EAD, H-1B, or TN candidates, please.
Please mention your visa status in your email or resume.
