Big Data DevOps Lead

Zeta Global

Remote

Summary:

As a Senior or Lead Big Data DevOps Engineer, you will work with a team responsible for setting up, scaling, and maintaining Big Data infrastructure and tools in private and public cloud environments.

Main Responsibilities:

  • Driving improvement of the efficiency of Big Data infrastructure.
  • Coordinating cross-team infrastructure and Big Data initiatives.
  • Leading Big Data-related architecture and design efforts.
  • Ensuring availability, efficiency, and reliability of the Big Data infrastructure.
  • Building and supporting tools for operational tasks.
  • Evaluating, designing, and deploying monitoring tools.
  • Designing and implementing DR/BC (disaster recovery/business continuity) practices and procedures.
  • Providing on-call support for production systems.

Requirements:

  • 7+ years of experience working with Hadoop, preferably open-source distributions.
  • 3+ years of experience leading a Big Data, DevOps, SRE, DBA, or development team.
  • Experience setting up and running Hadoop clusters of 1000+ nodes.
  • Solid knowledge of NoSQL databases, preferably Cassandra or ScyllaDB.
  • Experience running and troubleshooting Kafka.
  • Working knowledge of at least one of: Terraform, Ansible, SaltStack, Puppet.
  • Proficiency in shell scripting.

Nice to have:

  • Experience with Prometheus.
  • Experience managing Snowflake.
  • Solid knowledge of Graphite and Grafana.
  • Python or Perl scripting skills.
  • Experience with installing and managing Aerospike.
  • DBA experience with one of: PostgreSQL, MySQL, MariaDB.
Posted 30+ days ago.
