Enterprise Data Operations Assoc Manager

PepsiCo

Overview

As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. You will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users in a hybrid environment that spans in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities

  • Contribute actively to code development in projects and services.
  • Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
  • Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance.
  • Implement best practices around systems integration, security, performance, and data management.
  • Empower the business by creating value through increased adoption of data, data science, and business intelligence.
  • Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions.
  • Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners.
  • Develop and optimize procedures to "productionalize" data science models.
  • Define and manage SLAs for data products and processes running in production.
  • Support large-scale experimentation done by data scientists.
  • Prototype new approaches and build solutions at scale.
  • Research state-of-the-art methodologies.
  • Create documentation for learnings and knowledge transfer.
  • Create and audit reusable packages or libraries.

Qualifications

  • 6+ years of overall technology experience, including at least 4+ years of hands-on software development, data engineering, and systems architecture.
  • 4+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools.
  • 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, and Scala.
  • 2+ years of cloud data engineering experience in Azure.
  • Fluent with Azure cloud services; Azure certification is a plus.
  • Experience integrating multi-cloud services with on-premises technologies.
  • Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
  • Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
  • Experience building and operating highly available, distributed systems for the extraction, ingestion, and processing of large data sets.
  • Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake.
  • Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
  • Experience with version control systems like GitHub and with deployment and CI tools.
  • Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
  • Experience with statistical/ML techniques is a plus.
  • Experience building solutions in the retail or supply chain space is a plus.
  • Understanding of metadata management, data lineage, and data glossaries is a plus.
  • Working knowledge of agile development, including DevOps and DataOps concepts.
  • Familiarity with business intelligence tools (such as Power BI).
  • BA/BS in Computer Science, Math, Physics, or other technical fields.
