Choco is on a mission to enable the global food system to become sustainable by optimizing the way food is sold, ordered, distributed, and financed. Our AI-focused software connects distributors with their customers so they can operate efficiently and without waste. A problem of this magnitude requires massive scale, and only the best people will be able to solve it. Are you in?
Here’s what we’re up to: https://bit.ly/4fyXonB
No recruiters please, we have a dedicated in-house Talent team.
Senior Data Platform Engineer
We are seeking a Senior Data Platform Engineer ready to scale data infrastructure at the heart of Choco.
Over the past two years, Choco has grown from an app-based ordering product into an AI-powered, data-driven company, powering everything from sales workflows and analytics to machine learning and AI systems. Today, every product decision, ML and AI model, and customer-facing tool depends on the data platform we’ve built.
We’re now looking for an experienced, pragmatic, and hands-on Senior Data Platform Engineer to bring the next level of scale, reliability, and usability to our data systems. You’ll join a small, high-impact team that owns all of Choco’s data infrastructure: from ingestion and transformation to reverse ETL, ML feature pipelines, observability, and AI deployment tooling.
This is not a data janitor role. You’ll be designing systems, writing production code, shaping infrastructure decisions, and working closely with analytics, ML, and product teams. You’ll bring clarity to messy systems, and help us go from “data is available” to “data is a competitive advantage.”
At Choco, we move fast and solve real problems. Our platform powers three core data use cases:
You will:
We’re looking for a strong, experienced engineer with demonstrated technical leadership, deep infrastructure thinking, a delivery mindset, and the ability to navigate ambiguity. You know how to scale a data platform not just in volume, but in usability, reliability, and impact.
We strike a good balance between building solutions in-house and adopting tools. In this role, you will be expected to ship code on a daily basis.
Our Lakehouse is built on top of AWS S3, with data stored in the Delta format and Databricks SQL as our main SQL execution engine.
The tools we have adopted so far are: dbt, Airflow, Athena, Databricks SQL, PySpark, DynamoDB, Kafka, Python, Docker, MLflow, Looker, Unity Catalog
We run on AWS and use the following AWS products: Glue, SNS, SQS, Lambda, Athena, EMR, Batch, Kinesis Firehose
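To give a flavor of working with this kind of stack, here is a minimal, hypothetical PySpark sketch that reads a Delta table from S3 and queries it with Spark SQL. The bucket path, table name, and columns are illustrative placeholders, not Choco resources; on Databricks the Delta configuration shown here is already set up for you.

```python
# Hypothetical sketch: reading a Delta table from S3 with PySpark.
# All names (bucket, table, columns) are placeholders for illustration.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("orders-delta-read")
    # Enable Delta Lake support for a self-managed Spark session;
    # on Databricks these settings are preconfigured.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
    .getOrCreate()
)

# Load a Delta table directly from S3 (placeholder path).
orders = spark.read.format("delta").load("s3://example-lakehouse/orders")

# Register it as a temporary view so it can be queried with SQL.
orders.createOrReplaceTempView("orders")

daily_totals = spark.sql(
    """
    SELECT order_date, COUNT(*) AS order_count
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
    """
)
daily_totals.show()
```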
Choco was founded in 2018 in Berlin. Now, we are a dedicated team of over 200 Chocorians across Europe and the US. We seek hungry and humble individuals who embrace hard work, put our team first, and are committed to building a lasting company. Our mission demands urgency and speed while maintaining a long-term vision.
In just five years, Choco has raised $328.5 million and achieved unicorn status in 2022, with a valuation of $1.2 billion. We're supported by some of the world’s best investors like Bessemer Venture Partners, Insight Partners, Coatue Management, and LeftLane Capital.
Choco is an equal opportunity employer. We encourage people from all backgrounds to apply. We are committed to ensuring that our technology is available and accessible to everyone. All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law.