Job Overview
Join the Operational Data Lake (ODL) team, which plays a crucial role in supporting Intuit's mission by providing a comprehensive view of how its applications perform, enabling data-driven decisions that enhance customer experiences and improve operational efficiency.
Our charter is to enable Intuit's engineers and leaders to:
- Access all the operational data they need, when they need it, to understand how our products are performing in the real world.
- Easily track every step of the software development lifecycle, from code commits to customer adoption, and identify areas for improvement.
- Contribute to building a cutting-edge data platform that empowers data-driven decisions across the entire company.
Responsibilities
- Architect, design, and build fault-tolerant, scalable big-data platforms that support high-velocity, high-volume data use cases.
- Collaborate cross-functionally with Product Management, Engineering, and Central Data Lake teams to ensure the platform meets current requirements and remains extensible for future functionality.
- Design and maintain scalable architectures for data normalization, lineage, governance, ontology, and discoverability, ensuring efficient integration across diverse systems and business units.
- Enable data and AI use cases: partner with domain event producers and product teams to identify, curate, and manage the datasets needed for advanced analytics, including Generative AI and Machine Learning, and for deriving actionable customer insights.
- Conduct code reviews, promote coding best practices, and establish processes for unit testing, continuous integration/continuous delivery, performance testing, capacity planning, documentation, monitoring, alerting, and incident response.
- Demonstrate leadership and foster a culture of continuous learning, mentoring junior team members in technical skills and collaborative problem solving.
Qualifications
- 8+ years of experience working with product analytics, web analytics, or similar data analytics domains
- Advanced proficiency in SQL, “big data” technologies (e.g., Redshift, Spark, Hive, BigQuery), and BI tools (e.g., Tableau, Qlik, Dash)
- Strong programming skills in Python or Java/Scala
- Experience engineering data pipelines with workflow orchestration tools and architecting end-to-end big data and analytics platforms
- Understanding of AI-native architectures and GenAI platforms; able to assess implications for data, testing, and behavior
- Strong business acumen and the ability to translate business strategy into testable hypotheses and learning agendas
- Strong data storytelling skills, with a proven ability to rapidly construct impactful reports, communicate insights, and influence leadership
- Excellent communication and interpersonal skills, with a proven ability to build trust and collaborate seamlessly across technical, business, and cross-functional teams.
- Comfortable working in a fast-paced environment, with the flexibility to shift priorities when needed
- Bachelor’s degree in Engineering, Data Science, Statistics, Mathematics, Computer Science, Economics or related quantitative field; Master’s Degree preferred