Software Principal Technical Expert - 009FTG
As a Data Engineer, you’ll play a crucial role in managing the entire lifecycle of data pipelines and integrating data visualization solutions.
Data Pipeline Creation:
- Collaborate with Product Owners and business representatives to define the technical scope of new data pipelines.
- Evaluate the complexity of each initiative and the skills required to deliver it.
- Work closely with subject matter experts to determine the scope of data needed to feed the pipelines.
- Design and co-build the data ingestion architecture, accounting for the cleansing and preparation each data source requires.
- Implement the data transformation layer, leveraging analytics or AI features (a minimal pipeline sketch follows this list).
- Establish connections with external services to expose output data.
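As an illustration of the ingest-transform-expose flow described above, here is a minimal pipeline sketch in Python; the file paths, column names, and cleansing rules are hypothetical placeholders, not part of the role definition.

```python
import pandas as pd

def extract(source_path: str) -> pd.DataFrame:
    """Read raw records from the source (a CSV file is assumed here)."""
    return pd.read_csv(source_path)

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Cleanse and prepare the data, then aggregate it per device and day."""
    cleaned = raw.dropna(subset=["device_id", "timestamp"]).copy()  # hypothetical columns
    cleaned["timestamp"] = pd.to_datetime(cleaned["timestamp"], utc=True)
    return (
        cleaned
        .groupby(["device_id", cleaned["timestamp"].dt.date])
        .agg(event_count=("event_type", "count"))
        .reset_index()
    )

def load(prepared: pd.DataFrame, target_path: str) -> None:
    """Expose the output, here as a Parquet file for downstream services."""
    prepared.to_parquet(target_path, index=False)

if __name__ == "__main__":
    load(transform(extract("raw_events.csv")), "daily_events.parquet")
```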
Data Visualization:
- Utilize Power BI capabilities to share various datasets with internal teams and external customers.
- Develop dashboards that automate the monitoring of AI features and their adoption (a minimal push-dataset sketch follows this list).
- Create customer reports that showcase data transformation results, following guidance from our business representatives and Connected Services Hubs remote support agents.
- Actively contribute to integrating Power BI into the program to create a new reporting experience.
- Collaborate with UX designers to structure reporting concepts (events, evidence, recommendations).
- Identify all detailed data models.
- Build a semantic database to anticipate future needs, including Generative AI Co-Pilot integration.
- Deploy and maintain report templates.
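For dashboards that need to stay current automatically, one option is the Power BI REST API's push-dataset rows endpoint. The sketch below assumes a pre-created push dataset; the dataset ID, table name, row schema, and token acquisition (for example via MSAL or a service principal) are placeholders.

```python
import requests

# Hypothetical identifiers; the real dataset ID, table name, and Azure AD
# token would come from the target workspace and an auth flow such as MSAL.
DATASET_ID = "<dataset-id>"
TABLE_NAME = "AdoptionMetrics"
ACCESS_TOKEN = "<azure-ad-access-token>"

def push_rows(rows: list[dict]) -> None:
    """Append rows to a Power BI push dataset so a dashboard stays current."""
    url = (
        "https://api.powerbi.com/v1.0/myorg/"
        f"datasets/{DATASET_ID}/tables/{TABLE_NAME}/rows"
    )
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"rows": rows},
    )
    response.raise_for_status()

push_rows([{"feature": "anomaly_detection", "active_users": 42}])
```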
DevOps / MLOps role:
- Master model deployment and upgrades in the appropriate environment (Databricks, Dataiku, or others); a minimal MLflow sketch follows this list.
- Be knowledgeable about the correct implementation of those environments in Azure (or similar cloud environments such as AWS or GCP).
- Act as the single point of contact between the analytics team and external technical DevOps organizations (Advisor Engineering, AI Hub).
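As a minimal sketch of model deployment and registration, assuming MLflow on Databricks; the experiment path, training data, and registry name are placeholders.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Placeholder data and model; in practice both come from the data pipeline.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

mlflow.set_experiment("/Shared/ai-feature-monitoring")  # hypothetical experiment path

with mlflow.start_run() as run:
    mlflow.sklearn.log_model(model, artifact_path="model")

# Registering the model makes it versioned, so upgrades can be promoted
# through environments (dev/staging/prod) in a controlled way.
mlflow.register_model(
    model_uri=f"runs:/{run.info.run_id}/model",
    name="connected-services-classifier",  # hypothetical registry name
)
```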
Qualifications
- Experience: 10-12+ years
- Python Programming (5/5): Strong programming skills are essential for data manipulation, transformation, and pipeline development.
- ETL (Extract, Transform, Load) (5/5): Significant hands-on experience is required.
- Data Preparation and Modeling (5/5): Understanding various kinds of data structures will be valuable in this role.
- SQL / NoSQL database management (4.5/5): Proficiency with database management systems.
- MLOps (4/5): Understanding of ML concepts for integrating ML models into pipelines.
- DevOps (Azure) (4/5): Familiarity with Azure cloud services for scalable data storage and processing.
- Databricks (3/5): Proficiency with the Databricks environment will be valuable.
- Dataiku (1.5/5): Experience with the Dataiku platform is a plus.
- Power BI usage (5/5): Ability to create insightful visualizations in Power BI is mandatory.
Primary Location: IN-Karnataka-Bangalore
Schedule: Full-time
Unposting Date: Ongoing