As part of the ongoing transformation at Group level, technology adoption and implementation, as well as process transitions, are key success factors. As a member of DDPO, the Data Engineer will build, deliver and maintain products & data products (data pipelines, services, APIs…). They will work in close collaboration with data product managers to develop new features linked to these products, especially features linked to:

  • Data ingestion, transformation, exposition
  • Data servicing (through API) or restitution (through apps / dashboards)
  • Parallel calculation on big volumes of data

The Data Engineer is responsible for:

  • Building, delivering, maintaining and documenting data products & features (data pipelines, data services, APIs…) following state-of-the-art patterns (medallion architecture, gitflow)
  • Acting as an SME and representative of the Data Chapter

The Data Engineer needs to:

  • Be problem-solving oriented with strong analytical thinking
  • Be autonomous and rigorous in the way they approach technical challenges
  • Advise on the architecture of end-to-end data flows
  • Collaborate with various stakeholders (product owners, solution owners, data solution analysts, developers, technical leads, architects) to deliver data artefacts in a team spirit
  • Commit and bring their skills to contribute to the success of the Group

Under the responsibility of the Head of Data Delivery & Engineering, your mission will be to:

  1. Build, deliver, maintain and document data artefacts or features linked to the products you are assigned to, under the prioritization of the Data Product Manager you work with:
  • Develop data pipelines leveraging ELT / ETL techniques to ingest, transform and expose data for well-defined purposes, following state-of-the-art approaches (medallion architecture for data, gitflow for development, unit tests where applicable…)
  • Tackle key technical questions linked to data, such as parallelization, calculation on big volumes of data, query optimization, etc.
  • Develop, when relevant, tailor-made services / APIs to expose the data for various purposes (BI, APIs, services, data science…)
  • Enrich the SCOR ontology and document data artefacts (code documentation for data pipelines / services / APIs, contribution to data definitions, processes, etc.)
  • Reuse components or assets (code, frameworks, data objects) when relevant, to leverage the DDPO ecosystem as much as possible
  2. Act as an SME and representative of the Data Chapter:
  • Contribute to the design of solutions (end-to-end data flows) in close collaboration with Architects, Data Modelers, Product Owners & Managers, and advise external stakeholders on best practices
  • Contribute to the overall data community by sharing good practices, lessons learned, expertise on relevant technologies, etc.
  • Perform peer reviews with other data engineers to ensure consistency and quality of development
  • Stay aware of the DDPO ecosystem so as to seek advice from the appropriate experts (Data Modelers, ML Engineers, Architects, etc.) when relevant
  3. Carry out additional activities related to your day-to-day mission:
  • Maintain a technology watch on data platform solutions, especially those related to data engineering topics
  • Participate in Scrum rituals (daily stand-ups, sprint planning, sprint reviews, retrospectives, etc.)
  • Contribute to ICS
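The medallion-style flow referenced in the mission above (ingest raw data, clean it, then aggregate it for business use) can be sketched as follows. This is purely an illustrative sketch in plain Python — the dataset, field names (`policy_id`, `premium`) and transformations are hypothetical, not SCOR's actual data; in practice such a pipeline would be built with PySpark on a platform like Databricks or Palantir Foundry, as described below.

```python
# Illustrative medallion-style flow: bronze (raw) -> silver (clean) -> gold (business).
# All data and field names here are hypothetical.

# Bronze layer: raw ingested records, kept as-is (including a duplicate)
bronze = [
    {"policy_id": "P1", "premium": "1000", "currency": "EUR"},
    {"policy_id": "P2", "premium": "250", "currency": "EUR"},
    {"policy_id": "P2", "premium": "250", "currency": "EUR"},  # duplicate row
]

def to_silver(rows):
    """Silver layer: typed, de-duplicated records."""
    seen, out = set(), []
    for r in rows:
        key = r["policy_id"]
        if key in seen:
            continue  # drop duplicates on the business key
        seen.add(key)
        out.append({
            "policy_id": key,
            "premium": float(r["premium"]),  # cast string -> numeric
            "currency": r["currency"],
        })
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (total premium per currency)."""
    totals = {}
    for r in rows:
        totals[r["currency"]] = totals.get(r["currency"], 0.0) + r["premium"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EUR': 1250.0}
```

The point of the layering is that each stage is a reusable, testable component: raw data stays replayable in bronze, cleaning rules live in one place, and business aggregates are derived rather than hand-maintained.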

Required experience & competencies

  • 5+ years of experience as a data engineer, with a data-oriented mindset
  • Proven experience in development and maintenance of data pipelines
  • Good development practices (gitflow, unit tests, documentation…)
  • Proven experience in agile projects (Scrum and/or Kanban)
  • Knowledge of the (re)insurance industry and / or financial services is a plus
  • Awareness of data management and data privacy topics

Technical Skills:

  • Strong proficiency in Python and PySpark; ability to develop data pipelines on various platforms (Databricks, Palantir Foundry, …)
  • Strong proficiency in SQL (ANSI SQL-92, execution plan analysis)
  • Good knowledge of parallelization and distributed programming techniques
  • Good knowledge of data lake environments and concepts (Delta Lake, medallion architecture, blob storage vs. file shares…)
  • Good knowledge of dimensional data modeling (Kimball, Inmon, Data Vault concepts…) and associated good practices (slowly changing dimensions, point-in-time tables, chasm / fan trap management, change data capture…)
  • Knowledge of CI / CD pipelines (Azure DevOps, Jenkins, Artifactory, Azure Container Registry…)
  • Knowledge of REST API development (Flask, FastAPI, Django…)
  • Knowledge of containers (Docker Compose, Kubernetes) is a plus
  • Knowledge of reporting tools is a plus (Tableau, Power BI…)
  • Knowledge of streaming technologies is a plus (Kafka)
  • Knowledge of TypeScript is a plus
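Among the modeling practices listed above, slowly changing dimensions are worth a concrete illustration. The sketch below shows Type 2 versioning in plain Python — a minimal, hypothetical example (the `key`/`attrs`/`valid_from`/`valid_to` row shape is invented for illustration); a real implementation would typically rely on SQL `MERGE` or Delta Lake rather than in-memory lists.

```python
from datetime import date

# Minimal SCD Type 2 sketch: history is preserved by closing the current
# version of a row and appending a new one when a tracked attribute changes.

def scd2_upsert(dimension, key, new_attrs, as_of):
    """Apply a change to a Type 2 dimension (a list of dict rows)."""
    current = next(
        (r for r in dimension if r["key"] == key and r["valid_to"] is None),
        None,
    )
    if current is not None:
        if current["attrs"] == new_attrs:
            return dimension  # no change: keep the current version open
        current["valid_to"] = as_of  # close the current version
    dimension.append(
        {"key": key, "attrs": new_attrs, "valid_from": as_of, "valid_to": None}
    )
    return dimension

dim = []
scd2_upsert(dim, "C1", {"country": "FR"}, date(2024, 1, 1))
scd2_upsert(dim, "C1", {"country": "DE"}, date(2024, 6, 1))
# dim now holds two versions of C1: FR (closed on 2024-06-01) and DE (current)
```

The design choice here is the `valid_to is None` convention for the open version, which is what makes point-in-time queries ("what was C1's country on 2024-03-01?") answerable from the same table.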

Behavioral & Management Skills:

  • Strong analytical thinking, solution-oriented, proactive in making proposals
  • Ability to navigate a matrixed, international environment
  • Autonomy, rigor, commitment
  • Curiosity and appetite for challenge
  • Team player

Required Education 

  • Bachelor's degree in computer science, software or computer engineering, applied math, physics, statistics, or a related field or equivalent experience

SCOR, the 4th largest reinsurer in the world, provides insurance companies with a diversified and innovative range of solutions and services to control and manage risk. Leveraging experience and expertise to deliver “The Art & Science of Risk”, SCOR provides cutting-edge financial solutions, analytics tools and services in all areas related to risk – from Life & Health and Property & Casualty insurance to Investments. Our specialized teams operate in over 160 countries, fostering long-term relationships with clients. 

In order to provide our clients with a broad range of innovative reinsurance solutions, SCOR pursues an underwriting policy that is founded on profitability and supported by effective risk management strategy and a prudent investment policy. This approach allows us to offer clients an optimum level of security, to create value for shareholders, and to contribute to the welfare and resilience of society by helping to protect insureds against the risks they face.

At SCOR, we believe that employing people from different backgrounds and ensuring inclusivity is a major driving force for the success of the Group. We are committed to fostering a work environment in which all employees are treated fairly and respectfully, have equal access to opportunities and resources, and can contribute fully to SCOR’s success. 
