about the role
As part of the Big Data B2B program, OBS has set up a shared Big Data platform and a Data Lake for use-case exploration and industrialization.
We are looking for a Senior Data Engineer with 4-6 years of experience building data pipelines on-premises and in the cloud, with the following key result areas (KRAs):
- Automate and industrialize build and development tasks.
- Lead discussion sessions with stakeholders.
- Participate in all areas of the data engineering life cycle and lead the team in requirements gathering and data mapping, systems design, data ingestion development, data mapping documentation, testing and deployment, and post-implementation support and monitoring.
- Troubleshoot and resolve complex problems and issues.
- Provide innovative solutions to complex business problems.
- Report to and work closely with project teams and the Business Analysis team on project delivery status.
- Prepare progress updates and status reports.
- Provide operational support, ongoing maintenance, and enhancements after implementation as part of run management activities.
- Implement data integration best practices.
about you
- Good understanding of the Big Data ecosystem and frameworks such as Hadoop and Spark.
- Good experience handling large data volumes, both structured and unstructured, in streaming and batch modes.
- High coding proficiency in at least one modern programming language: Python, Java, or Scala.
- Hands-on experience with NiFi, Hive, SQL/HQL, Spark SQL, Spark Streaming, Oozie, and Airflow.
- Good understanding of data integration patterns.
- Good understanding of Kafka, RabbitMQ, and Airflow.
- Good understanding of API concepts (REST) and of microservices architecture.
- Experience with DevOps tooling: Jenkins, Maven, GitLab, SonarQube, Docker.
- Good understanding of DevOps concepts and container technologies such as Kubernetes and Docker.
- Good understanding of the ELK stack.
- Good understanding of monitoring tools such as Prometheus and Grafana.
- Good understanding of cloud architecture is a must.
- Professional certification with any of the hyperscalers, especially GCP, is a plus. Full understanding of GCP compute, network, and storage services.
- Proficiency in GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Bigtable.
- Good understanding of Linux and shell scripting.
- Good experience with Agile methods (Scrum, Kanban).
- Good understanding of JIRA.
- Understanding of data modelling is a plus.
- Understanding of Open Digital Architecture and TMF principles is preferable.
- Understanding of tools like DSS and Jupyter Notebook is preferable.
additional information
- Passion for software development and for its tools and methods.
- Appetite for the Agile/DevOps approach.
- Appetite for working in multidisciplinary teams.
- A teaching mindset and appetite for mutual support.
- Curiosity and appetite for challenge and innovation.
- Ability to bring forward proposals and ideas.
- Pragmatism, autonomy, organizational skills.
department
Chief Technology Info Office
Orange Business manages and integrates the complexity of international communications, freeing our customers to focus on the strategic initiatives that drive their business. Our extensive experience and knowledge in global communication solutions, together with our understanding of multinational business and local support in 166 countries and territories, ensure that our customers receive a consistent, global solution wherever they do business.
contract
Regular