Nokia is a global leader in the technologies that connect people and things. With state-of-the-art software, hardware and services for any type of network, Nokia is uniquely positioned to help communication service providers, governments, and large enterprises deliver on the promise of 5G, the Cloud and the Internet of Things.
Serving customers in over 100 countries, our research scientists and engineers continue to invent and accelerate new technologies that will increasingly transform the way people and things communicate and connect.
Nokia is undertaking a large-scale service analytics and big data platform initiative called AVA, serving thousands to tens of thousands of internal and external users. This platform will enable Nokia and its customers to build the next generation of services for telecommunications and beyond, including transport, connected health and the broader IoT. You will be joining a dynamic and growing team of talented individuals, setting up state-of-the-art solutions for an outstanding customer experience.
The Data Engineer is responsible for developing analytics big data transformation flows on a large-scale service analytics platform for customers across the globe. Responsibilities include delivering high-quality, use-case data pipelines ready for deployment to customers, meeting the business requirements and aligning with the solution vision and strategy.
As the primary interface to the Solution Owner, the Data Engineer will report to the Head of Data Engineering. The Data Engineer will have a strong influence on the technical design and is expected to have a deep understanding of the business and solution development context.
Data Engineer Responsibilities and Duties
- Understand the customer’s business context, objectives and requirements.
- Participate in requirement analysis, design, development, unit testing, system integration testing and other facets of testing, including but not limited to performance testing:
- Convert functional specifications from business requirements into programming instructions for technical development of analytics data pipelines.
- Break down major requirements into small, incremental value-add features and prioritize them with the Solution Owner.
- Prototype creative analytics data pipeline mock-ups, and collaborate with others in crafting and implementing your technical vision.
- Develop industrialized solutions leveraging Agile and DevOps methodologies.
- Review, analyze, and modify programming systems, including encoding, testing, debugging and installing for a large-scale environment.
- Monitor operating efficiency and optimize solution execution performance.
- Support the Solution Owner through the delivery process to customers:
- Execute the analytics solution development plan, resolving or escalating problems in a timely manner.
- Foster the Analytics Competence Center in developing enablement and end-to-end solution consultancy across use cases.
- Work harmoniously in a large cross-functional team including managers, supervisors, business analysts, systems personnel, network staff, and other developers.
Qualifications and Skills
- Bachelor’s Degree in Computer Science, Software Development, or a business-related field.
- 3 to 5 years’ experience in a comparable role.
- Fully conversant with analytics and new technologies; strong knowledge of industry trends, including products and services in Nokia's core business.
- Strong experience building fast, robust, effective and efficient analytics data pipelines with modern cloud development technologies, covering data collection, transformation, storage and exposure:
- Advanced development skills and experience in Scala (or particularly deep expertise in one of the fields below) is a must.
- Big data distributed environment development and optimization experience with Spark.
- NoSQL technology experience with Apache Cassandra or Parquet files hosted on S3, designing data models that handle high data velocity and/or volume while ensuring integrity.
- Streamlined data collection and transformation flow experience with Kafka considered a plus.
- Experience consuming enterprise web services (REST, JSON/XML) and relational databases (MySQL/PostgreSQL), as well as exposing your own services to help design the next generation of back-end APIs and functionality.
- Experience in containerization (e.g., Docker) of developed solutions.
- Experience with Apache NiFi considered a plus.
- Understanding of Java development tools such as IntelliJ IDEA; Zeppelin notebook experience considered a plus.
- Experience with modern software development methodologies such as Agile and Scrum, including Test-Driven Development and other testing methodologies; use of Git to manage source code and Artifactory to publish artifacts.
- Sound methodical skills with attention to detail and process requirements.
- Capability to multi-task and prioritize to ensure timely deliveries.
- Comfortable working with multiple stakeholders in the multicultural environment of a global matrix organization, with sensitivity and a partnering mindset.
- High energy, initiative, enthusiasm and persistence. Fluency in English is mandatory.