Employment Type

Permanent

Closing Date

29 June 2024 11:59pm

Job Title

Data Platform Engineering Senior Specialist - Data Engineer - Azure App Services Development

Job Summary

As a Data Platform Engineering Senior Specialist, you bridge the hardware-software boundary. You create, optimise and maintain data stores and platforms so that our users can access high-quality, reliable data assets and solutions. You play a key role in transforming epics into new capabilities and product features for our customers.

Job Description

Telstra is Australia's leading telecommunications and technology company with a rich heritage that's been built over 100 years. From our humble beginnings in the Postmaster General's Office to the global business we are today, our people have been at the forefront of technology innovation. More recently, we have the largest Internet of Things network in Australia and are leading the way in 5G. And this is just the beginning of what we're hoping to achieve together.

We offer a full range of services and compete in all telecommunications markets throughout Australia, and we are the most well-known brand in the technology and communications industry.

We have operations in more than 20 countries, including India. In India we are a licensed Telecom Service Provider (TSP) and have extended our global networks into the country with offices in Bangalore, Mumbai and Delhi. We've opened an Innovation and Capability Centre (ICC) in Bangalore, and have a presence in Pune and Hyderabad. In India, we've set out to build a platform for innovative delivery and engagement that will strengthen our position as an industry leader. We're combining innovation, automation and technology to solve the world's biggest technological challenges in areas such as Internet of Things (IoT), 5G, Artificial Intelligence (AI), Machine Learning, and more.

We're growing fast, and for you that means many exciting opportunities to develop your career at Telstra. Join us on this exciting journey, and together, we'll reimagine the future.

Being part of Data Engineering means you'll be part of a team that focuses on extending our network superiority to enable the continued execution of our digital strategy. With us, you'll work with world-leading technology and change the way we do IT to ensure business needs drive priorities, accelerating our digitisation programme.

We are seeking a highly skilled Data Engineer with expertise in Python, SQL and REST APIs, and exposure to Azure App Services. The successful candidate will be responsible for designing, developing and maintaining data pipelines using Spark, Python, Scala and related technologies, and for ensuring data quality, data security and optimal performance of those pipelines. The role will focus primarily on developing reusable data processing and storage frameworks that can be used across the data platform, as sketched below.
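
For context only, the kind of reusable, Spark-based processing step described above might look something like the minimal PySpark sketch below. All names, paths and columns here are illustrative assumptions rather than details of Telstra's actual platform:

    from pyspark.sql import SparkSession, DataFrame
    from pyspark.sql import functions as F

    def build_spark(app_name: str = "reusable-ingest") -> SparkSession:
        # Plain Spark session; a real platform would add cloud storage and Delta configs here.
        return SparkSession.builder.appName(app_name).getOrCreate()

    def clean_events(df: DataFrame) -> DataFrame:
        # A reusable transform: drop exact duplicates and rows missing the key column,
        # then stamp the processing time so downstream jobs can audit freshness.
        return (
            df.dropDuplicates()
              .filter(F.col("event_id").isNotNull())   # "event_id" is a hypothetical key column
              .withColumn("processed_at", F.current_timestamp())
        )

    if __name__ == "__main__":
        spark = build_spark()
        raw = spark.read.parquet("/data/raw/events")   # hypothetical input path
        clean_events(raw).write.mode("overwrite").parquet("/data/curated/events")   # hypothetical output path

A step like this is what "reusable" means in practice: the transform is a plain function that any pipeline on the platform can import and test in isolation.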

Job Location – Bangalore or Hyderabad.

Key Responsibilities

The Senior Data Engineer role coordinates and executes all activities related to requirements interpretation, design and implementation of Data Analytics applications. This individual will apply proven industry and technology experience, as well as communication skills, problem-solving skills and knowledge of best practices, to issues related to the design, development and deployment of mission-critical systems, with a focus on quality application development and delivery.

This role is key to the success of the Data Engineering capability at Telstra and will be responsible and accountable for the following:

  • Lead the design, development, and maintenance of data pipelines using Spark, Python, Scala, and related technologies.
  • Work with high-volume data and ensure data quality and accuracy (a simple illustrative check appears after this list)
  • Implement data security and privacy best practices to protect sensitive data
  • Collaborate with data scientists and business stakeholders to understand data needs and requirements
  • Develop and maintain documentation on data pipeline architecture, data models, and data workflows
  • Mentor and provide technical guidance to junior team members
  • Monitor and troubleshoot data pipelines to ensure they are performing optimally
  • Stay up-to-date with the latest developments in Azure, AWS, Spark, Python, Scala, and related technologies and apply them to solve business problems
  • Optimize data pipelines for cost and performance
  • Automate data processing tasks and workflows to reduce manual intervention.
  • Ability to work in Agile Feature teams
  • Provide training and educate other team members on core capabilities, helping them deliver high-quality solutions and deliverables/documentation
  • Self-motivated to design and develop against user requirements, then test and deploy changes into production
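
As a purely illustrative example of the data-quality responsibility above, a simple guard on a Spark DataFrame might look like the following sketch; the column name and threshold are assumptions, not prescribed checks:

    from pyspark.sql import DataFrame
    from pyspark.sql import functions as F

    def check_quality(df: DataFrame, key_column: str, min_rows: int = 1) -> None:
        # Fail fast if the batch is empty or the business key contains nulls,
        # so bad data never reaches downstream consumers.
        total = df.count()
        if total < min_rows:
            raise ValueError(f"Expected at least {min_rows} rows, got {total}")
        null_keys = df.filter(F.col(key_column).isNull()).count()
        if null_keys:
            raise ValueError(f"{null_keys} rows have a null {key_column}")

In practice a check like this would run as a pipeline step and feed alerts into monitoring, in line with the monitoring and troubleshooting responsibilities above.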

Technical Skills

  • Hands-on experience with Python, Spark, SQL, REST APIs and Azure Cloud.
  • Exposure to OpenAI, Azure Application Insights, telemetry, LangChain, Azure Cognitive Search/Bing Search, generative AI and chatbots.
  • Experience working with file formats (Parquet/ORC/Avro/Delta/Hudi, etc.)
  • Experience with high-volume data processing and data streaming technologies
  • Experience using orchestration tools like Control-M, Azure Data Factory, Airflow or Luigi to schedule jobs (see the sketch after this list).
  • Demonstrated experience leading data engineering projects and mentoring junior team members.
  • Data engineer with expertise working on Azure Cloud using Databricks, Kinesis/Azure Event Hubs, Flume/Kafka/Spark Streaming and Azure Data Factory.
  • Strong experience in data modeling, schema design, and ETL development using SQL and related technologies
  • Familiarity with data security and privacy best practices
  • Good exposure to test-driven development (TDD)
  • Exposure to CI tools like Git, Bitbucket, GitHub, GitLab, Azure DevOps
  • Exposure to CD tools like Jenkins, Bamboo, Azure DevOps
  • Exposure to observability tools like Azure Monitor, Grafana, etc.
  • Prior experience building, or working in a team that builds, reusable frameworks
  • Good understanding of data architecture and design principles (Delta/Kappa/Lambda architectures).
  • Exposure to code quality practices: static and dynamic code scans
  • Experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS, and/or knowledge of NoSQL platforms.
  • Able to provide scalable and robust solution architectures depending on business needs.
  • Propose best practices/standards
  • Programming & databases: Java/Python/Scala/SQL procedures; multi-tenanted databases/Spark
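
To make the orchestration item above concrete, a minimal Apache Airflow DAG that schedules a daily pipeline run might look like the sketch below. The DAG id, schedule and task callables are illustrative assumptions; Control-M or Azure Data Factory equivalents would be configured in those tools rather than written as code:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_ingest():
        # Placeholder for the real work, e.g. triggering a Spark or Databricks job.
        print("ingest step")

    def run_quality_checks():
        print("quality checks")

    with DAG(
        dag_id="daily_events_pipeline",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest", python_callable=run_ingest)
        checks = PythonOperator(task_id="quality_checks", python_callable=run_quality_checks)
        ingest >> checks   # run quality checks only after ingest succeeds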

Organisational Skills

Proven high level of initiative, drive and enthusiasm – essential. 

Time management and an ability to adhere to set schedules. 

Communication Skills

Demonstrated high level of written and oral communication skills 

Excellence in customer service and consulting skills. 

Strong verbal communication skills. 

Strong listening & questioning skills. 

Conflict Handling 

Ability to work under pressure – essential. 

Ability to manage differences in opinions or solution approaches between business and technology stakeholders. 

Ability to identify and rationalise issues and provide leadership in progressing to a beneficial outcome. 

Problem Solving 

High level of problem-solving and interpersonal skills.

Ability to view problems and potential issues from all perspectives (think out of the box). 

Qualifications/Experience

Degree-level IT qualifications in Software or Systems Engineering.

A minimum of 8 years' IT experience building and supporting data engineering pipelines on Big Data/Cloud data platforms.

Values 

We are changemakers, We are better together, We care, We make it simple 
