Area(s) of responsibility
Job Title: Data Engineer
Project Overview:
The role involves building a data platform on Azure Databricks to support transaction-processing impact zones. The data products developed will become the standardized method by which downstream systems consume data.
Experience Level:
Level 3 (Overall 7+ years, Relevant 5+ years)
Qualifications:
- Bachelor’s degree or equivalent.
Must-Have Skills (5+ years of experience required):
- Data Engineering – Data pipeline development using Azure Databricks.
- Performance optimization – improving data-processing throughput, resource utilization, and execution time.
- Workflow orchestration.
- Databricks features – Databricks SQL, Delta Lake, and Databricks Workflows for building complex data workflows.
- Data modeling.
Nice-to-Have Skills:
- Knowledge of PySpark.
- Good understanding of data warehousing.
Tasks & Responsibilities:
- Build and support data pipelines on Azure Databricks.
- Write optimized data pipelines and orchestrate workflows efficiently.
- Work within an Agile team, attend Agile ceremonies, and collaborate to deliver efficient solutions.