Who We Are
Axpo is Switzerland’s largest producer of renewable energy and an innovation leader in international energy trading. To support our digital and data-driven ambitions, we are investing in robust, secure, and scalable infrastructure. You will be part of the team that ensures our platforms can operate reliably, securely, and seamlessly across a complex ecosystem of cloud services and enterprise systems.
About the Role
As a Databricks Platform Engineer within the Infrastructure Business Area, you will focus on the secure, scalable, and reliable operation of Axpo’s Databricks platform in the Azure cloud. You’ll take a lead role in automating infrastructure, integrating platform capabilities into broader enterprise services, and ensuring robust networking, storage, and governance alignment. You’ll collaborate with platform, security, and data teams to provide a high-performing and compliant analytics environment.
Your Responsibilities
- Administer and operate Databricks workspaces, jobs, clusters, and Unity Catalog across multiple environments
- Automate infrastructure provisioning and platform operations using Terraform, CI/CD pipelines, and Databricks APIs (see the REST API sketch after this list)
- Ensure robust network integration, including secure VNet peering, firewall rules, and private endpoint configurations to protect sensitive data flows
- Support integration of Azure Blob Storage, ensuring efficient, secure data access for both ingestion and analytics use cases
- Contribute to the configuration and troubleshooting of external database connections (e.g., via JDBC; see the sketch after this list) and other downstream system integrations
- Ensure proper access control and identity management using SCIM, Azure Active Directory (AAD), and role-based permissions across workspaces and Unity Catalog
- Support platform observability by implementing and managing telemetry pipelines (Datadog, Grafana, Dagster)
- Collaborate with security teams to ensure compliance with policies and frameworks (e.g., data encryption, audit logging, GDPR)
- Develop foundational knowledge of API gateway capabilities (e.g., Azure API Management) and support interfacing Databricks workloads with internal and external services
- Contribute to platform documentation, reusable modules, and support runbooks
- Act as a technical reference for platform-related challenges and incidents
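To give candidates a concrete feel for the automation work described above, here is a minimal sketch of driving the Databricks REST API from Python. The environment variable names and workspace URL are conventions assumed for this example, not part of Axpo's actual setup.

```python
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<workspace-id>.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token or AAD token

# Jobs API 2.1: list the jobs defined in the workspace.
resp = requests.get(
    f"{host}/api/2.1/jobs/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for job in resp.json().get("jobs", []):
    print(job["job_id"], job["settings"]["name"])
```

The same pattern (bearer token plus a versioned `/api/...` endpoint) applies to cluster, workspace, and Unity Catalog administration calls.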
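Likewise, the external database work mentioned above typically involves Spark's JDBC reader. The sketch below shows the general shape inside a Databricks notebook or job; all connection details are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

df = (
    spark.read.format("jdbc")
    # Connection details below are placeholders, not a real endpoint.
    .option("url", "jdbc:sqlserver://example-db.database.windows.net:1433;database=analytics")
    .option("dbtable", "dbo.orders")
    .option("user", "<username>")      # in practice, read from a Databricks secret scope
    .option("password", "<password>")  # e.g. dbutils.secrets.get(scope, key)
    .load()
)
df.show(5)
```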
What You Bring
- A university degree in Computer Science, IT, or a related technical field
- Proven experience operating Databricks in enterprise or cloud-native environments
- Solid hands-on expertise in Azure networking, including VNet integration, firewall configuration, and network security controls
- Familiarity with integrating Databricks with Azure Blob Storage, external databases, and enterprise APIs
- Working knowledge of Azure API Management or similar gateways, and how APIs can be exposed or consumed securely by Databricks
- Proficiency in Terraform, Azure DevOps, and scripting languages like Python or Shell
- Good understanding of authentication, authorization, and identity federation within cloud platforms (see the token sketch after this list)
- Awareness of cost monitoring and performance tuning within Databricks and Azure services
- Strong communication skills and a collaborative mindset
- Fluent in English
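As a flavour of the identity topics above, the sketch below acquires an Azure AD token for Databricks with the azure-identity library instead of a long-lived personal access token. The GUID is the commonly documented Azure Databricks resource application ID; everything else is illustrative.

```python
from azure.identity import DefaultAzureCredential

# 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Azure
# Databricks resource application ID; tokens scoped to it can be
# used as Bearer tokens against Databricks REST APIs.
credential = DefaultAzureCredential()
access_token = credential.get_token("2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default")

headers = {"Authorization": f"Bearer {access_token.token}"}  # use with requests, etc.
```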
Technologies You’ll Work With
- Cloud & Infra: Microsoft Azure, Azure VNet, Private Endpoints, Blob Storage, API Management
- Platform: Databricks, Delta Lake, Unity Catalog, Spark (see the Delta read sketch after this list)
- IaC & Automation: Terraform, Azure DevOps, Bitbucket/GitHub, CI/CD
- Scripting & Integration: Python, REST APIs, JDBC, SCIM
- Observability: Datadog, Grafana, Dagster
- Other: Docker, Linux, Azure Active Directory
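A minimal sketch tying two of these technologies together: reading a Delta table from Azure storage over the abfss:// scheme. The account, container, and path names are invented for illustration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# With Unity Catalog, storage access is typically granted through
# external locations and storage credentials rather than cluster-level
# account keys; this read assumes such a grant is already in place.
df = spark.read.format("delta").load(
    "abfss://raw@exampleaccount.dfs.core.windows.net/events/"
)
df.printSchema()
```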
Nice to Have
- Experience with multi-workspace governance and account-level Databricks configurations
- Familiarity with containerized workloads (e.g., AKS, Docker)
- Exposure to regulated environments (e.g., energy, finance)
- Knowledge of orchestration tools such as Airflow or Dagster (a minimal Dagster sketch follows this list)
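For reference, a Dagster definition in its simplest form looks like the sketch below; the asset name and body are purely illustrative.

```python
from dagster import Definitions, asset


@asset
def daily_prices():
    """Toy asset; in a real pipeline this might trigger a Databricks job."""
    return [42.0, 43.5]


defs = Definitions(assets=[daily_prices])
```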