
Databricks Data Engineer (hybrid)



Job description

Who We Are

Axpo is driven by a single purpose - to enable a sustainable future through innovative energy solutions.

As Switzerland's largest producer of renewable energy and a leading international energy trader, Axpo leverages cutting-edge technologies to serve customers in over 30 countries.

We thrive on collaboration, innovation, and a passion for driving impactful change.



About the Team

You will report directly to our Head of Development and join a team of highly committed IT data platform engineers with a shared goal: unlocking data and enabling self-service data analytics capabilities across Axpo.

Our decentralized approach means close collaboration with various business hubs across Europe, ensuring local needs shape our global platform.

You'll find a mindset committed to innovation, collaboration, and excellence.



What You Will Do

As a Databricks Data Engineer, you will:
- Be a core contributor in Axpo's data transformation journey by using Databricks as our primary data and analytics platform.


- Design, develop, and operate scalable data pipelines on Databricks, integrating data from a wide variety of sources (structured, semi-structured, unstructured).


- Leverage Apache Spark, Delta Lake, and Unity Catalog to ensure high-quality, secure, and reliable data operations.


- Apply best practices in CI/CD, DevOps, orchestration (e.g., Dagster, Airflow), and infrastructure-as-code (Terraform).


- Build re-usable frameworks and libraries to accelerate ingestion, transformation, and data serving across the business.


- Work closely with data scientists, analysts, and product teams to create performant and cost-efficient analytics solutions.


- Drive the adoption of Databricks Lakehouse architecture and help standardize data governance, access policies, and documentation.


- Ensure compliance with data privacy and protection standards (e.g., GDPR).


- Actively contribute to the continuous improvement of our platform in terms of scalability, performance, and usability.
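
The "re-usable frameworks and libraries" responsibility above can be sketched as a minimal step-registry pattern in plain Python. All names here (Pipeline, parse_units, the meter-reading record shape) are hypothetical illustrations, not Axpo's internal libraries, and on Databricks each step would typically be a Spark DataFrame transformation rather than a per-record function:

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch of a reusable ingestion framework: pipelines register
# transformation steps via a decorator and apply them in order.
@dataclass
class Pipeline:
    name: str
    steps: list[Callable] = field(default_factory=list)

    def step(self, fn: Callable) -> Callable:
        # Register a transformation step and return it unchanged.
        self.steps.append(fn)
        return fn

    def run(self, records: list[dict]) -> list[dict]:
        # Apply every registered step to every record, in registration order.
        for fn in self.steps:
            records = [fn(r) for r in records]
        return records

ingest = Pipeline("meter_readings")

@ingest.step
def parse_units(r: dict) -> dict:
    # Normalize kWh readings arriving as strings into floats.
    return {**r, "kwh": float(r["kwh"])}

@ingest.step
def tag_source(r: dict) -> dict:
    # Stamp each record with its (invented) origin system.
    return {**r, "source": "sensor_api"}

rows = ingest.run([{"meter": "m1", "kwh": "12.5"}])
```

The same pipeline object can then be reused across sources, which is the point of framework-izing ingestion rather than writing one-off notebooks.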



What You Bring / Who You Are

We're looking for someone with:
- A university degree in Computer Science, Data Engineering, Information Systems, or a related field.


- Strong experience with Databricks, Spark, Delta Lake, and SQL/Scala/Python.


- Proficiency in dbt, ideally with experience integrating it into Databricks workflows.


- Familiarity with Azure cloud services (Data Lake, Blob Storage, Synapse, etc.).


- Hands-on experience with Git-based workflows, CI/CD pipelines, and data orchestration tools like Dagster and Airflow.


- Deep understanding of data modeling, streaming and batch processing, and cost-efficient architecture.


- Ability to work with high-volume, heterogeneous data and APIs in production-grade environments.


- Knowledge of data governance frameworks, metadata management, and observability in modern data stacks.


- Strong interpersonal and communication skills, with a collaborative, solution-oriented mindset.


- Fluency in English.
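
One concept behind the batch-processing and Delta Lake requirements above is idempotent upserts, the MERGE-style semantics Delta Lake offers for batch loads. A pure-Python illustration of the idea follows; this is not Databricks or Delta Lake API code, and the meter-reading records are invented:

```python
# Sketch of idempotent upsert (MERGE-style) semantics: re-applying the same
# batch leaves the target unchanged, so retried loads are safe.
def upsert(target: dict, batch: list[dict], key: str) -> dict:
    for row in batch:
        target[row[key]] = row  # last write wins per natural key
    return target

table: dict = {}
upsert(table, [{"id": 1, "kwh": 10.0}], key="id")
upsert(table, [{"id": 1, "kwh": 12.0}, {"id": 2, "kwh": 5.0}], key="id")
# table now holds exactly one latest row per id
```

In Delta Lake the equivalent is a MERGE keyed on the natural key; the property worth internalizing is the same: replays and retries must not duplicate data.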



Technologies You'll Work With
- Core: Databricks, Spark, Delta Lake, Python, dbt, SQL
- Cloud: Microsoft Azure (Data Lake, Synapse, Storage)
- DevOps: Bitbucket/GitHub, Azure DevOps, CI/CD, Terraform
- Orchestration & Observability: Dagster, Airflow, Grafana, Datadog, New Relic
- Visualization: Power BI
- Other: Confluence, Docker, Linux
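
The orchestrators listed above (Dagster, Airflow) share one core idea: run tasks in dependency order over a DAG. That idea can be sketched with the standard library's graphlib; the run_dag helper and task names are invented for illustration and look nothing like the real Dagster or Airflow APIs:

```python
from graphlib import TopologicalSorter

# Minimal sketch of DAG-ordered execution, the core idea behind orchestrators
# like Dagster and Airflow. deps maps each task to its predecessors.
def run_dag(deps: dict[str, set[str]], tasks: dict) -> list[str]:
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()  # run each task only after all its predecessors
    return order

log: list[str] = []
deps = {"transform": {"ingest"}, "publish": {"transform"}}
tasks = {n: (lambda n=n: log.append(n)) for n in ("ingest", "transform", "publish")}
ran = run_dag(deps, tasks)
```

Real orchestrators add scheduling, retries, and observability on top, but the dependency-ordered execution shown here is the invariant they all guarantee.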

Nice to Have
- Experience with Unity Catalog and Databricks Governance Frameworks
- Exposure to Machine Learning workflows on Databricks (e.g., MLflow)
- Knowledge of Microsoft Fabric or Snowflake
- Experience with low-code analytics tools like Dataiku
- Familiarity with PostgreSQL or MongoDB
- Front-end development skills (e.g., for data product interfaces)

Department: Installation / Maintenance / Servicing / Craft
Location: Madrid
Remote status: Hybrid
Skills: Spark, Databricks, Python, SQL, Scala


Required skill profession: Computer Occupations


