
Build and scale the global data backbone for AI and workforce intelligence
Are you a Data Engineer who thrives in a modern Databricks environment and enjoys building scalable, cloud-native data platforms? Do you want to work at the intersection of data engineering, AI enablement and platform architecture in an international organisation?
We are looking for a Data Engineer to join the Global Data & AI Team of HeadFirst & Impellam Group. In this role, you will help design, build and evolve our global Headless Data Architecture (HDA), the foundation for analytics, AI agents and intelligent workforce solutions across Europe, North America and APAC.
Your impact
As a Data Engineer, you play a key role in shaping and scaling the data platform that powers our global Workforce-as-a-Service (WaaS) strategy. You work on our Headless Data Architecture (HDA): a scalable, nine-layer, API-first lakehouse architecture built on Azure Databricks, Unity Catalog and Delta Lake.
You are responsible for building reliable, governed and production-ready data pipelines across our global lakehouse ecosystem. From medallion-layer engineering to data contracts and platform observability, you help ensure that data products are scalable, reusable and AI-ready.
Working in a highly collaborative international environment, you contribute directly to the evolution of our global data platform and support advanced analytics, semantic search and AI workloads used across the business.
A key part of your role is combining strong engineering fundamentals with an automation-first mindset. You work with modern AI-assisted development tooling and help build a platform where data engineering, governance and AI capabilities are fully integrated.
You collaborate closely with ML/AI Engineers, Platform Engineers, AI Builders, Cloud Engineers, and stakeholders across regions to continuously improve the platform’s scalability, reliability, and quality.
At the same time, you contribute to building a future-proof and governed data ecosystem embedding data quality, lineage, security and compliance standards into every layer of the platform.
What you will do
Design, build and maintain Bronze → Silver → Gold medallion pipelines on Azure Databricks
Develop deployable and version-controlled data products using Lakeflow, Spark Declarative Pipelines and Databricks Asset Bundles (DABs)
Implement and maintain scalable data contracts and data product patterns across the platform
Manage and optimise Unity Catalog governance, lineage and access control
Build and maintain vector and relational storage capabilities supporting AI and semantic search workloads
Contribute to Databricks Vector Search and AI-enabled retrieval capabilities
Collaborate with global teams across Europe, the UK, the US and APAC on cross-regional data exchange and platform scalability
Participate in Git-based development workflows, code reviews and CI/CD practices
Monitor pipeline health, observability and platform reliability from a DataOps perspective
Work with AI-assisted development tooling such as Claude Code and the Databricks AI Dev Kit
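To make the data-contract work above concrete, here is a minimal sketch of the idea in plain Python: a producer promises a schema (field names and types), and incoming records are validated against that promise before they enter a governed layer. All names below (FieldSpec, CONTRACT, the field names) are illustrative assumptions, not part of any HeadFirst & Impellam system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FieldSpec:
    """One field in a hypothetical data contract: name, type, required flag."""
    name: str
    dtype: type
    required: bool = True

# Illustrative contract for a workforce-style record.
CONTRACT = [
    FieldSpec("worker_id", str),
    FieldSpec("placement_date", str),
    FieldSpec("hours_billed", float, required=False),
]

def validate(record: dict) -> list[str]:
    """Return a list of contract violations for one record (empty = valid)."""
    errors = []
    for spec in CONTRACT:
        if spec.name not in record:
            if spec.required:
                errors.append(f"missing required field: {spec.name}")
            continue
        if not isinstance(record[spec.name], spec.dtype):
            errors.append(f"{spec.name}: expected {spec.dtype.__name__}")
    return errors
```

In a production lakehouse this kind of check would typically be expressed as Delta table constraints or pipeline expectations rather than hand-rolled code; the sketch only shows the shape of the contract idea.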
About the role
You will be part of a growing Global Data & AI Team that is central to the transformation of HeadFirst & Impellam Group into a truly AI-enabled and data-driven organisation.
Working with technologies such as Azure Databricks, Unity Catalog, Delta Lake and Lakeflow, you help build and scale a shared global data platform that supports analytics, AI products and operational intelligence across multiple regions.
This role offers a strong combination of hands-on data engineering, platform ownership and exposure to modern AI-enabled data architectures.
What you bring
You are a hands-on Data Engineer with a strong technical foundation and a pragmatic mindset for building scalable, governed and production-ready data platforms.
3+ years of hands-on experience with Databricks in a production environment
Strong experience with Medallion Architecture (Bronze / Silver / Gold)
Solid understanding of Apache Spark and Delta Lake concepts, including optimisation, partitioning and ACID transactions
Experience with Databricks SQL and Unity Catalog governance
Familiarity with Lakeflow Connect, Spark Declarative Pipelines and Databricks Asset Bundles (DABs)
Strong SQL and Python skills for large-scale data engineering workloads
Experience with Git-based development workflows and CI/CD practices
Understanding of data contracts, versioning and scalable data product design
Experience with monitoring, observability and DataOps practices
Familiarity with vector search, semantic search or AI-oriented data workloads is a plus
Experience working in a global, multi-lakehouse Databricks setup, ideally hub-and-spoke, with Delta Sharing across regions
Familiarity with Azure Synapse Analytics: you will work with Synapse as part of the existing stack while the platform migrates to Databricks-native tooling (Lakeflow, DABs)
Hands-on experience with SnapLogic, used for integration flows across the organisation outside Databricks, including connectors, pipelines and the SnapLogic Designer
What you offer as a professional
Strong communication skills: able to explain technical concepts and trade-offs clearly
A structured and quality-driven approach to engineering and documentation
A collaborative mindset, working effectively across global technical and business teams
A pragmatic approach to building scalable and maintainable data solutions
Curiosity about AI-native engineering and modern platform development
A strong drive to continuously learn and contribute to an evolving architecture
Submit your application
We will contact you within 48 hours
Interview
Assessment
Interview
Offer
Welcome to HeadFirst Group!
Plus the option to buy or sell additional leave days. We work hard, but we don't forget to relax.
With more than 450 colleagues, there is always something to celebrate. As one team, we mark birthdays, anniversaries and other successes!
Work on your mental health with access to the OpenUp platform. We learn every day, which is why you can take free training and courses with us to keep developing yourself.
Want to hit your targets and see it reflected in a bonus? You can with us! Hard work is rewarded.
Work at one of our 4 locations, from home, or anywhere else you feel comfortable. Naturally, you will receive a mobility and work-from-home allowance from us.
Make use of the free sports facilities and pleasant workspaces, and enjoy a delicious lunch buffet. Serious competition for home!
Mattijs Wassenburg
Managing Director
Christine Koekkoek
Business Analyst
For this assignment, you need to place a bid on Striive. Striive is the largest assignment platform in the Benelux, where more than 20,000 assignments are published annually.