Lead Data Engineer

  • Location: Glasgow

  • Sector: Data & AI

  • Job type: Permanent

  • Salary: £70,000 - £85,000 per annum

  • Contact: Eve Fraser

  • Contact email: efraser@headresourcing.com

  • Job ref: BBBH37908_1771326790

  • Published: about 4 hours ago

  • Expiry date: 2026-03-19

Lead Data Engineer (Azure / Databricks)

NO VISA SPONSORSHIP: applicants must already hold the right to work in the UK.

MUST BE BASED NEAR GLASGOW: the role is onsite three days per week.


My FMCG client is undergoing a major transformation of their entire data landscape, migrating from legacy systems and manual reporting to a modern Azure + Databricks Lakehouse. They are building a secure, automated, enterprise-grade platform powered by Lakeflow Declarative Pipelines, Unity Catalog and Azure Data Factory.
They are looking for a Lead Data Engineer to help deliver high-quality pipelines and curated datasets used across Finance, Operations, Sales, Customer Care and Logistics.


What You'll Do

Lakehouse Engineering (Azure + Databricks)


  • Build and maintain scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark and Spark SQL.

  • Work within a Medallion architecture (Bronze → Silver → Gold) to deliver reliable, high-quality datasets.

  • Ingest data from multiple sources including ChargeBee, legacy operational files, SharePoint, SFTP, SQL, REST and GraphQL APIs using Azure Data Factory and metadata-driven patterns.

  • Apply data quality and validation rules using Lakeflow Declarative Pipelines expectations (a minimal sketch follows this list).
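
By way of illustration only (not part of the role spec), a Medallion-style pipeline with expectations might look roughly like the sketch below; table names, columns and the landing path are assumptions, and the code assumes the Databricks pipeline runtime, which provides spark and the dlt module.

```python
import dlt  # provided by the Databricks pipeline runtime

# Bronze: land raw ChargeBee invoices as-is via Auto Loader.
# The landing path is a hypothetical example.
@dlt.table(comment="Raw ChargeBee invoices (Bronze)")
def bronze_invoices():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/raw/chargebee/invoices")  # assumed location
    )

# Silver: validated records. Rows failing the drop expectation are
# discarded; the plain expectation only records violations in metrics.
@dlt.table(comment="Validated invoices (Silver)")
@dlt.expect_or_drop("valid_invoice_id", "invoice_id IS NOT NULL")
@dlt.expect("non_negative_amount", "amount >= 0")
def silver_invoices():
    return dlt.read_stream("bronze_invoices").select(
        "invoice_id", "customer_id", "amount", "issued_at"
    )
```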

Curated Layers & Data Modelling


  • Develop clean and conforming Silver & Gold layers aligned to enterprise subject areas.

  • Contribute to dimensional modelling (star schemas), harmonisation logic, SCDs and business marts powering Power BI datasets (see the SCD sketch after this list).

  • Apply governance, lineage and permissioning through Unity Catalog.
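
Purely as a sketch (the client's actual model will differ), an SCD Type 2 dimension can be maintained with the Lakeflow change-data-capture API; the table, key and ordering columns below are assumptions:

```python
import dlt
from pyspark.sql import functions as F

# Gold-layer dimension maintained as SCD Type 2 (full history kept).
dlt.create_streaming_table("dim_customer")

dlt.apply_changes(
    target="dim_customer",
    source="silver_customers",        # assumed Silver feed of customer changes
    keys=["customer_id"],             # business key
    sequence_by=F.col("updated_at"),  # ordering column for change events
    stored_as_scd_type=2,             # Type 2: close old rows, insert new ones
)
```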

Orchestration & Observability


  • Use Lakeflow Workflows and ADF to orchestrate and optimise ingestion, transformation and scheduled jobs.

  • Help implement monitoring, alerting, SLAs/SLIs and runbooks to support production reliability (a small example follows this list).

  • Assist in performance tuning and cost optimisation.
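
As one minimal, hypothetical example of the operational side, a scheduled job can be triggered and checked with the Databricks Python SDK (the job ID and error handling here are placeholders):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # authenticates from environment or a config profile

# Hypothetical job ID; in practice this would come from configuration.
run = w.jobs.run_now(job_id=123).result()  # blocks until the run completes

# Crude reliability check; a real runbook would alert or notify instead.
if run.state.result_state.value != "SUCCESS":
    raise RuntimeError(f"Ingestion job failed: {run.state.state_message}")
```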

DevOps & Platform Engineering


  • Contribute to CI/CD pipelines in Azure DevOps to automate deployment of notebooks, Lakeflow Declarative Pipelines, SQL models and ADF assets.

  • Support secure deployment patterns using private endpoints, managed identities and Key Vault (see the sketch after this list).

  • Participate in code reviews and help improve engineering practices.
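
For a flavour of the secure-deployment pattern, secrets can be read at runtime via a managed identity rather than stored in code; the vault URL and secret name below are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Resolves to a managed identity when running in Azure, so no
# credentials live in code, config or pipeline variables.
credential = DefaultAzureCredential()

client = SecretClient(
    vault_url="https://example-vault.vault.azure.net",  # placeholder
    credential=credential,
)
conn_str = client.get_secret("warehouse-connection-string").value  # assumed name
```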

Collaboration & Delivery


  • Work with BI and Analytics teams to deliver curated datasets that power dashboards across the business.

  • Contribute to architectural discussions and the ongoing data platform roadmap.

Tech You'll Use


  • Databricks: Lakeflow Declarative Pipelines, Lakeflow Workflows, Unity Catalog, Delta Lake

  • Azure: ADLS Gen2, Data Factory, Event Hubs (optional), Key Vault, private endpoints

  • Languages: PySpark, Spark SQL, Python, Git

  • DevOps: Azure DevOps Repos & Pipelines, CI/CD

  • Analytics: Power BI, Fabric

What We're Looking For

Experience


  • Proven commercial experience as a Lead Data Engineer.

  • Hands-on experience delivering solutions on Azure + Databricks.

  • Strong PySpark and Spark SQL skills within distributed compute environments.

  • Experience working in a Lakehouse/Medallion architecture with Delta Lake.

  • Understanding of dimensional modelling (Kimball), including SCD Type 1/2.

  • Exposure to operational concepts such as monitoring, retries, idempotency and backfills.

Mindset

  • Good energy and enthusiasm.

  • Keen to grow within a modern Azure Data Platform environment.

  • Comfortable with Git, CI/CD and modern engineering workflows.

  • Able to communicate technical concepts clearly to non-technical stakeholders.

  • Quality-driven, collaborative and proactive.


Why Join?


  • Opportunity to shape and build a modern enterprise Lakehouse platform.

  • Hands-on work with Azure, Databricks and leading-edge engineering practices.

  • Real progression opportunities within a growing data function.

  • Direct impact across multiple business domains.

If this job isn't the one for you, don't worry: we have lots more opportunities available!