Agilisys

Lead Data Engineer

Remote

United Kingdom

Senior

Freelance

06-11-2025


Skills

Leadership, Python, SQL, Data Engineering, CI/CD, DevOps, Version Control, Azure Data Factory, Azure Synapse Analytics, Architecture, Data Architecture, Git, Power BI, Azure, Agile, Analytics, Spark

Job Specifications

Title: Lead Data Engineer (6 month contract)

Location: Remote (needs to be based in the UK)

Division: AI Practice

Reports to: Chief Architect - Data and Insights

Clearance: Active Security Clearance

About Us

Agilisys is at the forefront of digital transformation and innovation in the public services sector. With over two decades of experience, we have established ourselves as a trusted partner for governments, local authorities, and organisations nationwide. Our mission is to empower our clients to deliver exceptional public services by harnessing the full potential of technology and data.


OUR VALUES

Partnership: we become one team and family with organisations, helping them to navigate change and stay agile.

Integrity: our people really care, going beyond the brief to make change happen for organisations and citizens.

Innovation: we bring together the right technologies and services to design solutions that work.

Passion: we are passionate about - and dedicated to - public services and improving people’s lives.

THE ROLE

We are seeking a Lead Data Engineer with active Security Clearance to be the technical cornerstone of our data engineering team, delivering a high-profile programme of work for a major Public Sector client. This is a hands-on, deeply technical role where you’ll lead from the front — setting the technical direction, shaping architectural decisions, and building modern data solutions using Microsoft Fabric and the Azure data platform.

You’ll play a key role in defining how we deliver data engineering at Agilisys — not just for this engagement, but as part of the ongoing evolution of our Data & AI Practice. Working alongside architects, analysts, and delivery teams, you’ll help us build scalable, secure, and future-ready data platforms and help define best practices for modern data engineering across the organisation.

Key Responsibilities

You will be responsible for shaping our data architecture and advising on industry best practices, while actively building and optimising our clients’ data platforms. Your mission is to mentor a talented team, champion a culture of automation and DevOps, and work collaboratively to design and deliver secure, scalable data solutions that meet our mission-critical needs.

Data Platform Strategy and Technical Leadership

Lead the design and evolution of scalable, modern data platforms using Microsoft Fabric, Azure Synapse Analytics, and Azure Data Factory, with the ability to bridge legacy systems and cloud-native architectures.
Provide strategic technical direction, shaping engineering standards, reusable patterns, and architectural decisions that support performance, integrity, and long-term platform sustainability across hybrid environments.
Work across modern cloud-native data platforms, with a strong understanding of distributed data processing, storage, and compute services within the Azure ecosystem.

Engineering Delivery & Development Best Practice

Lead the hands-on development of end-to-end data solutions using Microsoft Fabric components such as Lakehouses, Warehouses, Notebooks, and Dataflows.
Design scalable data models and ingestion frameworks, supporting both real-time and batch workloads using best-practice techniques (e.g. 3NF, Kimball).
Enable data consumption through well-structured APIs, semantic models, and data products, with a strong understanding of Power BI integration within Fabric.
Drive engineering excellence through robust development practices including version control, automated testing, peer reviews, and documentation.
Optimise performance and cost-efficiency, tuning queries, storage formats (e.g. Delta, Parquet), and compute resources across Fabric and Azure workloads.

Expertise in ETL/ELT and Data Processing

We're looking for an expert in building dynamic and efficient data pipelines. This includes demonstrable experience in:

Designing and implementing scalable ETL/ELT pipelines for both batch and real-time data processing, and orchestrating complex data workflows using tools like Azure Data Factory, Fabric Data Pipelines, or equivalent.
Applying foundational data engineering principles to design scalable and maintainable solutions, including schema design, indexing, and query optimisation, while implementing effective strategies for evolving data sources and historical tracking through CDC and SCD techniques.
Writing highly optimised transformation logic using Python, SQL, and Spark for large-scale data workloads.
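To illustrate the SCD techniques mentioned above: a common requirement is Slowly Changing Dimension Type 2, where changed attributes close the old record and open a new versioned one. The sketch below is a minimal, illustrative plain-Python implementation (the function name, column names such as `valid_from`/`is_current`, and the sample data are assumptions for illustration, not tied to any Fabric or Spark API); in practice this logic would typically run as a Delta `MERGE` in Spark or a warehouse stored procedure.

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key, tracked, today):
    """Illustrative SCD Type 2 merge.

    dim_rows: existing dimension records (dicts carrying the metadata
              columns 'valid_from', 'valid_to', and 'is_current').
    incoming: latest source snapshot, keyed on `key`.
    tracked:  attributes whose changes should create a new version.
    """
    out = list(dim_rows)
    current = {r[key]: r for r in out if r["is_current"]}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # Brand-new key: open a fresh current record.
            out.append({**row, "valid_from": today, "valid_to": None,
                        "is_current": True})
        elif any(existing[c] != row[c] for c in tracked):
            # A tracked attribute changed: close the old version,
            # then open a new current one effective from today.
            existing["valid_to"] = today
            existing["is_current"] = False
            out.append({**row, "valid_from": today, "valid_to": None,
                        "is_current": True})
    return out

# Hypothetical sample data: one existing customer, then a snapshot
# where that customer moved city and a new customer appeared.
dim = [{"customer_id": 1, "city": "Leeds",
        "valid_from": date(2024, 1, 1), "valid_to": None,
        "is_current": True}]
snapshot = [{"customer_id": 1, "city": "York"},
            {"customer_id": 2, "city": "Bath"}]
rows = scd2_merge(dim, snapshot, "customer_id", ["city"],
                  date(2025, 6, 11))
```

After the merge, the original Leeds row is closed (`is_current` False, `valid_to` set), and two current rows remain: the new York version and the new Bath customer.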

Platform Automation & DevOps

Lead CI/CD implementation for Microsoft Fabric components, integrating Git for version control and automating multi-environment deployments using Fabric Deployment Pipelines.
Embed automated testing across pipelines and transformations, including unit tests, data quality checks, and schema validation.
Manage secure configuration using Fabric Environments and Azure Key Vault for secrets and environment-specific parameters.

Data Security, Governance & Observability

Ensure d

About the Company

With 25 years of experience and a team of over 1,000 skilled professionals across the UK, we are a trusted leader in delivering data-driven solutions and Generative AI products tailored to the needs of healthcare, local government, and other public sector organisations. Through our extensive knowledge of advanced analytics, automation, Generative AI, and cloud IT services, we enable our clients to empower citizens and enrich their lives. As part of Blenheim Chalcot, the UK's fastest-growing digital venture builder, our ambi...