Job Specifications
Senior Data Engineer - Financial Services / FinTech Platform - Data Infrastructure
Mid-Senior
London, City
Hybrid working - 3 days a week in City office, 2 WFH
Talensa are partnered with a specialist B2B Financial Services Tech platform firm to hire an experienced Data Engineer (Mid-Senior) to take end-to-end ownership of a modern data platform built on Azure Databricks and Kafka, with a BI layer.
This person would be the primary Data Engineer, maintaining and further developing a well-established cloud and CI/CD setup to implement and evolve reliable, well-tested data pipelines in Spark/Scala and to extend the dimensional data models.
You will be part of the technical team collaborating closely with platform engineering, data science, and business stakeholders to further develop the synergy between the internal software platform and the Databricks data platform.
This role will suit a hands-on Data Engineer who wants to work on interesting tech and solve data problems as a key member of the team, comfortable working independently day to day while following agreed designs, patterns, and standards.
What you’ll do
Own and maintain data pipelines ingesting from various sources into Databricks
Collaborate closely with software engineers to evolve our software platform and address technical challenges
Design, develop, and optimize Spark/Scala jobs for ingestion and transformation, using the existing GitHub Actions and Terraform setup for deployments
Implement, run and monitor dbt tests to ensure data quality and consistent dimensional models
Extend and maintain Delta Lake fact and dimension tables for analytics use cases, following a bronze/silver/gold-style data architecture
Work closely with analysts, software engineers, and business users using Sigma to ensure they have trusted, well-documented datasets
Monitor, troubleshoot, and improve pipeline reliability, performance, and cost within the existing observability framework
Contribute to documentation of pipelines, datasets, and data contracts, acting as the main point of contact for data engineering topics
Investigate and resolve data quality incidents and pipeline failures
Proactively drive adoption of Databricks within the team and experiment with new Databricks features and tools where they bring clear value, including integrating AI tools for analysis and AI coding agents into day-to-day workflows
Tech stack
Azure Databricks (Spark, Delta Lake)
SQL Server, Confluent Kafka
Scala, SQL, dbt
Git, GitHub Actions, Terraform
Sigma (BI)
Must-have experience
Commercial experience as a Data Engineer working with Spark in production (ideally on Databricks)
Essential: experience working in finance, with a strong command of financial terminology, concepts, and stakeholder needs, enabling you to collaborate effectively
Good understanding of modern software platform architecture and the role of the data platform within the broader platform ecosystem
Good Scala skills (or strong PySpark plus a genuine willingness to work primarily in Scala)
Strong SQL skills for transformations, data modelling, and debugging data issues
Experience building and supporting ingestion pipelines end to end
Understanding of data modelling for analytics (star schema, facts/dimensions) and how schema changes affect downstream users
Exposure to layered data architectures (e.g. bronze/silver/gold) and basic data governance practices (permissions, documentation, data ownership)
Experience working in a CI/CD environment and with infra-as-code tools (e.g. GitHub Actions and Terraform on Azure)
Track record of writing well-tested, maintainable data pipelines and jobs (unit/integration tests and data quality checks)
Ability to work as a self-starter, taking ownership of the data platform, making pragmatic decisions, and driving work to completion with limited day-to-day supervision
Nice to have
Experience with Azure data services beyond Databricks (e.g. Key Vault, Storage)
Experience with modern BI tools (e.g. Sigma, Looker, Power BI, Tableau)
Experience working with AI coding agents (e.g. Claude Code, Cursor, Copilot) and writing thoughtful, well-developed prompts
Interest in analytics and data science, and working with others to turn data into insight
What’s on offer
Competitive Salary
Annual Bonus and participation in Growth Share Scheme
Pension
Medical and Critical Illness Insurance
Generous holiday allowance
Opportunity to work in a specialist finance business with a great team
*To note on scope*
This is the only senior / experienced data engineer role in the Engineering Tech team, with no immediate plans to hire underneath. It would suit someone who wants to work on interesting tech and data problems, drive platform innovation, and support wider business growth.
About the Company
Welcome to Talensa Partners
Your dedicated search partner for technology and transformation leadership talent that tackles the business problem, builds on culture, and enables the next phase of innovation and growth.
Our focus lies exclusively in curating and promoting a network of diverse, innovative, high-impact leaders and technical talent across Technology, Transformation, Data & AI, Product, Platform and Cyber Security.
We offer a suite of easy-to-engage, cost-effective services to identify and onboard the right talent sol...