Mobysoft

Senior Data Engineer

On site

Manchester, United Kingdom

Senior

Freelance

23-10-2025

Skills

Python, SQL, Data Governance, Data Engineering, Apache Airflow, Encryption, Cloud Security, Git, GitHub, GitHub Actions, CI/CD, Monitoring, AWS, AWS Lambda, Terraform, Infrastructure as Code, Artificial Intelligence

Job Specifications

Senior Data Engineer

Location: Manchester (Hybrid)

Salary: Competitive plus excellent benefits

Start: ASAP

Ideal skills: Data Engineer, MLOps, AI, Smart Data, AWS Certified, Data Governance, Data Transformation, Data Modelling

Who We Are

Founded in 2003, Mobysoft provides data-based insight solutions to a wide range of social housing clients, supplying technology that helps landlords improve their income-collection processes for the good of all involved. Mobysoft delivers two market-leading products, which help keep tenants housed in a home they can enjoy and simultaneously improve rent collection for the long-term good of the organisation.

Our vision is a world in which intelligent technology significantly improves the quality of life for people living in social housing. Our mission is to deliver accurate, actionable data insights that help social housing providers deliver a more consistent and equitable service.

What are we looking for?

We are an ambitious, customer-centric Data & Insights team, dedicated to developing a new generation of data products that unlock significant value for the social housing sector. We operate with a focused product lens, driven by curiosity and a commitment to technical excellence. Data Engineering is foundational to this mission, ensuring synchronised, curated, trusted, dynamic data is available at speed and scale for our clients, data analysts, data scientists, and stakeholders. As such, we are looking for an exceptional Senior Data Engineer (SDE) to join our dynamic team, working alongside our existing Senior Data Engineer, to help us deliver.

Key Responsibilities

Ingesting and understanding new data sources, evolving our data approach (e.g., to a fully fledged data fabric), implementing and streamlining MLOps processes, and exploring how we can sensibly and safely utilise artificial intelligence (AI) to turbocharge our work (e.g., smart data quality monitoring, metadata curation and service optimisation).

In short, there will be plenty to keep you interested, plenty to develop your technical and interpersonal skills, and a real chance to drive innovation and change.

As part of this, you may also be asked to liaise with our clients (directly, or through events, conferences, webinars, etc.) to help support their data-driven journeys.

Why should you consider this role?

Alongside the work opportunities described, your development will be supported, you will be sensibly remunerated, we will provide a compelling benefits package, and we are a great bunch of folks to work with - though we would say that!

Qualifications And Skills

Required

An honours degree in Information Systems, Computer Science, Information Technology, Software Engineering or a similar quantitative discipline.
AWS certification: AWS Certified Data Engineer - Associate.
4-5+ years of commercial experience working primarily in an AWS Cloud environment using approaches/tooling like ours (see technical skills), delivering scalable, performant, reliable solutions.
Strong data reliability/observability, data governance and information security credentials.

What technical skills are required?

ETL/ELT & Data Transformation

Amazon Redshift (query tuning, distribution/sort keys, workload management)
Data modelling (normalisation, dimensional)
dbt (modeling, testing, documentation, deployment)
Building scalable ETL/ELT pipelines with Python
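To illustrate the Python-based transformation work listed above, here is a minimal sketch of a single extract-transform step. All field and function names are hypothetical, not Mobysoft's actual schema:

```python
import csv
import io
import json

def transform_rent_records(csv_text: str) -> list[dict]:
    """Extract-transform step: parse raw CSV, normalise types,
    and drop rows missing a tenant identifier."""
    rows = csv.DictReader(io.StringIO(csv_text))
    cleaned = []
    for row in rows:
        if not row.get("tenant_id"):
            continue  # skip unusable rows rather than failing the whole batch
        cleaned.append({
            "tenant_id": row["tenant_id"].strip(),
            "balance": round(float(row["balance"]), 2),
        })
    return cleaned

raw = "tenant_id,balance\nT001, 12.504\n,9.99\nT002,-3.1\n"
print(json.dumps(transform_rent_records(raw)))
```

In a production pipeline this kind of function would typically be one task in an orchestrated DAG, with the cleaned output landed to a warehouse table.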

Workflow Orchestration

Apache Airflow (DAG design, scheduling, monitoring, scaling)
Best practices for dependency management, retries, and alerting
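In Airflow itself, retries and alerting are configured declaratively via task arguments such as `retries`, `retry_delay`, and `on_failure_callback`. As a plain-Python sketch of the underlying pattern (names hypothetical):

```python
import time

def run_with_retries(task, *, retries=3, delay=0.01, on_failure=print):
    """Retry a flaky task a fixed number of times, then alert.
    Mirrors the retries/retry_delay/on_failure_callback idea."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == retries:
                on_failure(f"task failed after {retries} attempts: {exc}")
                raise
            time.sleep(delay)  # back off before the next attempt

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

print(run_with_retries(flaky))  # succeeds on the third attempt
```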

Cloud & Serverless

AWS Lambda (Python-based serverless pipelines, event-driven processing)
IAM roles, policies, and security best practices
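A minimal sketch of an event-driven Lambda handler of the kind described, assuming the standard S3 event notification payload shape; the bucket and key names are invented, and the real downstream processing is omitted:

```python
import urllib.parse

def handler(event, context=None):
    """AWS Lambda entry point for an S3-triggered pipeline step.
    Extracts bucket/key pairs from the event; downstream processing
    (e.g. loading to the warehouse) would follow here."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        objects.append({
            "bucket": s3["bucket"]["name"],
            # S3 URL-encodes object keys in event payloads ('+' for spaces)
            "key": urllib.parse.unquote_plus(s3["object"]["key"]),
        })
    return {"processed": objects}

sample_event = {"Records": [
    {"s3": {"bucket": {"name": "raw-data"},
            "object": {"key": "rent/2025/file+1.parquet"}}},
]}
print(handler(sample_event))
```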

Programming & Scripting

Python (data processing, automation, testing)
SQL (advanced query writing and optimisation)
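To illustrate the advanced-SQL bullet, here is a small window-function example, runnable against SQLite's in-memory engine; the table and data are invented, and the same pattern applies in Redshift:

```python
import sqlite3

# Demo of an "advanced SQL" pattern: a window function ranking
# tenants by arrears balance within each housing scheme.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE arrears (scheme TEXT, tenant TEXT, balance REAL);
    INSERT INTO arrears VALUES
        ('A', 't1', 120.0), ('A', 't2', 300.0),
        ('B', 't3', 50.0),  ('B', 't4', 75.0);
""")
rows = conn.execute("""
    SELECT scheme, tenant, balance,
           RANK() OVER (PARTITION BY scheme
                        ORDER BY balance DESC) AS arrears_rank
    FROM arrears
""").fetchall()
for r in rows:
    print(r)
```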

Data Engineering Best Practices

CI/CD for data pipelines (Git, GitHub Actions, etc.)
Data quality checks, monitoring, and observability
Infrastructure as Code (Terraform etc.)
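The data-quality-check bullet above can be sketched as a simple completeness gate; field names are hypothetical, and a real pipeline would typically express such checks in a framework such as dbt tests:

```python
def check_completeness(rows, required=("tenant_id", "balance")):
    """Simple data-quality gate: fail the run if any required field
    is missing, and report per-field null counts for monitoring."""
    nulls = {field: 0 for field in required}
    for row in rows:
        for field in required:
            if row.get(field) in (None, ""):
                nulls[field] += 1
    passed = all(count == 0 for count in nulls.values())
    return passed, nulls

ok, report = check_completeness([
    {"tenant_id": "T001", "balance": 12.5},
    {"tenant_id": "", "balance": 9.9},
])
print(ok, report)  # False {'tenant_id': 1, 'balance': 0}
```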

Other Tools & Ecosystem

Experience with logging/monitoring
Exposure to data governance, cataloguing, and lineage tools
Ability to work with a range of structured, semi-structured and unstructured file formats, including Parquet, JSON, CSV, XML, PDF and JPG.
Tools and methods to develop comprehensive data reliability and active metadata solutions.
Ability to work with and develop APIs (including data transformations).
Ability to deliver data deidentification and anonymisation solutions.

Understanding of cloud security frameworks (specifically on AWS), including appropriate data encryption.
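A minimal sketch of the deidentification/pseudonymisation work mentioned above, using keyed hashing from the Python standard library; the key, record, and field names are illustrative only:

```python
import hashlib
import hmac

def pseudonymise(tenant_id: str, secret: bytes) -> str:
    """Keyed hashing (HMAC-SHA256) replaces a direct identifier with a
    stable pseudonym: consistent across datasets for joins, but
    irreversible without the key."""
    return hmac.new(secret, tenant_id.encode(), hashlib.sha256).hexdigest()[:16]

KEY = b"rotate-me"  # in practice, hold the key in a secrets manager / KMS
record = {"tenant_id": "T001", "postcode": "M1 1AA", "balance": 120.0}
safe = {**record,
        "tenant_id": pseudonymise(record["tenant_id"], KEY),
        "postcode": record["postcode"].split()[0]}  # coarsen to outward code
print(safe)
```

Keyed hashing is one common building block; full anonymisation usually also needs generalisation or suppression of quasi-identifiers, as the postcode coarsening hints at.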

Desirable

Utilising AI within data engineering to drive performance
Facilitating search tools such as Solr
MLOps experience, including familiarity with tools such as DVC and MLflow
Full data engineering cycle knowledge (tools and skills) for stream data

The Person

You Are Someone Who

Takes ownership and thrives on improving how things are done

About the Company

Founded in 2003, Mobysoft is a trusted provider of data-driven solutions tailored for the social housing sector. Our platforms are built on cutting-edge Artificial Intelligence technology and employ Machine Learning to empower social landlords with a suite of systems designed to enhance income-collection and repairs processes, benefiting tenants and organisations alike. Our key solutions include RentSense, RepairSense, Intelligent Automation, and FTA Essentials, each addressing critical aspects of the social housing industry.