McCabe & Barton

www.mccabebarton.com

11 Jobs

20 Employees

About the Company

McCabe & Barton are part of the Holley Holland Group.
Having built our reputation over 20 years on successfully delivering cross-functional results within Technology & Business Change and Transformation, we are respected as an industry leader in recruitment research and execution.

By developing long-term relationships with our clients, acting as an extension of their business to identify and engage suitable candidates on their behalf, we have established ourselves as a trusted partner.

We are a member of APSCO and have recently been successfully audited.

Listed Jobs

Company Name
McCabe & Barton
Job Title
Head of Engineering
Job Description
Job Title: Head of Engineering

Role Summary: Lead and inspire geographically dispersed development and test teams for a financial services client, driving delivery of technology solutions through Agile and DevOps practices. Steward the partnership with external vendors, prioritize the portfolio, and shape culture and engineering excellence.

Expectations:
* Deliver high-quality, cost-efficient technology solutions aligned with business strategy.
* Build and sustain an inclusive, performance-driven culture across multiple time zones.
* Develop a partner ecosystem for augmentation and innovation.
* Evolve engineering processes to continuously improve productivity and quality.

Key Responsibilities:
* Manage and mentor senior engineers, developers, and QA staff.
* Set team OKRs, conduct performance reviews, and support career progression.
* Oversee outsourced, contract, and hybrid teams through clear SLAs and governance.
* Plan and execute portfolio delivery, aligning the backlog with business priorities and interdependencies.
* Select and enforce Agile or Waterfall workflows; maintain IT policy compliance.
* Design application architecture across the Azure full stack, Power Platform, and other cloud-native environments.
* Lead continuous improvement initiatives: new tools, automation, quality gates, and processes.
* Foster stakeholder relationships to translate business needs into technical solutions.
* Champion diversity, inclusion, and a transformational culture within engineering.

Required Skills:
* Proven progression from developer to technical leadership in software engineering.
* Extensive hands-on expertise in the Azure full stack, Azure AD, Power Platform, and cloud-native architecture.
* Deep knowledge of Agile and DevOps methodologies; experience scaling practices across continents.
* Strong partnership management with vendors, contractors, and internal stakeholders.
* Excellent communication, mentoring, and conflict resolution skills.
* Analytical, problem-solving mindset with curiosity to add business value.
* Experience in the FinTech, Asset Management, or Financial Services domain.

Required Education & Certifications:
* Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
* Microsoft development and/or Azure certifications (e.g., AZ-900, AZ-303/304, DP-100).
* Agile certifications (e.g., Certified ScrumMaster, Certified Scrum Professional, or similar).
London, United Kingdom
On site
14-11-2025
Company Name
McCabe & Barton
Job Title
Data Engineering Manager
Job Description
Job Title: Data Engineering Manager

Role Summary: Lead the design, improvement, and maintenance of enterprise-grade data pipelines and architectures, steering a small team of data engineers within a fast-paced, Agile environment.

Expectations:
- Supervise and mentor 3–5 data engineers, fostering collaboration and continuous improvement.
- Champion data quality, reliability, and performance across all pipeline stages.
- Ensure alignment with cloud infrastructure best practices and regulatory requirements.

Key Responsibilities:
- Design, implement, and optimize end-to-end data pipelines that integrate Snowflake, Azure Data Factory, and Azure DevOps.
- Conduct code reviews, enforce coding standards, and promote reusable, modular architecture.
- Collaborate with data scientists, analysts, and product stakeholders to translate business requirements into scalable data solutions.
- Monitor pipeline performance, troubleshoot issues, and implement automated alerting and remediation.
- Lead continuous improvement initiatives for data tooling, documentation, and process efficiency.
- Manage deployment pipelines, versioning, and release cycles in a DevOps-driven workflow.
- Keep up to date with industry trends, emerging technologies, and best practices in data engineering.

Required Skills:
- Advanced SQL programming and experience with Snowflake.
- Proficiency in the Azure ecosystem (Azure Data Factory, Azure DevOps, Azure Storage).
- Strong programming background (Python, plus familiarity with Java or C++).
- Proven experience managing small data engineering teams in Agile environments.
- Excellent problem-solving, communication, and stakeholder management skills.
- Demonstrated ability to design scalable, fault-tolerant data pipelines.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, Information Systems, or a related field.
- Professional certifications such as Microsoft Certified: Azure Data Engineer Associate or Snowflake SnowPro Core are highly desirable.
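The listing above asks for automated alerting and data-quality checks across pipeline stages. As a minimal, hypothetical sketch of that pattern in plain Python (field names `user_id` and `amount` and the 5% null threshold are illustrative, not from the listing):

```python
def validate_batch(rows, required_fields=("user_id", "amount"), max_null_ratio=0.05):
    """Return (ok, report) for a batch of dict-shaped records.

    A stage like this can gate a load step: if too many required values
    are null, the pipeline raises an alert instead of loading bad data.
    """
    if not rows:
        return False, {"error": "empty batch"}
    null_counts = {f: 0 for f in required_fields}
    for row in rows:
        for f in required_fields:
            if row.get(f) is None:
                null_counts[f] += 1
    ratios = {f: null_counts[f] / len(rows) for f in required_fields}
    ok = all(r <= max_null_ratio for r in ratios.values())
    return ok, {"rows": len(rows), "null_ratios": ratios}
```

In a real Snowflake/Azure Data Factory setup the same check would typically run as a stored procedure or a Databricks/ADF validation activity; the logic is the same.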
London, United Kingdom
Remote
25-11-2025
Company Name
McCabe & Barton
Job Title
Head of Development
Job Description
Job Title: Head of Development

Role Summary: Interim leader responsible for directing a 10–20 engineer team to deliver robust automation, AI/ML, and software development initiatives, driving organizational efficiency and aligning technical roadmaps with business goals.

Expectations: Deliver a high-velocity, scalable technology pipeline over a 6-month engagement, ensuring measurable ROI on automation and AI projects, and establish lasting governance for coding standards, DevOps, and SRE practices.

Key Responsibilities:
- Lead, mentor, and scale a distributed development team, setting culture, standards, and backlog priorities.
- Own the end-to-end automation strategy, identifying and executing efficiency opportunities via GitHub Actions, Terraform, Ansible, and RPA tools.
- Design, implement, and integrate AI/ML solutions, managing MLOps workflows and large-language-model deployments.
- Define and enforce coding standards, architecture decisions, and technical debt reduction plans.
- Develop and communicate technical roadmaps, SLO/SLA metrics, and ROI analyses to C-suite stakeholders.
- Implement CI/CD, containerization, observability, and incident-management best practices across cloud-native microservices.
- Facilitate cross-functional workshops, change-management initiatives, and vendor/tool evaluations.

Required Skills:
- Proven track record leading engineering teams of 10–20+ members.
- Deep expertise in automation (GitHub Actions, Terraform, Ansible, Airflow, Prefect, RPA).
- Full-stack development (Python, JavaScript/TypeScript, Java, Go) with modern frameworks (React, Node.js, Django, FastAPI).
- Microservices, API design (REST/GraphQL), Docker, Kubernetes, and cloud-native patterns.
- DevOps & SRE: CI/CD pipelines, infrastructure monitoring, log aggregation, SLO/SLA, incident response.
- AI/ML: MLOps tools, LLM integration, proof-of-concept development.
- Strategic business acumen: technical roadmaps, cost-benefit analysis, ROI presentation.
- Strong stakeholder management, communication, and organizational change skills.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Certifications: Certified Kubernetes Administrator (CKA) or similar; CI/CD and deployment tooling certifications; cloud provider certifications (AWS, GCP, Azure) preferred.
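The role above calls for defining SLO/SLA metrics and reporting them to stakeholders. As a minimal sketch of the standard error-budget calculation behind an availability SLO (the 99.9% target and request counts below are hypothetical examples, not figures from the listing):

```python
def error_budget_remaining(slo_target, good, total):
    """Fraction of the error budget left for a request-based availability SLO.

    slo_target: e.g. 0.999 for "99.9% of requests succeed".
    good/total: observed successful and total requests in the window.
    The budget is the number of failures the SLO tolerates; once actual
    failures exceed it, the remaining fraction is clamped to 0.
    """
    budget = (1 - slo_target) * total  # failures the SLO allows
    bad = total - good                 # failures actually observed
    return max(0.0, (budget - bad) / budget)

# 100,000 requests at a 99.9% SLO allow 100 failures; 50 observed
# failures leave half the budget.
remaining = error_budget_remaining(0.999, 99_950, 100_000)  # 0.5
```

Teams commonly gate release velocity on this number: a healthy remaining budget permits faster shipping, an exhausted one triggers a reliability focus.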
London, United Kingdom
On site
25-11-2025
Company Name
McCabe & Barton
Job Title
Data Engineer
Job Description
Job Title: Data Engineer

Role Summary: Design, build, and maintain scalable cloud-based data infrastructure using Azure and Databricks. Ensure data pipelines, architecture, and analytics environments are reliable, performant, and secure.

Expectations:
• Deliver high-quality data solutions that support business analytics and data science initiatives.
• Maintain the uptime, performance, and security of data services.
• Collaborate with cross-functional teams to understand data requirements and priorities.

Key Responsibilities:
• Design and implement data pipelines in Azure Data Factory, Databricks, and related Azure services.
• Develop ETL/ELT processes to transform raw data into analytics-ready formats.
• Optimize pipeline performance and ensure high availability.
• Architect and deploy scalable data lake solutions using Azure Data Lake Storage.
• Apply governance, security, and compliance controls across the platform.
• Use Terraform or equivalent IaC tools for reproducible deployments.
• Develop and tune PySpark/Scala jobs within Databricks; implement medallion architecture and Delta Lake.
• Manage cluster configurations and CI/CD pipelines for Databricks deployments.
• Implement monitoring with Azure Monitor, Log Analytics, and Databricks tooling; enforce SLAs and manage disaster recovery.
• Collaborate with data scientists, analysts, and stakeholders; document technical designs, data flows, and procedures.

Required Skills:
• 5+ years' experience with Azure services (Data Factory, ADLS, SQL Database, Synapse Analytics).
• Hands-on expertise in Databricks, Delta Lake, and cluster management.
• Proficiency in SQL and Python for pipeline development.
• Familiarity with Git/GitHub, CI/CD workflows, and data modeling, governance, and security principles.
• Experience with Terraform, Azure DevOps, or similar IaC/CI-CD tools.
• Knowledge of data quality frameworks and testing practices.

Required Education & Certifications:
• Bachelor's degree or equivalent in Computer Science, Information Technology, or a related field.
• Azure Data Engineer or Databricks certifications preferred.
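The listing above mentions implementing a medallion architecture, where raw "bronze" data is cleaned into a curated "silver" layer. In Databricks this would be a PySpark job over Delta tables; as a plain-Python sketch of the same bronze-to-silver step (the `id` and `ingested_at` field names are hypothetical examples):

```python
def bronze_to_silver(records):
    """Promote raw bronze records to a silver layer.

    Cleaning rules in this sketch: drop records missing the business
    key 'id', then deduplicate by keeping the latest record per key
    (highest 'ingested_at'), returning rows sorted by key.
    """
    latest = {}
    for rec in records:
        rid = rec.get("id")
        if rid is None:
            continue  # silver layer rejects rows without a key
        prev = latest.get(rid)
        if prev is None or rec["ingested_at"] > prev["ingested_at"]:
            latest[rid] = rec
    return sorted(latest.values(), key=lambda r: r["id"])
```

The equivalent Spark job would express the same dedup-by-latest logic with a window over `id` ordered by `ingested_at`, writing the result to a silver Delta table.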
London, United Kingdom
On site
Mid level
03-12-2025