N Consulting Global

Data Engineer

Hybrid

Glasgow, United Kingdom

Senior

Freelance

25-02-2026

Skills

Python, SQL, Data Engineering, Apache Spark, PySpark, Data Architecture, AWS, Snowflake, Analytics

Job Specifications

Role: Data Engineer

Location: Glasgow, UK

Work Mode: Hybrid (3 Days from Office)

Contract Role: 6 months

Experience: 10+ years

Start Date: Immediate joiners only, or candidates with a maximum 2-3 weeks' notice period

No Visa Sponsorship

Must-have skills: AWS cloud ecosystem, Snowflake, Python, Apache Spark, banking domain experience

Role Overview

We are seeking an experienced Data Engineer with strong expertise in the AWS cloud ecosystem, Snowflake, Python, and Apache Spark, along with proven experience in the banking domain. The ideal candidate will be responsible for designing, developing, and optimizing scalable data pipelines and modern data platforms that support analytics, reporting, and regulatory requirements.

________________________________________

Key Responsibilities

Design, build, and maintain scalable data pipelines using AWS services and modern data engineering practices.

Develop and optimize ETL/ELT workflows using Python and Apache Spark.

Implement and manage Snowflake data warehouse solutions including data modeling, performance tuning, and optimization.

Work closely with business stakeholders, data analysts, and architects to understand banking data requirements.

Integrate data from multiple banking systems such as payments, transactions, customer, and risk platforms.

Ensure data quality, governance, security, and compliance aligned with banking regulations.

Develop data ingestion frameworks for structured and semi-structured data.

Optimize data processing performance and cost efficiency within AWS environments.

Support real-time and batch data processing solutions.

Document data architecture, data flows, and technical processes.

________________________________________

Required Skills & Qualifications

10+ years of experience in Data Engineering.

Strong hands-on experience with AWS services (S3, Glue, Lambda, Redshift, EMR, Athena, Step Functions, IAM).

Extensive experience with Snowflake including schema design and performance tuning.

Strong programming skills in Python.

Hands-on experience with Apache Spark / PySpark.

Experience building ETL/ELT pipelines and data integration frameworks.

Strong SQL and data modeling skills.

Experience working with large-scale datasets.

About the Company

N Consulting is a global professional services company with leading capabilities in data analytics, digital, cloud, and security. With experience and specialized skills across industry segments, we offer Consulting, IT Services, Technology, and Operations services across the globe. The quality of our people, who deliver on time for clients across America, Europe, and Asia, is key to our success. We embrace the power of change to create value and sustainable communities. Visit us at www.n-cons.co.uk.