Athsai

www.athsai.co.uk

1 Job

5 Employees

About the Company

Welcome to Athsai, for hiring in the technical and automotive sectors.

Our Journey:

Athsai emerged from a shared vision to bridge the gap between exceptional tech professionals and companies on the cutting edge of innovation. We recognised that the most groundbreaking solutions arise when diverse minds collaborate, and that’s where our journey began.

Our Mission:

Our mission is simple yet transformative: To propel the tech industry forward by connecting the brightest minds with groundbreaking opportunities. We’re not just matching skills to job descriptions; we’re cultivating relationships that fuel growth, inspire innovation, and reshape industries.

Our Approach:

Talent-Centric:
Partnership Mindset:
Innovation-Focused:
Long-Term Impact:

Our Commitment:

Tech Enthusiasts: For tech enthusiasts, we offer a gateway to opportunities that align with their skills, passions, and ambitions. We’re committed to nurturing their growth and helping them forge meaningful careers.
Innovative Companies: For innovative companies, we’re a source of top-tier talent that fits seamlessly into your culture and projects. Our commitment to excellence ensures that we deliver nothing less than exceptional matches.

Join Us in Shaping Tomorrow:

As we continue to write our story, we invite you to be a part of it. Whether you’re a tech professional seeking new horizons or a company ready to redefine industries, Athsai is the platform where your journey begins.

Thank you for considering Athsai as your partner in innovation. Together, let’s redefine what’s possible in the world of technology.

Listed Jobs

Company Name: Athsai
Job Title: Data Engineer AWS
Job Description:
Job Title: Senior Data Engineer – AWS (PySpark)

Role Summary:
Architect, develop and optimise large-scale data pipelines using PySpark, Python and AWS services to support analytics, reporting and machine learning across batch and real-time environments.

Expectations:
6+ years of data engineering experience; strong in Python, PySpark and SQL; adept with AWS cloud services, Terraform and CI/CD; and a proven ability to design secure, scalable ETL/ELT workflows.

Key Responsibilities:
- Design and build scalable PySpark/Python data pipelines for high-volume processing.
- Develop and manage ETL/ELT workflows, ensuring data accuracy and performance.
- Orchestrate workflows with Apache Airflow (scheduling, dependencies, failure handling).
- Architect cloud-native solutions on AWS (API Gateway, Lambda, Redshift, Glue, EMR, S3, CloudWatch, IAM).
- Provision AWS infrastructure as code with Terraform.
- Build and maintain CI/CD pipelines using GitHub Actions.
- Optimise Spark jobs, tune performance and troubleshoot distributed systems.
- Collaborate cross-functionally on data architecture, governance and best practices.

Required Skills:
- Python, PySpark, SQL (production-grade code).
- Apache Spark internals, Apache Airflow.
- ETL pipeline design and implementation.
- AWS services (Lambda, API Gateway, Redshift, Glue, EMR, S3, CloudWatch, IAM).
- Terraform for IaC.
- GitHub Actions for CI/CD.
- Troubleshooting and performance tuning of distributed jobs.
- Strong communication and collaboration.
- Optional: Kafka/Kinesis streaming, Docker/Kubernetes, data governance frameworks.

Required Education & Certifications:
- Bachelor’s degree in Computer Science, Engineering, Data Science or a related field (or equivalent experience).
- AWS certifications (e.g., Solutions Architect, Data Analytics) preferred.

(An illustrative PySpark sketch of this kind of pipeline work follows the listing details below.)
England, United Kingdom
Remote
Mid level
29-12-2025
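
The responsibilities above describe PySpark/AWS pipeline work in general terms rather than a specific system, so the following is only a minimal illustrative sketch of the kind of batch ETL job the listing implies. The dataset, column names, S3 paths and aggregation logic are invented for the example and are not part of the vacancy; on AWS a script like this would typically be submitted to EMR or Glue and scheduled from Airflow.

from pyspark.sql import SparkSession, functions as F


def build_session(app_name: str = "athsai-etl-sketch") -> SparkSession:
    # Local session for illustration; on EMR or Glue the cluster supplies the real config.
    return SparkSession.builder.appName(app_name).getOrCreate()


def transform(orders):
    # Basic cleansing followed by a daily revenue aggregate per country.
    cleaned = (
        orders
        .dropDuplicates(["order_id"])
        .filter(F.col("amount") > 0)
        .withColumn("order_date", F.to_date("order_ts"))
    )
    return (
        cleaned
        .groupBy("order_date", "country")
        .agg(
            F.sum("amount").alias("daily_revenue"),
            F.countDistinct("order_id").alias("order_count"),
        )
    )


def run(spark: SparkSession, source_path: str, target_path: str) -> None:
    # Read raw Parquet, transform, and write the result partitioned by date.
    orders = spark.read.parquet(source_path)
    daily = transform(orders)
    daily.write.mode("overwrite").partitionBy("order_date").parquet(target_path)


if __name__ == "__main__":
    spark = build_session()
    # Placeholder paths; a production job would take these from arguments or an Airflow task.
    run(
        spark,
        "s3://example-bucket/raw/orders/",
        "s3://example-bucket/curated/daily_revenue/",
    )
    spark.stop()

Keeping the transform as a pure DataFrame-to-DataFrame function makes it easy to unit-test locally before the job is deployed onto Terraform-provisioned infrastructure via a GitHub Actions pipeline, in line with the tooling the listing names.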