Job Specifications
I am hiring for an AWS Data Engineer (Python/Spark/Java/Kafka/Flink/CI-CD/Docker/Kubernetes/IaC/Banking/Fintech)
Location: London, UK (2-3 days weekly onsite)
Job Description:
We are seeking an experienced AWS Data Engineer to join our team at Barclays. The ideal candidate will be responsible for designing, developing, and maintaining cloud-based data solutions using AWS services. This role involves working in cross-functional teams to deliver scalable and efficient data pipelines and architectures.
Key Responsibilities:
Design, develop, and maintain AWS cloud data solutions using services like EC2, Lambda, Glue, S3, ECS, Redshift, and Kinesis.
Develop and maintain data pipelines using Python, Spark, SQL, Java (Spring Boot), and microservices.
Implement CI/CD pipelines and containerized solutions using Docker, Kubernetes, GitLab, Bitbucket, and Jenkins.
Perform data modeling, work with RDBMS/NoSQL databases, and design data warehouse architectures.
Apply Infrastructure as Code principles using Terraform or CloudFormation and build serverless architectures for data processing.
Collaborate with cross-functional teams and support BI tools such as Power BI, Tableau, or SAP BO for reporting and analytics.
Key Skills:
AWS, EC2, Lambda, Glue, S3, ECS, Redshift, Kinesis, Python, Spark, Java, Spring Boot, Microservices, SQL, Kafka, Flink, CI/CD, Docker, Kubernetes, GitLab, Bitbucket, Jenkins, Terraform, CloudFormation, Power BI, Tableau, SAP BO
About the Company
Welcome to GIOS: Your Partner in Innovation.
At GIOS Technology, we are committed to driving innovation and delivering exceptional solutions to clients worldwide. With a global presence spanning multiple countries, we cater to both private and public sector clients, providing comprehensive support and executing projects that meet their diverse needs.
Our strength is built on collaboration: we harness our internal expertise and partner with leading technology vendors such as Elastic, Qlik, AWS, Google, Microsoft, Orac...