Gazelle Global

www.gazellegc.com

11 Jobs

27 Employees

About the Company

Gazelle Global is an international recruitment consultancy established in 2011 through a unique partnership of IT implementation know-how and headhunting expertise, with a vision to create a recruitment agency that delivers a best-in-class service whilst being a thoughtful and conscientious employer.

Our London HQ, in collaboration with our offices in the Netherlands, Germany and Poland, serves clients and candidates across the UK and Europe. We pride ourselves on speed, transparency and user experience: agility, at scale.

Gazelle Global has successfully grown across several sectors, including banking & financial services, telecommunications, pharmaceuticals, healthcare, manufacturing, oil & gas and retail, and is internationally experienced across all major sectors and industries.

Our recruitment experts specialise in recruiting high-level talent for cloud, digital, e-commerce, finance, banking, actuarial/data science, fintech, security, risk, data/BI, DevOps, infrastructure, machine learning, AI, ERP/CRM, sustainability and clean tech, to name a few. Our clients, candidates and consultants span the globe; we act as an international conduit, reducing the cost of logistics and bridging language barriers.

Furthermore, we pride ourselves on our mission and commitment to DEI&B. Our diverse leadership team, workforce and candidate base bring an elevated responsibility to create a community and business that is representative of our clients, candidates and colleagues. Through our DEI&B strategy and initiatives, we demonstrate that commitment by building a community that is diverse, inclusive and inspires a sense of belonging.

Listed Jobs

Company Name
Gazelle Global
Job Title
Principal Amazon Connect Architect
Job Description
Job Title: Principal Amazon Connect Architect

Role Summary: Lead the strategy, design, and delivery of large-scale Amazon Connect contact centre solutions, driving migration from legacy platforms and ensuring security, compliance, and operational excellence across multi-region, high-availability deployments.

Expectations: Own end-to-end architecture, secure integration, cost-efficient scaling, and cross-functional stakeholder alignment; produce audit-ready artefacts and meet strict regulatory standards (GDPR, PCI-DSS, ISO 27001).

Key Responsibilities:
- Define and implement a multi-region, highly available Amazon Connect architecture (voice, chat, tasks) following the AWS Well-Architected Framework and enterprise security policies.
- Lead migration from Avaya, Cisco, or Genesys to Amazon Connect, designing IVR flows, Lex chatbots, Contact Lens analytics, and agent-assist workflows.
- Architect workforce and quality management tools, integrating with Salesforce, ServiceNow, Dynamics, Pega, and internal systems.
- Build API-driven, serverless solutions using Lambda, API Gateway, EventBridge, Kinesis, DynamoDB, Glue, and Athena; enable real-time analytics and streaming data pipelines.
- Implement IAM, KMS, VPC, PrivateLink, encryption, and secure networking; maintain GDPR/UK GDPR, PCI-DSS, and ISO 27001 compliance, PIA/DPIA processes, and data retention policies.
- Establish CI/CD pipelines with CloudFormation, Terraform, or CDK; enforce shift-left testing, versioning, and automation for contact flows and Lambda functions.
- Define and track KPIs/SLAs (AHT, CSAT, abandonment rate); conduct capacity planning, resilience testing, and incident response.
- Report to executive stakeholders, build business cases, and present governance documentation.

Required Skills:
- 15+ years of architecture/engineering experience, including 10+ years with enterprise contact centre platforms and 5+ years with Amazon Connect.
- Deep expertise in AWS services: IAM, VPC, Lambda, API Gateway, EventBridge, CloudWatch, DynamoDB, Kinesis, Glue, Athena, and the Serverless Application Model (SAM).
- IaC proficiency with CloudFormation, Terraform, or CDK; serverless development in Node.js or Python.
- Sound knowledge of security, compliance, and regulated environments (healthcare, public sector).
- Strong communication and stakeholder management skills.
- Ability to produce audit-ready artefacts and data protection compliance frameworks.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- AWS Certified Solutions Architect – Professional (preferred).
- Amazon Connect specialist certification (preferred).
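For a flavour of the serverless integration work this role describes, here is a minimal sketch of a Lambda function invoked from an Amazon Connect contact flow, looking the caller up in a hypothetical DynamoDB table (the table and attribute names are invented for illustration):

```python
import os

import boto3

# Hypothetical table of caller profiles, keyed by phone number.
TABLE_NAME = os.environ.get("PROFILE_TABLE", "caller-profiles")
dynamodb = boto3.resource("dynamodb")


def lambda_handler(event, context):
    """Invoked by an Amazon Connect contact flow.

    Connect passes contact data under event["Details"]["ContactData"];
    the classic response contract is a flat map of string keys/values,
    which the flow reads back as external attributes.
    """
    contact = event["Details"]["ContactData"]
    caller = contact["CustomerEndpoint"]["Address"]  # E.164 phone number

    item = dynamodb.Table(TABLE_NAME).get_item(
        Key={"phoneNumber": caller}
    ).get("Item")

    if item is None:
        return {"customerKnown": "false"}

    # Keep the response flat: nested structures are rejected here.
    return {
        "customerKnown": "true",
        "customerName": str(item.get("name", "")),
        "segment": str(item.get("segment", "default")),
    }
```

A contact flow branching on `customerKnown` can then route recognised customers straight to an agent queue with these attributes attached.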
Glasgow, United Kingdom
Hybrid
Senior
31-10-2025
Company Name
Gazelle Global
Job Title
Azure Cosmos DB Developer
Job Description
Job Title: Azure Cosmos DB Developer

Role Summary: Hands-on contractor responsible for designing, developing, and optimising cloud-native, high-performance applications that process real-time financial data using Azure Cosmos DB (SQL API & MongoDB API).

Expectations:
- Deliver secure, scalable data solutions for financial services.
- Build and maintain CI/CD pipelines and continuous integration practices.
- Adhere to security, governance, and data-protection standards.

Key Responsibilities:
- Architect applications with Azure Cosmos DB (partitioning, indexing, consistency).
- Develop reusable libraries in C#/.NET or Node.js.
- Build and maintain CI/CD pipelines with Azure DevOps.
- Monitor and tune Cosmos DB performance (Azure Monitor, Application Insights).
- Implement automated testing and unit test frameworks.
- Collaborate with solution architects, DevOps, and microservices teams.

Required Skills:
- Advanced knowledge of Azure Cosmos DB (query optimisation, throughput management).
- Expertise in concurrency patterns, the CLR, and scalable application design.
- Experience with Azure services: Functions, App Services, AKS, Logic Apps.
- Strong programming in C#/.NET or Node.js.
- Proficiency with Git, version control, and CI/CD tools.
- Understanding of distributed systems, NoSQL data modelling, and security compliance.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Azure certification(s) (e.g., AZ-900, DP-203, or AZ-300/AZ-301) preferred.
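The role targets C#/.NET or Node.js, but the Cosmos DB fundamentals it lists (partitioning, throughput, single-partition queries) are language-neutral; here is a minimal sketch using the azure-cosmos Python SDK, with placeholder endpoint, key, and container names:

```python
import os

from azure.cosmos import CosmosClient, PartitionKey

# Placeholder connection settings, supplied via environment in practice.
client = CosmosClient(
    url=os.environ["COSMOS_ENDPOINT"],
    credential=os.environ["COSMOS_KEY"],
)

db = client.create_database_if_not_exists("trading")

# Partition by instrument so hot query paths stay single-partition;
# 400 RU/s is the minimum provisioned throughput.
container = db.create_container_if_not_exists(
    id="ticks",
    partition_key=PartitionKey(path="/instrument"),
    offer_throughput=400,
)

container.upsert_item({
    "id": "tick-0001",
    "instrument": "EURUSD",
    "price": 1.0842,
    "ts": "2025-11-19T09:30:00Z",
})

# Parameterised query scoped to one partition: no cross-partition
# fan-out, so the RU cost stays predictable.
ticks = container.query_items(
    query="SELECT * FROM c WHERE c.instrument = @inst",
    parameters=[{"name": "@inst", "value": "EURUSD"}],
    partition_key="EURUSD",
)
for tick in ticks:
    print(tick["id"], tick["price"])
```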
London, United Kingdom
On site
19-11-2025
Company Name
Gazelle Global
Job Title
Data Engineer (PySpark)
Job Description
Job Title: Data Engineer (PySpark)

Role Summary: Build, optimise, and maintain batch and streaming data pipelines in Azure environments, delivering reliable data assets for analytics and reporting across a financial markets infrastructure organisation.

Expectations: Deliver high-quality, performance-optimised data flows that meet business and compliance requirements; collaborate with analysts and stakeholders to translate analytical needs into technical solutions; document and support knowledge sharing.

Key Responsibilities:
- Design, develop, and optimise scalable Spark pipelines (batch & streaming) using DataFrames, RDDs, and Spark SQL.
- Build semantic models and dataflows aligned with analytics and reporting demands.
- Apply data validation, cleansing, profiling, and quality assurance to ensure accuracy and consistency.
- Implement access controls, data masking, and security protocols compliant with governance and regulatory standards.
- Tune workload performance across Spark, Microsoft Fabric, and Azure services.
- Translate business requirements into technical solutions in partnership with analysts and stakeholders.
- Maintain clear documentation and contribute to internal knowledge repositories.

Required Skills:
- Strong experience developing on Microsoft Azure and Microsoft Fabric.
- Proficiency in Spark programming (DataFrames, RDDs, Spark SQL) and Python/PySpark development, including notebook-based workflows.
- Hands-on experience with Spark streaming and batch processing.
- Delta Lake optimisation and Fabric Spark job development.
- Solid Java programming knowledge and OOP fundamentals.
- Experience with relational and NoSQL databases.
- Familiarity with GitLab, unit testing, and CI/CD pipelines.
- Strong troubleshooting skills and experience working in Agile environments.
- Excellent stakeholder communication and collaboration.
- Practical knowledge of ETL workflows, lakehouse architectures, dataflows, and semantic models.
- Exposure to time-series data, financial market feeds, transactional records, and risk-related datasets.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (preferred).
- Relevant Azure certifications (e.g., Azure Data Engineer Associate, Azure Databricks Specialist) are a plus.
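As a rough sketch of the pipeline work described above (DataFrame transforms, validation, a partitioned Delta write), assuming a runtime where Delta Lake is available, as it is by default in Fabric and Databricks; the paths and schema are invented for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trade-ingest").getOrCreate()

# Hypothetical raw trade feed landed as JSON files.
raw = spark.read.json("abfss://landing@lake.dfs.core.windows.net/trades/")

# Cleansing and validation: drop incomplete rows, normalise types,
# and reject non-positive prices before the data reaches reporting.
trades = (
    raw.dropna(subset=["trade_id", "instrument", "price"])
       .withColumn("price", F.col("price").cast("double"))
       .withColumn("trade_ts", F.to_timestamp("trade_ts"))
       .filter(F.col("price") > 0)
)

# Append to a Delta table partitioned by instrument, a common
# lakehouse layout for time-series market data.
(
    trades.write.format("delta")
          .mode("append")
          .partitionBy("instrument")
          .save("abfss://curated@lake.dfs.core.windows.net/trades_delta")
)
```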
London, United Kingdom
Hybrid
19-11-2025
Company Name
Gazelle Global
Job Title
Collibra Developer
Job Description
**Job Title:** Collibra Developer

**Role Summary:** Lead the implementation and governance of Collibra within a large enterprise. Design the Collibra operating model, engineer data lineage integrations, deploy Collibra Edge, and automate metadata ingestion to ensure the platform serves as the trusted backbone for business data.

**Expectations:**
- Shape and enforce Collibra operating processes across the organisation.
- Drive end-to-end data lineage, catalog enrichment, and governance capabilities.
- Collaborate with ETL, database, BI, and cloud teams to integrate Collibra with enterprise systems.

**Key Responsibilities:**
1. Design Collibra data models, including domains, communities, workflows, BPMN diagrams, Groovy scripts, and Java APIs.
2. Engineer technical lineage integration with ETL/ELT tools, database systems, BI platforms, and cloud services.
3. Deploy and configure Collibra Edge for real-time metadata harvesting.
4. Develop and maintain API/SDK automation for metadata ingestion and catalog enrichment.
5. Execute complex SQL queries against enterprise databases and use them for data governance tasks.
6. Provide expertise on reference data and business glossary governance.
7. Work with AWS, GCP, or Azure environments to ensure seamless integration and data flow.

**Required Skills:**
- Deep knowledge of Collibra (model design, domains, communities, workflows, BPMN, Groovy, Java APIs).
- Experience with technical lineage integration across ETL, databases, BI, and cloud ecosystems.
- Proficient in Collibra Edge deployment and metadata harvesting.
- Strong API/SDK development skills for metadata automation.
- Advanced SQL skills and familiarity with enterprise databases.
- Exposure to ETL/ELT and BI tooling.
- Cloud experience (AWS, GCP, Azure).
- Understanding of reference data and business glossary governance.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent practical experience).
- Collibra certification or equivalent data governance certification preferred.
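To give a flavour of the API-driven metadata automation listed above, a minimal sketch against Collibra's REST 2.0 assets endpoint; the host, credentials, and domain/type UUIDs are placeholders, and the request shape should be verified against your Collibra version's API reference:

```python
import os

import requests

# Placeholder connection details for a Collibra DGC instance.
BASE = "https://your-instance.collibra.com/rest/2.0"
AUTH = (os.environ["COLLIBRA_USER"], os.environ["COLLIBRA_PASSWORD"])

# Hypothetical UUIDs for the target domain and the "Table" asset type.
DOMAIN_ID = "00000000-0000-0000-0000-000000000001"
TABLE_TYPE_ID = "00000000-0000-0000-0000-000000000031"


def register_table(name: str) -> dict:
    """Create a catalog asset representing a physical table."""
    resp = requests.post(
        f"{BASE}/assets",
        json={"name": name, "domainId": DOMAIN_ID, "typeId": TABLE_TYPE_ID},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


# Ingest a batch of table names harvested from a source system.
for table in ["sales.orders", "sales.customers"]:
    asset = register_table(table)
    print(asset["id"], asset["name"])
```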
South Yorkshire, United Kingdom
Hybrid
20-11-2025