CommuniTech Recruitment Group

www.communitech.co.uk

4 Jobs

1 Employee

About the Company

CommuniTech are an exciting name in Tech Recruitment, seamlessly connecting the client and candidate communities to deliver exceptional technical talent to tech-driven companies, ensuring that together they thrive, exceed, and achieve.

By striving to intertwine these communities, we get to know our clients and candidates better than ever before, providing recruitment solutions that deliver an individual experience tailored to your needs.

Listed Jobs

Company Name
CommuniTech Recruitment Group
Job Title
Data Developer. C# + one of ClickHouse, SingleStore, Rockset, or TimescaleDB + open-standard data lake (e.g. Iceberg or Delta tables, Apache Spark, column store). £700/day. 6-month rolling. Hybrid.
Job Description
**Job title**
Data Developer

**Role Summary**
Build and maintain high-performance analytical data pipelines and storage using C# and a modern analytical database (ClickHouse, SingleStore, Rockset, TimescaleDB). Enable ingestion, processing, and analysis of large data volumes into open-standard data lake formats (Iceberg, Delta, Parquet) via Apache Spark or column-store solutions. Deploy containerised services and maintain CI/CD pipelines in an Agile environment.

**Expectations**
- Deliver stable, scalable data services for commodity trading analytical workloads.
- Apply best-practice software design (SOLID, IoC, automated testing, logging, monitoring).
- Collaborate with DevOps, security, and data-engineering teams.
- Ensure code quality through automated CI/CD, unit/integration tests, and code reviews.
- Contribute to continuous improvement of data architecture and tooling.

**Key Responsibilities**
- Design, develop, and maintain C# backend services for data ingestion, transformation, and storage.
- Write and optimise queries for the chosen analytical database; manage schema and performance tuning.
- Implement data lake ingestion pipelines using tools such as Apache Spark with open-standard formats like Iceberg or Delta Lake (see the sketch after this description).
- Containerise applications (Docker, Kubernetes) and manage deployments.
- Integrate authentication (OAuth2) and secure data access.
- Develop and maintain automated CI/CD pipelines in Azure DevOps/TFS/Git.
- Participate in Agile ceremonies (Scrum) and sprint planning.
- Monitor system health, capacity, and performance; respond to incidents.
- Document architecture, processes, and code conventions.

**Required Skills**
- 3+ years of commercial .NET C# backend development.
- Proficient in SQL Server and one or more analytical databases (ClickHouse, SingleStore, Rockset, TimescaleDB).
- Experience with containerised application development (Docker, Kubernetes).
- Strong knowledge of SOLID, IoC, unit testing frameworks, logging, and monitoring.
- Familiarity with open-standard data formats (Parquet, Iceberg, Delta) and big-data tools (Apache Spark).
- Experience with Azure DevOps/TFS, Git, continuous integration, automated testing, and deployment pipelines.
- Understanding of authentication protocols (OAuth2).
- Agile/Scrum development experience.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Relevant certifications (e.g., Microsoft Certified: Azure Developer Associate, Certified Scrum Developer) preferred but not mandatory.
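To make the data lake ingestion responsibility concrete, here is a minimal PySpark sketch that appends a batch of raw trade events to a Delta table. It is an illustration, not part of the advertised stack's codebase: the session config is the standard open-source Delta Lake setup, and the paths, table, and column names are hypothetical. The role's own services would be written in C#; this shows only the open-standard lake write the Spark tooling handles.

```python
# Illustrative batch ingestion into a Delta table with PySpark.
# Paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("trade-ingestion-sketch")
    # Standard open-source Delta Lake session configuration.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read one day of raw trade events from the landing zone (hypothetical path).
raw = spark.read.parquet("/landing/trades/2025-10-29/")

# Light cleanup: drop malformed rows, normalise the timestamp column name.
clean = (
    raw.dropna(subset=["trade_id", "ts"])
       .withColumnRenamed("ts", "event_time")
)

# Append to the curated table; Delta provides ACID writes and schema
# enforcement, so concurrent readers never see partial batches.
clean.write.format("delta").mode("append").saveAsTable("curated.trades")
```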
London, United Kingdom
Hybrid
29-10-2025
Company Name
CommuniTech Recruitment Group
Job Title
C# Data-Centric Developer with experience of real-time analytical databases (non-relational). 6-month rolling contract. £700/day inside IR35. Hybrid, 2 days in Central London.
Job Description
**Job title**
C# Data Centric Developer

**Role Summary**
Deliver backend solutions in .NET C# focused on real-time analytical databases. Build and maintain data ingestion pipelines, enable analytics workflows, and ensure high-quality, maintainable code on an agile contract basis.

**Expectations**
- 6-month rolling contract, £700/day, inside IR35.
- Hybrid schedule: 2 days onsite in Central London, remainder remote.
- Active participation in Scrum ceremonies and continuous delivery pipelines.

**Key Responsibilities**
- Design, develop, and deploy C# backend services targeting real-time analytical databases (ClickHouse, SingleStore, Rockset, TimescaleDB); a sketch of a typical insert-and-query round trip follows this description.
- Implement data ingestion pipelines that load large batch and streaming data volumes into analytical stores or open-standard data lakes (Iceberg, Delta, Parquet).
- Containerise applications (Docker) and manage deployments on cloud or on-premises platforms.
- Apply SOLID principles, dependency injection, unit/integration testing, and automated CI/CD frameworks.
- Configure and maintain monitoring, logging, and performance profiling for services.
- Collaborate with data engineers, analysts, and product owners to define data models and API contracts.

**Required Skills**
- 3+ years of commercial .NET C# development (backend).
- Strong experience with SQL Server and at least one non-relational analytical database (ClickHouse, SingleStore, Rockset, TimescaleDB).
- Proficiency in containerised application development (Docker) and deployment pipelines.
- Familiarity with data lake formats (Parquet, Iceberg, Delta) and Spark or similar column-store processing.
- Solid understanding of test automation, IoC, SOLID, logging, and monitoring.
- Knowledge of CI/CD (Azure DevOps or TFS), Git, and automated testing.
- Experience with authentication protocols (OAuth2) is a plus.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Software Engineering, or a related technical field, or equivalent work experience.
- No specific certifications required; relevant Microsoft, Azure, or database vendor certifications are advantageous.
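As a flavour of the real-time analytical database work, here is a minimal insert-and-query round trip against ClickHouse. It is sketched in Python with the clickhouse-connect client purely for brevity; in the role itself the equivalent calls would come from a C# service through a .NET ClickHouse driver. Host, table, and columns are hypothetical.

```python
# Illustrative insert-and-query round trip against ClickHouse using the
# Python clickhouse-connect client. Host, table, and columns are hypothetical.
from datetime import datetime, timezone

import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost", port=8123)

# MergeTree is ClickHouse's workhorse engine for append-heavy,
# time-ordered analytical data.
client.command(
    """
    CREATE TABLE IF NOT EXISTS ticks (
        symbol LowCardinality(String),
        price  Float64,
        ts     DateTime64(3)
    ) ENGINE = MergeTree ORDER BY (symbol, ts)
    """
)

# Batch insert: columnar stores prefer fewer, larger inserts.
now = datetime.now(timezone.utc)
client.insert(
    "ticks",
    [["GAS-M1", 31.42, now], ["GAS-M1", 31.45, now]],
    column_names=["symbol", "price", "ts"],
)

# An aggregation typical of a real-time analytics workload.
result = client.query(
    "SELECT symbol, avg(price) AS avg_price FROM ticks GROUP BY symbol"
)
for row in result.result_rows:
    print(row)
```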
London, United Kingdom
Hybrid
07-11-2025
Company Name
CommuniTech Recruitment Group
Job Title
Data Engineer – Databricks Greenfield Implementation. £800/day inside IR35. 6-month rolling long-term contract. Hybrid, 2 days/week in Central London office.
Job Description
Job Title: Data Engineer – Databricks Greenfield Implementation

Role Summary:
Lead the design, implementation, and optimisation of a Lakehouse architecture on Azure Databricks for a leading energy trading firm. Drive proof-of-concept development, hybrid data integration, streaming pipelines, and data governance to enable real-time analytics and AI workloads.

Expectations:
* Deliver end-to-end Databricks solutions that meet business objectives within a 6-month contract.
* Collaborate with cross-functional teams (data science, analytics, governance) and participate in agile ceremonies.
* Maintain high data quality, lineage, security, and cost-efficiency across all pipelines.

Key Responsibilities:
* Evaluate and showcase Databricks capabilities in proof-of-concept projects.
* Design and build a scalable Lakehouse using Delta Lake, Unity Catalog, and Azure Data Lake Storage.
* Integrate on-premises data sources with Azure Databricks, addressing connectivity, security, and performance.
* Develop structured streaming pipelines (Databricks Structured Streaming, Kafka/Event Hubs); see the sketch after this description.
* Build ingestion and transformation workflows with notebooks and Spark SQL, and automate them with CI/CD.
* Implement data lineage, governance, and observability; ensure compliance with regulatory standards.
* Apply schema enforcement, validation, and error handling to maintain data quality.
* Collaborate on schema design for analytics and AI workloads.
* Participate in agile planning, stand-ups, and iterative delivery.

Required Skills:
* Proven experience with Azure Databricks (cluster configuration, workspace management, cost/performance optimisation).
* Expertise in lakehouse architecture (Delta Lake, partitioning, indexing, compaction).
* Hybrid integration knowledge (secure connectivity, data movement, performance tuning).
* Real-time pipeline development (Structured Streaming, Kafka, Event Hubs).
* Advanced transformation and lineage implementation (Databricks notebooks, Spark SQL, Unity Catalog).
* Deep understanding of Apache Spark optimisation for batch and streaming.
* Strong SQL skills; Delta Lake ACID, schema enforcement, time travel.
* Git-based version control; CI/CD in Azure DevOps or GitHub Actions.
* Azure Data Lake Storage, RBAC, Unity Catalog, and cloud security fundamentals.

Required Education & Certifications:
* Bachelor's degree in Computer Science, Engineering, or a related field.
* Relevant certifications such as Azure Data Engineer Associate, Databricks Certified Associate Developer for Apache Spark, and/or Certified Data Professional (CDP) are advantageous.
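For a flavour of the streaming work described above, the following is a minimal Structured Streaming sketch: read events from Kafka (Event Hubs exposes a Kafka-compatible endpoint), parse the JSON payload, and append to a bronze Delta table. Broker, topic, schema, and table names are hypothetical; in a Databricks notebook the `spark` session is predefined.

```python
# Illustrative Kafka -> Delta streaming pipeline (bronze layer).
# Broker, topic, schema, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

payload_schema = StructType([
    StructField("meter_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "meter-readings")
    .load()
)

# Kafka delivers raw bytes; decode and parse the JSON payload.
parsed = (
    raw.select(from_json(col("value").cast("string"), payload_schema).alias("r"))
       .select("r.*")
)

# The checkpoint gives the pipeline exactly-once semantics into Delta.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/chk/meter-readings")
    .outputMode("append")
    .toTable("bronze.meter_readings")
)
```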
London, United Kingdom
Hybrid
25-01-2026
Company Name
CommuniTech Recruitment Group
Job Title
Databricks Technical Lead. Fintech. Up to £1000/day inside IR35. Greenfield project. 6-month rolling contract. Hybrid, 3 days a week in Central London office.
Job Description
**Job Title**
Databricks Technical Lead

**Role Summary**
Lead the design, build, and delivery of a greenfield Lakehouse platform on Azure Databricks for a fintech client. Provide technical ownership of the Azure Databricks implementation, from proof-of-concept to production, ensuring scalability, cost efficiency, and compliance. Mentor the data engineering team, contribute to Agile delivery, and serve as the bridge between data operations, governance, and analytics teams.

**Expectations**
- Minimum 5 years of data engineering experience with strong Azure Databricks exposure.
- Proven track record delivering end-to-end cloud data platforms (Lakehouse, Delta Lake).
- Hands-on with Azure services (ADLS Gen2, Azure DevOps, Azure IAM).
- Ability to work within a 6-month rolling contract, supporting hybrid (remote + on-site) engagement.

**Key Responsibilities**
- Own the Azure Databricks implementation: evaluate capabilities, set success criteria, and recommend full-scale adoption.
- Architect and deploy the Lakehouse on Azure Databricks using Delta Lake, Unity Catalog, and MLflow.
- Design hybrid data integration strategies, integrating on-premises sources with Azure.
- Build real-time and batch pipelines using Structured Streaming, Spark SQL, and notebooks.
- Implement data lineage, governance, and quality controls with Unity Catalog or equivalent tools.
- Optimise cluster configurations, partitioning, indexing, and compaction for performance and cost.
- Mentor teammates, review code, and champion best practices.
- Participate in Agile ceremonies, deliver PoC prototypes, and iterate toward production.

**Required Skills**
- Azure Databricks administration (cluster, workspace, cost and performance tuning).
- Delta Lake, Unity Catalog, MLflow, and Lakehouse architecture design.
- Design of partitioning strategies, indexing, and compaction for large-scale workloads.
- Real-time streaming with Structured Streaming, Kafka, or Event Hubs.
- Spark SQL, Delta Lake ACID, schema enforcement, and time travel (illustrated in the sketch below).
- Git, CI/CD pipelines (Azure DevOps, GitHub Actions) for Databricks notebooks.
- Azure Data Lake Storage Gen2, Azure RBAC, Azure AD, and data security compliance.
- Strong SQL, data modelling, and workflow design.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
- Azure Databricks Certified Developer or Azure Data Engineer Associate (preferred).
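The Delta Lake behaviours named in the skills list (ACID writes, schema enforcement, time travel) can be pictured with a short notebook-style sketch. The schema and table names are hypothetical, and the session call is redundant inside Databricks, where `spark` already exists.

```python
# Illustrative Delta Lake behaviours: ACID appends, schema enforcement,
# and time travel. Schema and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.utils import AnalysisException

spark = SparkSession.builder.getOrCreate()  # predefined in Databricks notebooks

spark.sql("CREATE SCHEMA IF NOT EXISTS finance")
spark.sql(
    "CREATE TABLE IF NOT EXISTS finance.positions "
    "(book STRING, notional DOUBLE) USING DELTA"
)

# ACID append: readers never observe a partially written batch.
spark.createDataFrame([("rates", 1_000_000.0)], "book STRING, notional DOUBLE") \
    .write.format("delta").mode("append").saveAsTable("finance.positions")

# Schema enforcement: a mismatched frame is rejected rather than
# silently corrupting the table.
try:
    spark.createDataFrame([("rates", "oops")], "book STRING, notional STRING") \
        .write.format("delta").mode("append").saveAsTable("finance.positions")
except AnalysisException as err:
    print(f"rejected by schema enforcement: {err}")

# Time travel: inspect the commit log, then query an earlier version.
spark.sql("DESCRIBE HISTORY finance.positions").show(truncate=False)
v0 = spark.sql("SELECT * FROM finance.positions VERSION AS OF 0")
v0.show()
```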
London, United Kingdom
Hybrid
Senior
25-01-2026