- Company Name
- CommuniTech Recruitment Group
- Job Title
- Databricks Technical Lead. Fintech. Up to £1,000/day inside IR35. Greenfield project. 6-month rolling contract. Hybrid, 3 days a week in Central London office.
- Job Description
**Job Title**
Databricks Technical Lead
**Role Summary**
Lead the design, build, and delivery of a greenfield Lakehouse platform on Azure Databricks for a fintech client. Provide technical ownership of the Azure Databricks implementation, from proof of concept to production, ensuring scalability, cost efficiency, and compliance. Mentor the data engineering team, contribute to Agile delivery, and serve as the bridge between data operations, governance, and analytics teams.
**Expectations**
- Minimum 5 years of data engineering experience with strong Azure Databricks exposure.
- Proven track record delivering end‑to‑end cloud data platforms (Lakehouse, Delta Lake).
- Hands‑on with Azure services (ADLS Gen2, Azure DevOps, Azure RBAC, and Azure AD).
- Ability to work within a 6‑month rolling contract, supporting hybrid (remote + on‑site) engagement.
**Key Responsibilities**
- Own the Azure Databricks implementation: evaluate capabilities, set success criteria, and recommend a path to full‑scale adoption.
- Architect and deploy Lakehouse on Azure Databricks using Delta Lake, Unity Catalog, and MLflow.
- Design hybrid data integration strategies, integrating on‑premises sources with Azure.
- Build real‑time and batch pipelines using Structured Streaming, Spark SQL, and notebooks.
- Implement data lineage, governance, and quality controls with Unity Catalog or equivalent tools.
- Optimize cluster configurations, partitioning, indexing, and compaction for performance and cost.
- Mentor teammates, review code, and champion best practices.
- Participate in Agile ceremonies, deliver PoC prototypes, and iterate toward production.
**Required Skills**
- Azure Databricks administration (clusters, workspaces, cost and performance tuning).
- Delta Lake, Unity Catalog, MLflow, and Lakehouse architecture design.
- Design of partitioning strategies, indexing, and compaction for large‑scale workloads.
- Real‑time streaming with Structured Streaming, Kafka or Event Hubs.
- Spark SQL, Delta Lake ACID transactions, schema enforcement, and time travel.
- Git, CI/CD pipelines (Azure DevOps, GitHub Actions) for Databricks notebooks.
- Azure Data Lake Storage Gen2, Azure RBAC, Azure AD, and data security compliance.
- Strong SQL, data modeling, and workflow design.
**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Engineering, Data Science, or related field.
- Databricks Certified Data Engineer Associate or Microsoft Certified: Azure Data Engineer Associate (preferred).