- Company Name: Optimal
- Job Title: Principal Data SDET | Data Migration | SQL | Python | Stakeholder Management
- Job Description:
Job Title: Principal Data SDET – Data Migration & Framework Architect
Role Summary: Lead the design, implementation, and maintenance of enterprise‑grade data validation and automation frameworks across data, backend, and UI layers. Drive end‑to‑end testing of large‑scale data migrations, ETL pipelines, and cloud‑based services, while collaborating with development, DevOps, and business stakeholders to embed quality at every stage of delivery.
Expectations:
- Minimum of 4 years' hands‑on testing experience in a financial services or investment management environment.
- Advanced expertise in data migration testing, SQL, and Python.
- Proficiency with Databricks, Snowflake, or equivalent cloud data platforms.
- Proven ability to architect and deliver robust automation frameworks from scratch.
- Strong stakeholder engagement skills, with the ability to translate business requirements into test strategies.
- Eligibility to work in the UK (British citizenship or Indefinite Leave to Remain).
Key Responsibilities:
- Own the full lifecycle of data validation and migration testing for large‑scale data pipelines and ETL processes.
- Build and maintain modular automation frameworks (SpecFlow, Playwright, Cypress, Selenium) covering UI, API, and end‑to‑end scenarios.
- Define and enforce performance benchmarks using tools such as JMeter, k6, and Gatling across APIs and distributed systems.
- Integrate automation suites into CI/CD pipelines (AWS, Azure DevOps, GCP) and enforce quality gates through to production environments.
- Collaborate with developers, architects, and DevOps to shape architectural decisions that enhance reliability, maintainability, and performance.
- Mentor and lead a small team while remaining an active, 80%+ hands‑on engineer.
- Communicate complex technical concepts to non‑technical stakeholders and translate business needs into testable outcomes.
Required Skills:
- Advanced SQL, Python, and scripting for data validation.
- Hands‑on experience with Databricks, Snowflake, or equivalent.
- Framework architecture and design pattern knowledge for scalable test automation.
- Proficiency with UI and BDD automation tooling (Playwright, Selenium, Cypress, SpecFlow).
- Performance testing tools (JMeter, k6, Gatling).
- Cloud platform experience (AWS, Azure, GCP) and containerization (Docker, Kubernetes).
- CI/CD pipeline management and integration.
- Excellent written and verbal communication.
- Strong analytical, problem‑solving, and collaboration skills.
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a related field (or equivalent practical experience).
- Certifications such as AWS Certified Developer, Azure DevOps Engineer, or Google Cloud Professional Data Engineer are advantageous but not mandatory.