- Company Name: Mentmore Recruitment
- Job Title: D365 Data Engineer (Contract)
- Job Description:
**Role Summary**
Build, enhance, and execute end‑to‑end data migration pipelines for Dynamics 365 Finance & Operations. Work with Azure Data Factory and Azure Databricks to extract, transform, and load data from legacy systems (e.g., Access Dimensions, Concept Evolution) into D365. Collaborate with data, program, and functional teams to ensure accuracy, auditability, and timely delivery across multiple concurrent migrations.
**Expectations**
- Deliver high‑quality, scalable ETL pipelines within agreed timelines.
- Maintain clear, auditable documentation for data mapping, transformations, and test plans.
- Communicate progress, blockers, and risks to the Data Lead, PMO, and business SMEs.
- Meet data quality thresholds and KPIs through iterative cleansing and validation cycles.
**Key Responsibilities**
1. **Data Migration Process & Tooling** – design, develop, and maintain ADF pipelines; integrate with Azure Databricks for downstream transformations.
2. **Pipeline Development** – write Python/SQL scripts in Databricks for data extraction, transformation, and loading.
3. **Process Review** – collaborate with Data Lead and D365 team to refine migration processes for efficiency and auditability.
4. **Reporting & Analytics** – support Power BI dashboards for data quality, migration status, and analytics.
5. **Data Discovery & Profiling** – analyze system documentation, conduct profiling, and quantify data volumes for planning.
6. **Data Extraction** – define and implement scope criteria, work with SMEs and integrators to capture master and transactional data.
7. **Quality & Cleansing** – identify issues, develop remediation strategies, and iterate until KPIs are met.
8. **Mapping & Transformation** – create source‑to‑target field mappings, validation rules, and transformation logic in code.
9. **DMF Configuration** – support creation and update of Data Management Framework entities and custom data entities.
10. **Dry‑Run & Cutover Management** – execute trial migrations, validate with stakeholders, document audit trail, and perform production cutover.
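The mapping and validation work described above (responsibilities 7–8) can be sketched in plain Python. This is an illustrative example only, not part of the role specification; all field names, the `FIELD_MAP` table, and the validation rule are hypothetical stand-ins for the source-to-target mappings a candidate would actually define:

```python
# Hypothetical source-to-target field mapping from a legacy system
# into a D365 F&O data entity, with one example validation rule.
FIELD_MAP = {
    "AccountCode": "MainAccountId",   # assumed legacy field -> D365 field
    "AccountDesc": "Name",
    "CurrencyCd":  "CurrencyCode",
}

def transform_record(legacy_record: dict) -> dict:
    """Rename mapped fields, trim whitespace, and validate the result."""
    target = {
        FIELD_MAP[key]: value.strip()
        for key, value in legacy_record.items()
        if key in FIELD_MAP  # unmapped fields are dropped from scope
    }
    # Example validation rule: currency codes must be 3-letter alpha codes.
    code = target.get("CurrencyCode", "")
    if len(code) != 3 or not code.isalpha():
        raise ValueError(f"Invalid currency code: {code!r}")
    return target

row = {"AccountCode": "4000", "AccountDesc": " Sales ", "CurrencyCd": "GBP"}
print(transform_record(row))
# {'MainAccountId': '4000', 'Name': 'Sales', 'CurrencyCode': 'GBP'}
```

In practice the same mapping logic would live in a Databricks notebook and be driven by a maintained source-to-target mapping document, so that the transformation code and the audit documentation stay in sync.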
**Required Skills**
- Azure Data Factory (ADF) – pipeline creation, orchestration, and monitoring.
- Azure Databricks – scalable coding in Python/SQL for ETL.
- Dynamics 365 Finance & Operations – understanding of the core data model and the Data Management Framework (DMF).
- Data profiling, quality analysis, and cleansing techniques.
- Power BI – building dashboards for data-quality and migration monitoring.
- SQL – schema design, data extraction, and transformation logic.
- Python – scripting for data manipulation and automation.
- ETL & migration best practices, including performance tuning for large datasets.
- Documentation – clear mapping, transformation logic, and test documentation.
- Collaboration and communication with technical and business stakeholders.
- Problem‑solving and analytical thinking.
**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or related field (or equivalent professional experience).
- Preferred: Microsoft Azure Data Engineer Associate (DP-203) or equivalent Azure certifications.
- Experience with Dynamics 365 F&O data migration is strongly preferred.