**Company Name**
AXA Partners
**Job Title**
Data Engineer
**Job Description**
**Role Summary**
Design, build, and maintain scalable data pipelines, data lakes, and analytical platforms in a cloud‑based big data environment. Enable reliable data access for analysts, scientists, and business users while enforcing governance, quality, and operational standards. Mentor peer data practitioners and contribute to the organization’s data strategy and roadmap.
**Expectations**
- Deliver high‑quality, production‑ready data solutions on schedule.
- Act as a single point of accountability for operational excellence of data assets.
- Bridge business requirements with technical implementation.
- Promote best practices, automation, and continuous improvement in data engineering processes.
**Key Responsibilities**
- Build, optimize, and automate ingestion, transformation, and exposure workflows for data lakes, data marts, and core target components.
- Manage and monitor batch and streaming pipelines using Azure services (ADLS, Databricks, ADF, Azure DevOps).
- Ensure data quality, reliability, and high availability of production data flows; implement end‑to‑end monitoring.
- Collaborate with data providers, IT owners, analysts, scientists, and data teams to define sourcing strategies and data lifecycle management.
- Participate in the definition of data roadmaps, standards, and operational guidelines under the central framework.
- Develop applications and solutions that meet central guidelines; perform code reviews, UAT support, and training for team members.
- Maintain documentation, schematics, and knowledge base entries for data assets.
- Provide technical support, troubleshooting, and issue resolution for data platform users.
- Contribute to Agile sprint planning, backlog grooming, and execution.
**Required Skills**
- Strong command of SQL and Python (mandatory).
- Proficient with Spark and large‑scale data processing.
- Experience with cloud Big Data stacks (Azure Data Lake Storage, Databricks, Azure Data Factory).
- Solid background in relational databases, data warehouses, and OLAP concepts.
- Familiarity with CI/CD, Git, and DevOps pipelines.
- Knowledge of BI tools (Spotfire, Power BI); BI development experience is a plus.
- Ability to automate and monitor data pipelines, implement governance & data quality checks.
- Excellent problem‑solving, prioritization, and stakeholder communication skills.
- Fluency in English (mandatory).
**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or related field.
- Equivalent experience in big data environments may substitute for a formal degree.
- Certifications (e.g., Microsoft Azure Data Engineer Associate, Databricks Certified Professional) are desirable but not mandatory.
---