**Company Name**
TOHTEM
**Job Title**
Data Analyst / SQL – Payments & Data Factory
**Role Summary**
Provide advanced analysis, data quality assurance, and automation of payment data pipelines within a Payment Data Factory squad. Deliver reliable reporting, Power BI dashboards, and operational data feeds while collaborating with business stakeholders, product owners, and technical teams.
**Expectations**
- Own end‑to‑end data lifecycle: ingestion, transformation, validation, and distribution.
- Translate business needs into functional and technical specifications (user stories, test cases).
- Maintain data integrity, security, and performance across cloud and on‑prem environments.
**Key Responsibilities**
- Collect, cleanse, and transform payment and customer data; enforce quality and governance policies.
- Develop and document SQL scripts, data models, and ETL processes for batch and real‑time pipelines.
- Design, build, and maintain Power BI dashboards; provide ad‑hoc reporting and insights to business users.
- Conduct data profiling, statistical analysis, and root‑cause investigations to support decision making.
- Collaborate in agile ceremonies: sprint planning, backlog grooming, and review meetings with product owners, data engineers, data visualization specialists, and business analysts.
- Create and execute test plans (unit, integration, user acceptance) for data transformations and reporting artifacts.
- Mentor and train business stakeholders on data usage, visualizations, and analytical techniques.
- Contribute to data architecture discussions, including schema design, partitioning, and indexing strategies.
- Automate and standardize data pipelines to improve scalability and reliability.
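To illustrate the kind of day-to-day work these responsibilities describe, the sketch below runs a simple data quality check and basic descriptive statistics over a payments table. It is purely illustrative: the table schema, column names, and quality rules are hypothetical, and an in-memory SQLite database stands in for the real payment data platform.

```python
import sqlite3
import statistics

# Hypothetical payments table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
)
conn.executemany(
    "INSERT INTO payments (amount, currency) VALUES (?, ?)",
    [(120.0, "EUR"), (75.5, "EUR"), (None, "EUR"), (300.0, "usd")],
)

# Quality rule 1: count rows with missing amounts.
missing = conn.execute(
    "SELECT COUNT(*) FROM payments WHERE amount IS NULL"
).fetchone()[0]

# Quality rule 2: normalize currency codes to upper case.
conn.execute("UPDATE payments SET currency = UPPER(currency)")

# Basic descriptive statistics on the valid amounts.
amounts = [
    row[0]
    for row in conn.execute("SELECT amount FROM payments WHERE amount IS NOT NULL")
]
print(missing, statistics.mean(amounts), statistics.median(amounts))
```

In practice, checks like these would be parameterized and scheduled as part of the automated pipeline rather than run ad hoc.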
**Required Skills**
- Strong proficiency in **SQL** (T‑SQL, PL/SQL, Spark SQL).
- Experience with **Power BI**: data modeling, DAX, and dashboard publishing.
- Familiarity with **cloud data platforms** (Azure Data Factory, Azure Synapse, or GCP Dataflow/BigQuery).
- Knowledge of **big data** tooling (Hadoop, Spark, Kafka) and associated data querying tools.
- Solid understanding of **data modeling** and **data governance** principles.
- Basic statistical analysis (mean, median, standard deviation, trend analysis).
- Agile/Scrum methodology: writing user stories, acceptance criteria, and participating in sprints.
- Excellent communication, stakeholder management, and analytical problem‑solving skills.
**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Data Science, Information Systems, or related field.
- Certification in SQL (e.g., Microsoft SQL Server, Oracle, or Snowflake) and/or Power BI (e.g., Microsoft Certified: Data Analyst Associate) is preferred.