- Company Name: Bayforce
- Job Title: Data Quality Analyst
- Job Description:
Job title: Senior Data Quality Analyst
Role Summary: Lead architect and engineer for scalable, automated data quality pipelines across cloud and on‑prem environments, integrating observability, monitoring, and remediation workflows via APIs and event‑driven architectures.
Expectations: Deliver end‑to‑end data quality automation, maintain governance and auditability standards, mentor peers, and advance CI/CD for DQ assets.
Key Responsibilities:
- Engineer idempotent, observable data quality pipelines in Python/SQL; a minimal check of this shape is sketched after this list.
- Design and maintain reusable DQ rule libraries, validation frameworks, and APIs (REST/GraphQL, SDKs).
- Embed automated checks into ETL/ELT jobs; implement pre-/post-load validations and SLA/SLO monitoring.
- Build event‑driven alerts, webhooks, and routing to ticketing systems (Jira, ServiceNow).
- Develop Power BI dashboards and Power Apps workflows for KPI visualization and triage automation.
- Conduct root‑cause analysis using lineage, logs, metrics; automate remediation (rollback, quarantine, replay).
- Contribute to GitLab CI/CD pipelines: automated tests, code scanning, secrets management, environment promotion.
- Author runbooks, design docs, and operational playbooks; mentor analysts on scripting and API practices.
- Ensure compliance with risk, regulatory, and audit requirements.
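The tooling above is named only at the vendor level, so the following is a minimal, standard-library-only sketch of what one such check might look like: an idempotent post-load row-count validation keyed on (check_id, run_date), so a pipeline retry updates the prior result instead of duplicating it, with a webhook POST routed toward a ticketing system on failure. The `dq_results` table, `WEBHOOK_URL`, and the threshold are illustrative assumptions, not this employer's actual conventions.

```python
import json
import sqlite3
import urllib.request
from datetime import date

# Hypothetical endpoint standing in for a Jira/ServiceNow webhook route.
WEBHOOK_URL = "https://example.com/hooks/dq-alerts"

def run_check(conn: sqlite3.Connection, check_id: str, sql: str, min_rows: int) -> bool:
    """Run one DQ check idempotently: the (check_id, run_date) primary key
    means re-running the pipeline on the same day updates rather than
    duplicates the recorded result."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS dq_results (
               check_id TEXT, run_date TEXT, observed INTEGER, passed INTEGER,
               PRIMARY KEY (check_id, run_date))"""
    )
    observed = conn.execute(sql).fetchone()[0]  # expects a scalar COUNT(*) query
    passed = observed >= min_rows
    conn.execute(
        "INSERT OR REPLACE INTO dq_results VALUES (?, ?, ?, ?)",
        (check_id, date.today().isoformat(), observed, int(passed)),
    )
    conn.commit()
    if not passed:
        # Route the failure to ticketing via a webhook POST.
        alert = {"check": check_id, "observed": observed, "expected_min": min_rows}
        req = urllib.request.Request(
            WEBHOOK_URL,
            data=json.dumps(alert).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req, timeout=10)
    return passed

# Example: post-load validation that today's load produced at least 1000 rows.
# ok = run_check(conn, "orders_rowcount", "SELECT COUNT(*) FROM orders", 1000)
```

In practice the check definitions would come from the reusable rule library, and the result write would also emit metrics to the observability platform rather than only a local results table.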
Required Skills:
- Advanced Python and SQL (automation, reusable libraries, scheduled jobs).
- API engineering (REST/GraphQL, OAuth2, pagination, rate limits, retries, webhooks); see the pagination/retry sketch after this list.
- Power BI / Power Apps for operational reporting and workflow automation.
- CI/CD with GitLab (or equivalent).
- Data profiling, rule design, and DQ metrics measurement (completeness, validity, accuracy, timeliness, uniqueness, consistency); see the metrics sketch after this list.
- Experience with DQ/observability platforms (Informatica Cloud DQ, Monte Carlo, Anomalo, Collibra OwlDQ) and their SDKs/APIs.
- Knowledge of ETL/ELT validation, data contracts, schema enforcement.
- Cloud platforms (Azure, Snowflake): scripting, secrets management, and job orchestration.
- Messaging/queueing for event‑driven triage.
- Strong written and verbal communication for cross‑functional collaboration.
- Ability to manage multiple concurrent automation initiatives.
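No specific vendor API is identified above, so as a hedged illustration of the pagination/retry bullet, here is a standard-library sketch of cursor-based pagination with bounded retries and exponential backoff on rate-limit and server errors. `API_URL`, the `cursor` query parameter, the `items`/`next_cursor` response fields, and the bearer token are all assumed placeholders, not a real platform's contract.

```python
import json
import time
import urllib.error
import urllib.parse
import urllib.request
from typing import Iterator

API_URL = "https://example.com/api/v1/dq-rules"  # hypothetical endpoint
TOKEN = "..."  # OAuth2 bearer token, obtained out of band

def fetch_pages(url: str, token: str, max_retries: int = 3) -> Iterator[dict]:
    """Yield records across pages, retrying transient failures with backoff."""
    cursor = None
    while True:
        page_url = url + (f"?cursor={urllib.parse.quote(cursor)}" if cursor else "")
        req = urllib.request.Request(page_url, headers={"Authorization": f"Bearer {token}"})
        for attempt in range(max_retries):
            try:
                with urllib.request.urlopen(req, timeout=30) as resp:
                    body = json.load(resp)
                break
            except urllib.error.HTTPError as exc:
                # Retry rate limits and server errors; surface everything else.
                if exc.code not in (429, 500, 502, 503) or attempt == max_retries - 1:
                    raise
                time.sleep(2 ** attempt)  # exponential backoff: 1s, 2s, 4s
        yield from body.get("items", [])
        cursor = body.get("next_cursor")
        if not cursor:
            return
```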
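The six dimensions listed reduce to simple ratios once a column and a validity rule are fixed. As an assumed illustration (not any particular platform's API), the sketch below computes three of them, completeness, uniqueness, and validity, over a column of values:

```python
from typing import Callable, Sequence

def completeness(values: Sequence) -> float:
    """Share of values that are non-null."""
    return sum(v is not None for v in values) / len(values)

def uniqueness(values: Sequence) -> float:
    """Share of distinct values among non-null values."""
    non_null = [v for v in values if v is not None]
    return len(set(non_null)) / len(non_null) if non_null else 1.0

def validity(values: Sequence, is_valid: Callable[[object], bool]) -> float:
    """Share of non-null values passing a rule, e.g. a format check."""
    non_null = [v for v in values if v is not None]
    return sum(is_valid(v) for v in non_null) / len(non_null) if non_null else 1.0

# Example: a hypothetical customer_id column with one null and one duplicate.
ids = ["C001", "C002", None, "C002", "X-9"]
print(completeness(ids))                                # 0.8
print(uniqueness(ids))                                  # 0.75
print(validity(ids, lambda v: str(v).startswith("C")))  # 0.75
```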
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field, or 9 years of combined education and work experience with a minimum of 5 years in related roles.
- Certifications in data engineering, cloud (Azure, Snowflake), or data quality platforms are preferred.
Wilmington, United States
Hybrid
Mid-level
11-02-2026