- Company Name
- Authentic
- Job Title
- Software Engineer - Data
- Job Description
Role Summary: Design, develop, and maintain end‑to‑end data pipelines and infrastructure that feed underwriting, claims, and operational reporting functions. Own data models from ingestion to delivery, ensuring quality, freshness, and accessibility for internal teams and external partners. Optimize warehouse architecture for analytical and operational workloads, and collaborate cross‑functionally to translate business needs into robust data solutions.
Expectations:
• 3–5 years of hands‑on experience building production data pipelines.
• Strong command of the modern data stack (Snowflake, dbt, orchestration tools such as Airflow).
• Proficiency in Python for data processing.
• Solid grasp of data modeling (dimensional modeling, SCDs).
• Experience with AWS data services and infrastructure‑as‑code.
• Ownership mindset: clear requirements definition and proactive issue resolution.
• Proven ability to build reliable systems, balancing velocity with caution.
Key Responsibilities:
• Engineer and maintain scalable data pipelines powering underwriting models, claims processing, and reporting.
• Own data models throughout ingestion, transformation, and delivery cycles, maintaining data quality and freshness.
• Identify and resolve workflow bottlenecks to improve reliability, reduce latency, and support growth.
• Design and maintain a data warehouse architecture optimized for analytical and operational use cases.
• Work with engineering, product, and operations to translate data needs into well‑designed pipelines and models.
• Implement monitoring, alerting, and quality checks to preempt downstream failures.
• Contribute to data governance, including documentation, lineage, and access controls for regulatory compliance.
Required Skills:
• 3–5 years in production data engineering.
• Expertise with Snowflake, dbt, and orchestration (Airflow or equivalent).
• Python programming for ETL/ELT.
• Strong data modeling (dimensional modeling, slowly changing dimensions).
• Familiarity with AWS data services (Redshift, S3, Glue, etc.) and infrastructure‑as‑code (e.g., Terraform, CloudFormation).
• Knowledge of data quality, monitoring, and governance practices.
• Excellent communication, cross‑functional collaboration, and problem‑solving abilities.
Required Education & Certifications:
• Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field (or equivalent experience).
• Certifications in cloud data services (AWS Certified Data Analytics – Specialty, SnowPro Core, or similar) are a plus.