- Company Name
- Largeton Group
- Job Title
- Data Engineer
- Job Description
Role Summary
Lead and manage a data engineering team focused on complex health data projects. Design and deliver end‑to‑end data integration solutions, build ETL/ELT pipelines, and develop scalable data models. Serve as the primary subject‑matter expert (SME) on data architecture and business context within the healthcare domain.
Expectations
- Senior‑level experience in data engineering with proven leadership in research and implementation projects.
- Deep knowledge of healthcare data structures, regulations, and industry best practices.
- Ability to translate technical requirements into scalable code and solutions independently.
- Strong communication and stakeholder management skills.
- Remote work capability aligned with EST or CST time zones.
Key Responsibilities
- Lead a multidisciplinary team in developing complex data models, maps, and workflows.
- Prioritize and execute tasks, acting as the primary contact for the team.
- Design, develop, and deliver robust end‑to‑end data integration solutions following best practices and test‑driven development.
- Build and maintain ETL/ELT pipelines between various source and target systems.
- Perform unit testing to ensure data accuracy and system reliability.
- Develop and optimize data models using slowly changing dimensions, star schemas, data vault, and other industry‑standard techniques.
- Apply data warehousing concepts (Kimball, Inmon, DDS, ODS) and manage data flow to BI tools.
- Write advanced SQL (DDL/DML, normalization, ACID transactions, performance/security) and implement performance tuning.
- Implement solutions using JavaScript, NoSQL/non‑relational databases, Snowflake, and Azure Cloud technologies.
- Identify and solve problems proactively, acting as a self‑starter.
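To illustrate the Type 2 slowly changing dimension pattern referenced in the responsibilities above, here is a minimal Python sketch. It is a simplified in‑memory model, not the role's actual implementation; the names (DimRow, scd2_upsert, plan_code) are hypothetical examples, and a production version would typically be a MERGE statement in Snowflake or a warehouse‑side equivalent.

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class DimRow:
    patient_id: str           # natural key (hypothetical)
    plan_code: str            # tracked attribute (hypothetical)
    valid_from: date
    valid_to: Optional[date]  # None = current version
    is_current: bool = True

def scd2_upsert(dim: list, patient_id: str, plan_code: str, as_of: date) -> list:
    """Type 2 SCD update: expire the current row if the tracked
    attribute changed, then append a new current row. History is
    preserved rather than overwritten."""
    out, changed, seen = [], False, False
    for row in dim:
        if row.patient_id == patient_id and row.is_current:
            seen = True
            if row.plan_code != plan_code:
                # close out the old version instead of updating in place
                out.append(replace(row, valid_to=as_of, is_current=False))
                changed = True
            else:
                out.append(row)  # no change; keep the current row
        else:
            out.append(row)
    if changed or not seen:
        out.append(DimRow(patient_id, plan_code, valid_from=as_of, valid_to=None))
    return out
```

The same expire‑then‑insert logic carries over directly to a SQL MERGE against a dimension table keyed on the natural key plus an is_current flag.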
Required Skills
- Advanced SQL (DDL/DML, normalization, ACID, performance tuning).
- JavaScript (senior level) and experience with NoSQL/non‑relational databases.
- Snowflake and Azure Cloud platform expertise.
- ETL/ELT pipeline development and data integration.
- Data modeling (slowly changing dimensions, star schema, data vault).
- Data warehousing methodologies (Kimball, Inmon, DDS, ODS).
- Unit testing and test‑driven development.
- Strong communication, stakeholder management, and leadership skills.
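As a small illustration of the unit testing and test‑driven development skills listed above, here is a hedged Python sketch of a pipeline transformation with its test written alongside it. The function and field names (normalize_claim, claim_id, amount_cents) are hypothetical, chosen only to show the pattern of asserting on a transform's output before it is wired into a pipeline.

```python
def normalize_claim(raw: dict) -> dict:
    """Normalize one raw claims record before loading:
    trim identifiers, convert the amount to integer cents,
    and standardize the status code to upper case."""
    return {
        "claim_id": raw["claim_id"].strip(),
        "amount_cents": round(float(raw["amount"]) * 100),
        "status": raw["status"].strip().upper(),
    }

def test_normalize_claim():
    # TDD style: the expected output is pinned down before
    # the transform is trusted in a pipeline.
    row = {"claim_id": " C-100 ", "amount": "12.5", "status": "paid "}
    assert normalize_claim(row) == {
        "claim_id": "C-100",
        "amount_cents": 1250,
        "status": "PAID",
    }
```

In practice such tests would run under a framework like pytest as part of the pipeline's CI, so each transform is verified in isolation before integration.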
Required Education & Certifications
- Bachelor’s degree or higher in Computer Science, Data Engineering, Information Systems, or related field.
- Relevant certifications in Snowflake, Azure (e.g., Azure Data Engineer Associate), or data engineering best practices are preferred.