- Company Name: dataroots
- Job Title: Data Platform Engineer
- Job Description:
Job Title: Data Platform Engineer
Role Summary: Design, build, and maintain scalable, resilient data platforms that enable data engineers, analysts, and scientists to transform raw data into actionable insights efficiently and securely. Own infrastructure, tooling, and standards across cloud, orchestration, governance, and observability to balance performance, cost, and reliability.
Expectations: Own the end‑to‑end data platform lifecycle, from cloud provisioning to pipeline execution and monitoring. Collaborate with cross‑functional teams to translate data requirements into platform capabilities. Ensure high availability, data security, compliance, and observability while continuously optimizing for performance and cost. Drive DataOps and MLOps practices to deliver production‑ready solutions.
Key Responsibilities:
- Design and implement scalable cloud‑based data platforms (AWS, Azure, GCP, or hybrid).
- Build and maintain ELT pipelines using Airflow, Prefect, Dagster, Spark, dbt, or equivalent tools.
- Provision and manage infrastructure with Terraform, Docker, Kubernetes, and related tooling.
- Deploy, configure, and optimize data services such as Databricks, Snowflake, Azure Data Services, or AWS data services.
- Implement data governance, security, and compliance controls across the platform.
- Establish observability, monitoring, and alerting for data pipelines and infrastructure.
- Create and maintain CI/CD pipelines for data and infrastructure code.
- Troubleshoot platform issues, perform root‑cause analysis, and implement preventive measures.
- Mentor and collaborate with data engineers, scientists, and analysts to promote best practices.
Required Skills:
- Strong data engineering and cloud architecture expertise.
- Proficiency in ELT pipeline tools: Airflow, Prefect, Dagster, Spark, dbt, etc.
- Software engineering skills: Git, Python, SQL.
- Cloud engineering: Terraform, Docker, Kubernetes, CI/CD.
- Experience with at least one major data platform (Databricks, Snowflake, Microsoft Fabric, Azure Data Services, AWS data services).
- Knowledge of data governance, security, and compliance frameworks.
- Familiarity with observability tools (monitoring, logging, alerting).
- Excellent problem‑solving, communication, and collaboration skills.
- Agile mindset and ability to adapt to evolving data needs.
Required Education & Certifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Software Engineering, or related field.
- Cloud platform certifications (AWS Certified Data Analytics, Azure Data Engineer Associate, GCP Professional Data Engineer) preferred.
- Databricks or Snowflake certifications are a plus.