- Company Name: Fractal
- Job Title: Solution Engineer
- Job Description:
Job Title: Solution Engineer – Data & Analytics Platforms
Role Summary:
Act as the technical bridge between product intake and delivery teams for data and analytics initiatives on Databricks and/or Google Cloud Platform (GCP). Translate product requirements into pragmatic, end‑to‑end architectural solutions that are production‑ready, scalable, and aligned with platform standards, enabling rapid, high‑quality delivery.
Expectations:
- Deliver architecture that balances clarity with agility, providing enough detail to start build while refining iteratively.
- Ensure feasibility, data quality, security, and governance compliance before handoff.
- Maintain consistent design standards across multiple use cases and domains.
- Communicate design intent clearly to both technical and non‑technical stakeholders.
Key Responsibilities:
- Review Product Requirements Documents (PRDs) to assess functional needs, data sources, success criteria, and value drivers.
- Identify gaps, ambiguities, and dependencies; request clarification from product and data owners.
- Design end‑to‑end solutions covering data ingestion, transformations, analytics, modeling, and consumption patterns using Databricks or GCP services.
- Determine appropriate design depth: “just enough” architecture vs. full design, and decide when to build a proof of concept.
- Evaluate use cases against platform readiness, data availability, security, governance, and operational constraints.
- Provide estimates of effort, complexity, and technical risk to inform delivery sequencing.
- Serve as the technical handoff point to development teams, clarifying architectural intent and adjusting designs as implementation evolves.
- Collaborate with product, platform, data engineering, analytics, and data science teams to refine design standards, documentation expectations, and feedback loops.
Required Skills:
- Strong experience designing data engineering and analytics solutions on Databricks (lakehouse patterns) and/or GCP data services (BigQuery, Dataflow, Cloud Storage, AI Platform).
- Deep understanding of data pipelines, ETL/ELT processes, and reporting/analytical use cases.
- Proven ability to translate evolving business requirements into scalable, production‑grade technical designs.
- Familiarity with agile delivery practices and iterative design refinement.
- Excellent communication and influence skills, able to explain technical decisions to cross‑functional stakeholders.
- Ability to evaluate feasibility, data quality, security, governance, and operational constraints.
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related technical field (or equivalent practical experience).
- Relevant certifications preferred:
  - Databricks Certified Data Engineer Associate or Databricks Certified Data Engineer Professional.
  - Google Cloud Professional Data Engineer or Google Cloud Professional Cloud Architect.