Fractal

fractal.ai

2 Jobs

6,068 Employees

About the Company

Fractal is one of the most prominent providers of Artificial Intelligence to Fortune 500® companies. Fractal's vision is to power every human decision in the enterprise, bringing AI, engineering, and design to the world's most admired companies.

Fractal's businesses include Crux Intelligence (AI-driven business intelligence), Eugenie.ai (AI for sustainability), Asper.ai (AI for revenue growth management), and Senseforth.ai (conversational AI for sales and customer service). Fractal incubated Qure.ai, a leading player in healthcare AI for detecting tuberculosis and lung cancer.

Fractal currently has 4,000+ employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has been recognized as a 'Great Workplace' and one of 'India's Best Workplaces for Women' in the top 100 (large) category by the Great Place to Work® Institute; featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research Inc.; named a leader in the Analytics & AI Services Specialists Peak Matrix 2022 by Everest Group; and recognized as an 'Honorable Vendor' in the 2022 Magic Quadrant™ for data & analytics by Gartner Inc. For more information, visit fractal.ai.

Listed Jobs

Company Name
Fractal
Job Title
Solution Engineer
Job Description
Job Title: Solution Engineer – Data & Analytics Platforms

Role Summary:
Act as the technical bridge between product intake and delivery teams for data and analytics initiatives on Databricks and/or Google Cloud Platform (GCP). Translate product requirements into pragmatic, end-to-end architectural solutions that are production-ready, scalable, and aligned with platform standards, enabling rapid, high-quality delivery.

Expectations:
- Deliver architecture that balances clarity with agility, providing enough detail to start the build while refining iteratively.
- Ensure feasibility, data quality, security, and governance compliance before handoff.
- Maintain consistent design standards across multiple use cases and domains.
- Communicate design intent clearly to both technical and non-technical stakeholders.

Key Responsibilities:
- Review Product Requirements Documents (PRDs) to assess functional needs, data sources, success criteria, and value drivers.
- Identify gaps, ambiguities, and dependencies; request clarification from product and data owners.
- Design end-to-end solutions covering data ingestion, transformations, analytics, modeling, and consumption patterns using Databricks or GCP services.
- Determine the appropriate design depth: "just enough" architecture vs. full design, and decide when to build a proof of concept.
- Evaluate use cases against platform readiness, data availability, security, governance, and operational constraints.
- Provide estimates of effort, complexity, and technical risk to inform delivery sequencing.
- Serve as the technical handoff point to development teams, clarifying architectural intent and adjusting designs as implementation evolves.
- Collaborate with product, platform, data engineering, analytics, and data science teams to refine design standards, documentation expectations, and feedback loops.
Required Skills:
- Strong experience designing data engineering and analytics solutions on Databricks (lakehouse patterns) and/or GCP data services (BigQuery, Dataflow, Cloud Storage, AI Platform).
- Deep understanding of data pipelines, ETL/ELT processes, and reporting/analytical use cases.
- Proven ability to translate evolving business requirements into scalable, production-grade technical designs.
- Familiarity with agile delivery practices and iterative design refinement.
- Excellent communication and influence skills; able to explain technical decisions to cross-functional stakeholders.
- Ability to evaluate feasibility, data quality, security, governance, and operational constraints.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related technical field (or equivalent practical experience).
- Relevant certifications preferred:
  - Databricks Certified Professional Data Engineer or Databricks Certified Data Engineer.
  - Google Professional Data Engineer or Google Cloud Certified – Professional Cloud Architect.
New York, United States
On site
23-02-2026
Company Name
Fractal
Job Title
Cloud Platform Engineer
Job Description
**Job Title**
Cloud Platform Engineer

**Role Summary**
Lead the design, deployment, and operation of scalable, secure cloud infrastructure on AWS, leveraging IaC, CI/CD, serverless, and container technologies to support high-volume data platforms and AI services.

**Expectations**
- Deliver production-grade AWS solutions that meet performance, reliability, and security requirements.
- Mentor and guide a small team of engineers and stakeholders.
- Communicate effectively with technical and non-technical customers.
- Maintain comprehensive documentation and provide regular support sessions.

**Key Responsibilities**
- Design, provision, and maintain AWS cloud resources using Terraform, Argo CD, and CloudFormation.
- Build, test, and deploy serverless applications with Lambda, ECS, and ECR.
- Develop and manage CI/CD pipelines (Jenkins, CodePipeline, CodeCommit).
- Implement security controls: VPCs, IAM roles, security groups, NACLs.
- Operate distributed data platforms (Airflow, Snowflake, dbt, SageMaker).
- Provide observability and alerting via CloudWatch, CloudTrail, and Slack integration.
- Resolve incidents and host bi-weekly open office hours for customer support.
- Upgrade Airflow to the latest versions and enhance orchestration utilities (dbt, SageMaker, Snowflake, etc.).
- Document processes in Confluence and manage API ingestion utilities.

**Required Skills**
- Deep expertise in AWS architecture, administration, and security.
- Proficiency in Terraform, Argo CD, Docker, ECS, Kubernetes, and CI/CD tooling.
- Strong experience with Python (production code) and SQL.
- Knowledge of serverless development (Lambda, API Gateway).
- Familiarity with CloudWatch, CloudTrail, CloudWatch Insights, and observability tooling.
- Git proficiency and a DevOps mindset with an emphasis on DevSecOps practices.
- Excellent problem-solving and communication abilities.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum 7 years of experience in platform or cloud engineering roles.
- AWS Certified Solutions Architect – Associate/Professional (preferred).
- Additional certifications such as HashiCorp Certified: Terraform Associate or Certified Kubernetes Administrator (CKA) are advantageous.
California, United States
On site
Senior
12-03-2026