Fractal

fractal.ai

2 Jobs

6,068 Employees

About the Company

Fractal is one of the most prominent providers of Artificial Intelligence to Fortune 500® companies. Fractal's vision is to power every human decision in the enterprise, bringing AI, engineering, and design to the world's most admired companies.

Fractal's businesses include Crux Intelligence (AI-driven business intelligence), Eugenie.ai (AI for sustainability), Asper.ai (AI for revenue growth management), and Senseforth.ai (conversational AI for sales and customer service). Fractal also incubated Qure.ai, a leading player in healthcare AI for detecting tuberculosis and lung cancer.

Fractal currently has 4,000+ employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has been recognized as a 'Great Workplace' and one of 'India's Best Workplaces for Women' in the top 100 (large) category by the Great Place to Work® Institute; featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research Inc.; named a leader in the Analytics & AI Services Specialists PEAK Matrix 2022 by Everest Group; and recognized as an 'Honorable Vendor' in the 2022 Magic Quadrant™ for data & analytics by Gartner Inc. For more information, visit fractal.ai.

Listed Jobs

Company Name
Fractal
Job Title
Decision Scientist - Tableau Visualization Engineer
Job Description
**Job Title:** Decision Scientist – Tableau Visualization Engineer

**Role Summary:**
Design, build, and maintain high-impact Tableau dashboards that translate predictive modeling outputs into actionable insights for attrition initiatives. Collaborate with data science, engineering, and business stakeholders to deliver scalable, data-driven visual analytics integrated with Snowflake and the Next-Gen Platform.

**Expectations:**
- Deliver dashboards on schedule and support UAT and production deployment.
- Ensure optimal performance, data quality, and visual clarity.
- Communicate insights effectively to non-technical audiences.

**Key Responsibilities:**
- Lead end-to-end development of attrition risk exposure and model monitoring dashboards.
- Translate analytical outputs and model results into intuitive, interactive visualizations.
- Integrate Tableau dashboards with Snowflake data sources, APIs, and real-time data pipelines.
- Implement advanced analytics (parameters, LOD calculations, performance tuning).
- Create and enforce data validation and quality checks.
- Document specifications, lead requirement workshops, and manage stakeholder expectations.
- Collaborate with MLOps and platform teams to embed dashboards into OrgSF, Slack, CS, and other applications.

**Required Skills:**
- Expert in Tableau Desktop and Server: design, optimization, publishing.
- SQL (Snowflake preferred): data extraction, transformation, and blending.
- Advanced Tableau features: LOD expressions, parameters, data blending, data source filters.
- Experience with API-based data integration and real-time insights pipelines.
- Familiarity with Python or Airflow integration with Tableau (preferred).
- Strong stakeholder management, collaboration, and data storytelling.
- Detail-oriented and self-driven; thrives in fast-paced environments.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field.
- Tableau Desktop and Tableau Server certifications; visual storytelling certification preferred.
United States
Remote
18-11-2025
Company Name
Fractal
Job Title
Data Architect - Azure & Databricks
Job Description
**Job Title:** Data Architect – Azure & Databricks

**Role Summary:**
Design, build, and govern scalable lakehouse data platforms on Azure Databricks for enterprise clients, focusing on data modernization, advanced analytics, and AI/ML in the healthcare payer domain. Serve as a technical leader, delivering end-to-end data pipelines and ensuring high performance, security, compliance, and stakeholder alignment.

**Expectations:**
- Deliver robust, production-grade data architectures that support multi-layer models (Bronze, Silver, Gold).
- Lead cross-functional workshops to translate business requirements into technical solutions.
- Optimize Spark workloads for cost and performance, maintain observability, and enforce security and governance.
- Champion the adoption of AI copilot tools and agentic workflows to accelerate development.

**Key Responsibilities:**
- Architect and maintain Databricks Lakehouse platforms, including Delta Lake, Unity Catalog, and DLT pipelines.
- Define data models (star/snowflake, dimensional) and create data marts via Databricks SQL warehouse.
- Integrate data from ERP, POS, CRM, e-commerce, and third-party sources using Azure Data Lake Storage Gen2.
- Optimize Spark jobs (OPTIMIZE, VACUUM, ZORDER, Time Travel) and configure cluster sizing for performance and cost.
- Implement monitoring, alerting, and performance tuning using Databricks Observability and native cloud tools.
- Design secure, compliant data architectures with RBAC, encryption, and data lineage.
- Lead data governance initiatives (Data Fitness Index, quality scores, metadata cataloging).
- Mentor and collaborate with data engineers and data scientists to deliver end-to-end pipelines.
- Evaluate new Databricks features and generative AI tools, proposing pilots for platform enhancement.

**Required Skills:**
- 12–18 years of data engineering experience, including 5+ years on Azure Databricks/Apache Spark.
- Proficiency in PySpark, SQL, Delta Lake, DLT, Databricks Workflows, and MLflow.
- Strong background in lakehouse architecture, bronze/silver/gold layering, and ETL/ELT pipelines.
- Expertise in data modeling (star/snowflake, dimensional), Delta Lake optimization, and performance tuning.
- Knowledge of Azure Data Lake Storage Gen2, ingestion from structured/unstructured sources, and REST APIs.
- Experience with Unity Catalog, RBAC, encryption, and data governance practices.
- Familiarity with BI tools (Power BI, Tableau) and Databricks SQL warehouse.
- Ability to use AI code assistants (GitHub Copilot, Databricks Assistant) and advocate for agentic workflows.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certifications in Azure Data Engineering, Databricks Certified Professional, or similar are preferred.
Palo Alto, United States
On-site
Senior
23-12-2025