ideaHelix

www.ideahelix.com

9 Jobs

167 Employees

About the Company

ideaHelix is a Silicon Valley-based Salesforce consulting partner and tools developer delivering proven best practices refined through hundreds of successful projects. Specializing in Revenue Cloud, Salesforce Industries, and emerging Agentforce and AI capabilities, we enable industry-focused customers to optimize, evolve, and future-proof their Salesforce deployments.

ideaHelix customers rely on our sophisticated tools, deep Salesforce alignment, and certified global team to identify and eliminate inefficiencies, simplify and refactor business workflows, and streamline and automate managed-package-to-core migrations.

Backed by proprietary tools and validated by over 400 certifications, our team of domain experts helps customers maximize the value of their Salesforce investments.

Listed Jobs

Company Name
ideaHelix
Job Title
Salesforce Data Cloud Consultant
Job Description
**Job Title:** Salesforce Data Cloud Consultant

**Role Summary**

Senior consultant responsible for designing, developing, and optimizing Salesforce Data Cloud solutions within a hybrid work environment. Must lead implementation activities, integrate with Sales Cloud and Marketing Cloud, and provide architectural guidance to ensure scalable, high-performing data integrations.

**Expectations**

- Adopt a hybrid work mode, with onsite presence for at least 2 days per week.
- Deliver hands-on development and integration of Data Cloud features on a long-term contract basis, commencing in December or January.
- Provide architectural oversight and best-practice guidance for complex data scenarios across the Salesforce ecosystem.

**Key Responsibilities**

- Design, build, and configure Data Cloud solutions in alignment with business requirements.
- Extend Data Cloud capabilities through custom development, APIs, and connectors to Sales Cloud and Marketing Cloud.
- Conduct data modeling, mapping, and transformation to support analytics and integration needs.
- Optimize performance, scalability, and security of Data Cloud implementations.
- Mentor and collaborate with cross-functional teams, ensuring knowledge transfer and adherence to architectural standards.
- Evaluate and recommend enhancements for data strategy and governance.
- Manage project scope, timelines, and deliverables in a contract environment.

**Required Skills**

- Expertise in Salesforce Data Cloud architecture, configuration, and development.
- Demonstrated experience with Sales Cloud and Marketing Cloud integration.
- Strong proficiency in data modeling, transformation, and ETL concepts.
- Hands-on experience with Apex, Lightning Web Components, and Salesforce APIs.
- Ability to work independently and lead technical decisions.
- Excellent communication and stakeholder collaboration skills.

**Required Education & Certifications**

- Salesforce Data Cloud Certification (CSF-DataCloud).
- Salesforce certifications in Sales Cloud (e.g., Salesforce Certified Advanced Administrator or Sales Cloud Consultant) and Marketing Cloud (e.g., Marketing Cloud Administrator).
- Bachelor's degree in Computer Science, Information Technology, or a related field preferred.
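To give a concrete flavor of the integration work this posting describes, here is a minimal sketch of streaming records into Data Cloud from Python over its Ingestion API. The endpoint path, source name (`web_orders`), object name (`order_events`), and tenant URL are illustrative assumptions, not details from the posting; a real implementation would follow the org's configured ingestion connector and obtain the token through an OAuth flow.

```python
# Minimal sketch: pushing a batch of records into Salesforce Data Cloud via
# an ingestion endpoint. The URL shape, source name ("web_orders"), and
# object name ("order_events") are hypothetical placeholders.
import json
import urllib.request

TENANT_URL = "https://example.my.salesforce.com"  # hypothetical org URL
ACCESS_TOKEN = "..."  # obtained via OAuth; omitted here

def send_events(records: list[dict]) -> int:
    """POST a batch of records to a (hypothetical) ingestion source."""
    url = f"{TENANT_URL}/api/v1/ingest/sources/web_orders/order_events"
    body = json.dumps({"data": records}).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    status = send_events([{"order_id": "A-1001", "amount": 42.5}])
    print("ingestion status:", status)
```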
Oakland, United States
Hybrid
19-11-2025
Company Name
ideaHelix
Job Title
Senior Data Engineer (Snowflake, ETL) - Hybrid (Local to Bay Area)
Job Description
**Job Title:** Senior Data Engineer (Snowflake, ETL)

**Role Summary**

Design, develop, and maintain cloud-based data pipelines and storage solutions focused on Snowflake. Lead engineering activities that support analytics and data products, ensuring high data quality, security, and performance for enterprise stakeholders.

**Expectations**

- 5+ years of data engineering experience (12+ years preferred)
- Senior-level ownership of end-to-end data architecture, from ingestion to analytics layers
- Proficient in Snowflake, SQL, and at least one ETL/ELT framework (Informatica, dbt, Talend, or custom)

**Key Responsibilities**

- Build and optimize Snowflake data warehouses, applying dimensional modeling and performance tuning
- Develop, test, and deploy modular ETL pipelines, including stored procedures and reusable scripts
- Work with cross-functional teams to capture data requirements and deliver solutions aligned with business goals
- Enforce data governance, quality, and metadata management practices
- Monitor pipeline health, troubleshoot issues, and implement corrective actions to ensure reliability
- Collaborate on CI/CD workflows and version control of pipeline code (Git)

**Required Skills**

- Proficiency in Snowflake and SQL
- Experience with ETL/ELT tools (Informatica, dbt, Talend, or equivalent)
- Cloud platform exposure (AWS, Azure, or GCP)
- Knowledge of data warehousing concepts, dimensional modeling, and performance tuning
- Familiarity with version control (Git) and CI/CD for data pipelines
- Strong analytical, problem-solving, and written/verbal communication skills
- Experience with Python or other scripting languages (preferred)
- Basic knowledge of big data technologies (Hadoop, Spark, Kafka) and data visualization tools (Tableau, Power BI) is a plus

**Required Education & Certifications**

- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
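As an illustration of the day-to-day work this posting describes, here is a minimal sketch of one modular, idempotent ELT step run against Snowflake with the snowflake-connector-python package. The warehouse, database, and table names are hypothetical placeholders.

```python
# Minimal sketch of a modular ELT step against Snowflake using the
# snowflake-connector-python package. All object names (warehouse,
# database, schemas, tables) are hypothetical placeholders.
import os
import snowflake.connector

def run_elt_step() -> None:
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Fold raw events into a dimension-style table; MERGE keeps the
        # step idempotent, so reruns after a failure are safe.
        cur.execute("""
            MERGE INTO ANALYTICS.CORE.DIM_CUSTOMER AS tgt
            USING (
                SELECT customer_id, MAX(updated_at) AS updated_at
                FROM ANALYTICS.STAGING.RAW_CUSTOMER_EVENTS
                GROUP BY customer_id
            ) AS src
            ON tgt.customer_id = src.customer_id
            WHEN MATCHED THEN UPDATE SET tgt.updated_at = src.updated_at
            WHEN NOT MATCHED THEN INSERT (customer_id, updated_at)
                VALUES (src.customer_id, src.updated_at)
        """)
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    run_elt_step()
```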
San Francisco Bay Area, United States
Hybrid
Senior
24-11-2025
Company Name
ideaHelix
Job Title
Security Engineer (CCNA, CCNP, CCSA, PCNSA, PCNSE, PCCET, CompTIA) - Onsite - San Jose, CA
Job Description
**Job Title:** Security Engineer – Mergers & Acquisitions Integration

**Role Summary**

Design, deploy, and maintain security appliances and controls to integrate acquired organizations into the enterprise network, ensuring compliance with corporate security standards and business continuity.

**Expectations**

- 5+ years of hands-on security engineering.
- Strong analytical and project management skills.
- Ability to document and communicate complex security solutions to diverse stakeholders.
- Readiness to work independently or in global teams.

**Key Responsibilities**

- Design, configure, and launch perimeter and internal firewalls, EDR, DLP, and other security controls for site-by-site acquisitions.
- Develop and maintain bills of materials, order coordination, and inventory for new security hardware.
- Patch, register, and integrate new firewalls into central management consoles; deploy rule sets and ensure auditing/logging for the SOC.
- Conduct security posture assessments, vulnerability scanning, incident analysis, and performance monitoring.
- Produce and update detailed diagrams, configuration logs, and compliance documentation.
- Collaborate with networking, systems, and application teams on integration projects and capacity planning.
- Automate routine scanning, reporting, and vulnerability remediation tasks (see the sketch below).
- Create phased integration roadmaps (Day 1-30-60-90+) for security uplift and alignment with enterprise architecture.

**Required Skills**

- Deep knowledge of network concepts and security appliance configuration (Cisco, Check Point, Palo Alto).
- Proficiency with security monitoring, diagnostic, and threat assessment tools.
- Cloud networking, virtualization, and security standards expertise.
- Strong written and verbal communication, analytical problem-solving, and time-management abilities.

**Required Education & Certifications**

- Bachelor's degree in Computer Science, IT, Network Engineering, or a related field.
- Certifications: CCNA (Network) & CCNP (Security), CCSA, PCNSA, PCNSE, PCCET, CompTIA Security+.
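The automation bullet above is the most code-adjacent part of this role. Here is a minimal sketch, using only the Python standard library, of the kind of routine scan-and-report task it describes: the host names, port list, and allowed-port policy are hypothetical illustrations, not details from the posting.

```python
# Minimal sketch of a routine scan-and-report task: check a list of hosts
# for unexpectedly open TCP ports and print a simple report. Host names,
# port policy, and report format are hypothetical illustrations.
import socket
from datetime import datetime, timezone

HOSTS = ["fw-acq-site1.example.net", "fw-acq-site2.example.net"]  # hypothetical
PORTS_TO_CHECK = [22, 23, 443, 8080]
ALLOWED_PORTS = {22, 443}  # policy: anything else found open is flagged

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connect to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan_report() -> None:
    print(f"scan report {datetime.now(timezone.utc).isoformat()}")
    for host in HOSTS:
        for port in PORTS_TO_CHECK:
            if port_open(host, port) and port not in ALLOWED_PORTS:
                print(f"FLAG {host}:{port} open but not in allowed set")

if __name__ == "__main__":
    scan_report()
```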
San Jose, United States
On site
Mid level
08-12-2025
Company Name
ideaHelix
Job Title
Data Engineer (Snowflake) - Only W2 - Remote (PST Hours)
Job Description
**Job Title:** Data Engineer (Snowflake) – Remote (PST Hours)

**Role Summary**

Design, build, and maintain scalable data pipelines and Snowflake data models to support analytics and business intelligence. Optimize query performance and enforce data quality, security, and governance standards while collaborating with cross-functional teams.

**Expectations**

- 5–7 years of data engineering experience.
- Proficiency with Snowflake (data modeling, pipelines, SQL, performance tuning).
- Strong SQL and Python coding skills.
- Solid grasp of cloud platforms (AWS, GCP, Azure).
- Excellent communication, problem-solving, and teamwork.

**Key Responsibilities**

- Develop and optimize ETL workflows for large volumes of data.
- Model and maintain Snowflake schemas and tables.
- Tune Snowflake queries and storage to achieve optimal performance (see the sketch below).
- Collaborate with analytics, BI, and application teams to deliver data solutions.
- Implement data security, governance, and quality best practices.
- Diagnose and resolve pipeline incidents efficiently.
- Document and maintain architecture and data flow diagrams.

**Required Skills**

- Snowflake (data modeling, pipelines, SQL, performance tuning).
- SQL programming.
- Python scripting.
- Experience with ETL frameworks and data warehousing concepts.
- Cloud platform familiarity (AWS, GCP, Azure).
- Problem-solving and communication abilities.

**Required Education & Certifications**

- Bachelor's (or higher) in Computer Science, Engineering, or a related field from a reputable U.S. university (preferred).
- Certifications in Snowflake, AWS, GCP, or Azure are a plus.
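The query-tuning bullet above lends itself to a small example: a hedged sketch that pulls the slowest recent statements from Snowflake's `SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY` view as candidates for tuning. The 60-second threshold and connection details are assumptions, not requirements from the posting.

```python
# Minimal sketch: list the slowest queries of the past day from Snowflake's
# ACCOUNT_USAGE.QUERY_HISTORY view as tuning candidates. Connection details
# and the 60-second threshold are illustrative assumptions.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT query_id,
               warehouse_name,
               total_elapsed_time / 1000 AS elapsed_s,
               LEFT(query_text, 120) AS query_preview
        FROM snowflake.account_usage.query_history
        WHERE start_time >= DATEADD(day, -1, CURRENT_TIMESTAMP())
          AND total_elapsed_time > 60 * 1000   -- slower than 60 seconds
        ORDER BY total_elapsed_time DESC
        LIMIT 20
    """)
    for query_id, wh, elapsed_s, preview in cur:
        print(f"{elapsed_s:8.1f}s  {wh}  {query_id}  {preview}")
finally:
    conn.close()
```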
United States
Remote
Mid level
11-12-2025