Sigmaways Inc

www.sigmaways.com

6 Jobs

77 Employees

About the Company

We are one of the region's fastest-growing, multi-award-winning full-lifecycle product engineering service providers. We collaborate with businesses to deliver talent, products, and services faster. Since 2006, we have partnered with pioneering start-ups, innovative enterprises, and the world's largest technology brands. We have utilized our fine-tuned product engineering processes to develop best-in-class solutions for customers in technology, e-commerce, retail, financial services, banking, and consumer products sectors across North America, Europe, and Asia.

We are thrilled to be recognized by multiple media outlets as a fast-growing private company:
#19 in Silicon Valley (Silicon Valley Business Journal) http://bizj.us/ti1ad/i/10
#79 in the Bay Area (San Francisco Business Times) http://bizj.us/th0lj/i/22
#814 in the US (Inc Magazine) http://inc.com/profile/sigmaways

Listed Jobs

Company Name
Sigmaways Inc
Job Title
Senior DevOps Engineer
Job Description
**Job Title:** Senior DevOps Engineer

**Role Summary:** Design, implement, and maintain cloud‑based data and development platforms using AWS, Databricks, Immuta, Starburst, and Collibra. Lead the migration of data tooling to the cloud, optimize operational efficiency, reduce costs, and enable data‑driven insights for the organization.

**Expectations:**
- Minimum 5 years of technical DevOps experience, with at least 3 years managing complex systems.
- Proven expertise in AWS data infrastructure, CI/CD pipeline architecture, and data governance tools.
- Strong scripting and automation skills on Linux platforms.
- Ability to coach developers, collaborate cross‑functionally, and enforce security and operational best practices.

**Key Responsibilities:**
- Define, implement, and document CI/CD best practices for cloud‑based data platforms.
- Configure and maintain AWS data services (S3, Glue, EMR), Databricks, Starburst, Immuta, and Collibra.
- Develop and maintain scripts (Python, Bash) to automate deployment, monitoring, and data workflows.
- Evaluate and recommend new tools, technologies, and processes to improve speed, efficiency, and scalability.
- Coach application developers on building scalable, maintainable applications.
- Conduct spot checks, provide guidance to product and operations teams, and ensure adherence to security standards.
- Manage cross‑platform dependencies and ensure technology solutions align with business strategies.
- Support data access control and governance through Immuta and Collibra integration.

**Required Skills:**
- AWS cloud services (IAM, S3, Glue, EMR, CodePipeline).
- Databricks (or an equivalent data processing platform).
- Starburst (distributed SQL engine).
- Collibra (data governance) and Immuta (access control).
- CI/CD tooling: Jenkins, Maven, Git, Bitbucket, AWS CodeCommit.
- Artifact repositories: Nexus, Artifactory.
- Linux system administration and strong scripting (Python, Bash).
- Agile project tools (Jira, VersionOne).
- Automation, testing, and operational best practices.

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Information Systems, or a related technical discipline (or equivalent experience).
- Relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer, Databricks Certified).
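As a rough illustration of the deployment-automation scripting this role calls for, the sketch below validates a hypothetical CI/CD pipeline definition before rollout. The stage names, config shape, and checks are assumptions for the example, not part of the listing.

```python
# Hypothetical pipeline-config validator: checks that required stages exist
# and that no plaintext credentials are embedded in stage environments.
REQUIRED_STAGES = ["build", "test", "scan", "deploy"]  # assumed stage set

def validate_pipeline(config: dict) -> list[str]:
    """Return a list of problems found in a pipeline config; empty means OK."""
    problems = []
    stage_names = [s.get("name") for s in config.get("stages", [])]
    for required in REQUIRED_STAGES:
        if required not in stage_names:
            problems.append(f"missing stage: {required}")
    # Security check: flag keys that look like plaintext secrets.
    for stage in config.get("stages", []):
        for key in stage.get("env", {}):
            if "secret" in key.lower() or "password" in key.lower():
                problems.append(f"plaintext credential in stage {stage.get('name')}: {key}")
    return problems

config = {
    "stages": [
        {"name": "build", "env": {}},
        {"name": "test", "env": {"DB_PASSWORD": "hunter2"}},
        {"name": "deploy", "env": {}},
    ]
}
print(validate_pipeline(config))
```

A check like this would typically run as a pre-commit hook or an early pipeline stage, failing fast before any cloud resources are touched.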
San Francisco Bay, United States
Hybrid
Senior
29-10-2025
Company Name
Sigmaways Inc
Job Title
Information Security Analyst
Job Description
**Job Title:** Information Security Analyst

**Role Summary:** Analyze, detect, and respond to cybersecurity threats within a SOC environment using Splunk, SQL, and programming skills.

**Expectations:**
- Maintain up‑to‑date proficiency in Splunk and SPL for event monitoring.
- Demonstrate proactive threat identification and mitigation.
- Communicate findings clearly to technical and non‑technical stakeholders.

**Key Responsibilities:**
- Monitor and analyze security events with Splunk and SIEM tools.
- Craft SPL queries for dashboards, alerts, and advanced data analysis.
- Drive incident response aligned with the cyber kill chain methodology.
- Develop, maintain, and automate scripts/tools in Python or R.
- Query multi‑source datasets with SQL to uncover anomalies.
- Produce reports, dashboards, and slide decks using Excel and PowerPoint.
- Collaborate with Level 1 and Level 2 SOC teams on threat investigation and remediation.
- Present findings during briefings and technical discussions.

**Required Skills:**
- Splunk (SPL): advanced analytics and alerting.
- Experience with SOC operations and incident response.
- Programming: Python, R, and SQL for data extraction and automation.
- Data analytics knowledge for multi‑source data investigation.
- Proficiency in Microsoft Office (Excel, PowerPoint).
- Strong analytical, troubleshooting, and communication skills.
- US citizenship and the ability to obtain and maintain a high-level background investigation clearance.

**Required Education & Certifications:**
- Bachelor’s degree in Cybersecurity, Information Security, Computer Science, or a related field.
- Minimum 3 years of equivalent experience in information security or cybersecurity.
- Eligible for, or holding, a US Secret or higher security clearance.
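To illustrate the "query multi‑source datasets with SQL to uncover anomalies" responsibility, the sketch below runs a simple failed-login aggregation against an in-memory SQLite table. The schema, sample events, and threshold are hypothetical, not part of the listing.

```python
# Hypothetical anomaly query: surface source IPs with an unusually high
# count of authentication failures (threshold = 3).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE auth_events (src_ip TEXT, outcome TEXT)")
events = [
    ("10.0.0.5", "failure"), ("10.0.0.5", "failure"), ("10.0.0.5", "failure"),
    ("10.0.0.5", "failure"), ("10.0.0.9", "failure"), ("10.0.0.9", "success"),
]
conn.executemany("INSERT INTO auth_events VALUES (?, ?)", events)

suspicious = conn.execute(
    """
    SELECT src_ip, COUNT(*) AS failures
    FROM auth_events
    WHERE outcome = 'failure'
    GROUP BY src_ip
    HAVING failures >= 3
    ORDER BY failures DESC
    """
).fetchall()
print(suspicious)  # [('10.0.0.5', 4)]
```

In a SOC setting the same grouping-and-threshold pattern would typically be expressed as an SPL search feeding a Splunk alert rather than a standalone script.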
San Francisco Bay, United States
Hybrid
Junior
04-11-2025
Company Name
Sigmaways Inc
Job Title
Cybersecurity Risk Security Analyst
Job Description
**Job Title:** Cybersecurity Risk Security Analyst

**Role Summary:** Analyze, assess, and mitigate security risks in bank systems, processes, and cloud environments. Provide guidance on security policies, advise stakeholders, and develop remediation strategies to strengthen overall security posture.

**Expectations:**
* Deliver timely risk analyses, remediation plans, and policy guidance.
* Communicate findings to technical teams and executives with clarity.
* Manage multiple projects in a dynamic, fast‑paced setting.

**Key Responsibilities:**
1. Identify and mitigate security risks across bank systems and processes.
2. Interpret and apply security policies; recommend enhancements.
3. Advise business and technical partners on controls, procedures, and best practices.
4. Assess cloud (Azure, AWS) and on‑prem environments; recommend risk‑reducing controls.
5. Conduct security control assessments, document findings, and design actionable remediation plans.
6. Evaluate third‑party vendors for shared security responsibilities and associated risks.
7. Communicate risks and mitigation strategies to all stakeholder levels.
8. Collaborate cross‑functionally to promote secure operations and achieve measurable outcomes.

**Required Skills:**
* Kusto Query Language (KQL) proficiency for data analysis, log correlation, and threat detection.
* Strong knowledge of NIST, ISO 27001, and FedRAMP frameworks.
* Experience assessing and improving security posture in cloud and on‑prem environments.
* Ability to conduct comprehensive security control assessments and develop remediation plans.
* Excellent written and verbal communication; translate technical concepts into business insights.
* Strong collaboration, interpersonal, and stakeholder‑management skills.
* Organizational acumen and analytical thinking for managing multiple initiatives.

**Required Education & Certifications:**
* Bachelor’s degree in Cybersecurity, Information Security, Computer Science, or a related technical field (or equivalent experience).
* Minimum 3+ years of experience in cybersecurity, information security, or technology risk management.
* Preferred: certifications such as CISSP, CISA, or equivalent security certifications.
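As a sketch of turning security control assessment findings into an actionable remediation plan, the example below ranks findings by a simple likelihood-times-impact score. The 1–5 scales, sample findings, and scoring model are assumptions for illustration; the NIST SP 800-53 control IDs are real, but their likelihood/impact values here are invented.

```python
# Hypothetical risk-ranking of control assessment findings.
def risk_score(likelihood: int, impact: int) -> int:
    """Both inputs on a 1-5 scale; a higher score means remediate sooner."""
    return likelihood * impact

findings = [
    {"control": "AC-2 account management", "likelihood": 4, "impact": 5},
    {"control": "CM-6 configuration settings", "likelihood": 2, "impact": 3},
    {"control": "IA-5 authenticator management", "likelihood": 5, "impact": 5},
]

# Sort highest-risk first to produce the remediation order.
remediation_plan = sorted(
    findings,
    key=lambda f: risk_score(f["likelihood"], f["impact"]),
    reverse=True,
)
for f in remediation_plan:
    print(f["control"], risk_score(f["likelihood"], f["impact"]))
```

Real programs usually layer qualitative context (compensating controls, regulatory deadlines) on top of a numeric score like this before committing to a remediation sequence.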
San Francisco Bay, United States
On site
Junior
04-11-2025
Company Name
Sigmaways Inc
Job Title
Senior Big Data Engineer
Job Description
**Job Title:** Senior Big Data Engineer

**Role Summary:** Design, build, and maintain scalable, next‑generation data platforms using Spark, Hadoop, Hive, Kafka, and cloud services (AWS, Azure, Snowflake). Develop robust pipelines, optimize big data ecosystems, and collaborate across product, engineering, and data science teams to deliver actionable business insights.

**Expectations:**
- Minimum 7 years of experience designing, developing, and operating Big Data platforms, including Data Lakes, Operational Data Marts, and Analytics Data Warehouses.
- Bachelor’s degree in Computer Science, Software Engineering, or a related discipline.
- Proven proficiency in Spark, Hadoop, Hive, Kafka, and distributed data ecosystems.
- Strong background in ETL pipeline development with Hive, Spark, EMR, Glue, Snowflake, Cloudera/MR, NiFi.
- Solid understanding of SQL databases (PostgreSQL, MySQL/MariaDB).
- Deep knowledge of AWS and Azure cloud infrastructure, distributed systems, and reliability engineering.
- Experience with IaC and CI/CD (Terraform, Jenkins, Kubernetes, Docker).
- Good programming skills in Python and shell scripting.

**Key Responsibilities:**
- Design, develop, and support end‑to‑end data applications and platforms focused on Big Data/Hadoop, Python/Spark, and related technologies.
- Collaborate with leadership to conceptualize next‑generation data products and contribute to the overall technical architecture.
- Work closely with product management, business stakeholders, engineers, analysts, and data scientists to engineer solutions that meet business needs.
- Own components from inception through production release, ensuring quality, security, maintainability, and cost‑effectiveness.
- Recommend and enforce software engineering best practices with enterprise‑wide impact.
- Lead continuous process improvements, troubleshoot production issues, and mentor peers on best practices.
- Stay current with emerging technologies and rapidly adopt new tools and approaches.

**Required Skills:**
- Expertise in Spark, Hadoop/MR, Hive, Kafka, and distributed data ecosystems.
- Hands‑on experience building ingestion, validation, transformation, and consumption pipelines using Hive, Spark, EMR, Glue ETL/Catalog, Snowflake, Cloudera/Hadoop, NiFi.
- Strong SQL skills and experience with PostgreSQL, MySQL/MariaDB.
- Deep knowledge of AWS and Azure cloud services (compute, storage, networking, IAM, security).
- Proficiency with infrastructure‑as‑code (Terraform) and CI/CD pipelines (Jenkins).
- Containerization and orchestration skills (Docker, Kubernetes).
- Familiarity with REST APIs, data integration patterns, and microservices.
- Excellent programming skills in Python and shell scripting.
- Understanding of distributed systems, reliability engineering, and production best practices.

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
- Professional certifications (e.g., AWS Certified Big Data – Specialty, Azure Data Engineer Associate) are a plus but not mandatory.
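To illustrate the ingestion, validation, and transformation pipeline shape this role describes, the sketch below chains the three stages as plain-Python generators. In practice these stages would run on Spark, Hive, or Glue at scale; the record fields and quality rules here are hypothetical.

```python
# Hypothetical three-stage pipeline: ingest -> validate -> transform.
def ingest(raw_rows):
    """Parse raw CSV-like rows into dict records."""
    for row in raw_rows:
        user_id, amount = row.split(",")
        yield {"user_id": user_id.strip(), "amount": amount.strip()}

def validate(records):
    """Drop records that fail basic data-quality checks."""
    for rec in records:
        if rec["user_id"] and rec["amount"].replace(".", "", 1).isdigit():
            yield rec

def transform(records):
    """Cast types and add a derived field, as a transform stage would."""
    for rec in records:
        amount = float(rec["amount"])
        yield {"user_id": rec["user_id"], "amount": amount,
               "is_large": amount >= 100.0}

raw = ["u1, 250.00", "u2, not-a-number", "u3, 99.50"]
result = list(transform(validate(ingest(raw))))
print(result)
```

Keeping each stage a pure function over a stream is what makes the same pipeline shape portable from a local script to a distributed engine, where each stage maps onto a Spark transformation.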
San Francisco Bay, United States
Hybrid
Senior
09-11-2025