Synergy Technologies

www.synergytechs.net

3 Jobs

163 Employees

About the Company

Synergy Technologies: Your Gateway to Unparalleled Business Success! Elevate your business with our cutting-edge IT solutions today! As leaders in global IT solutions, we specialize in crafting impactful, tailor-made technological solutions. With our unique PPP model at the helm, we expertly navigate clients through their business journey. Backed by over 500 skilled professionals, we're dedicated to forging partnerships that harness the full power of IT innovation. Let's propel your business to new heights together!

Listed Jobs

Company Name
Synergy Technologies
Job Title
Platform Engineer
Job Description
**Job Title:** Platform Engineer

**Role Summary:** Design, build, and maintain analytics‑ready data pipelines and platform components, ensuring high data quality, observability, and seamless integration with BI tools. Own senior‑level data engineering responsibilities, delivering scalable solutions that support analytics, reporting, and decision‑making across the organization.

**Expectations:**
- Deliver end‑to‑end dbt models (staging, intermediate, mart) that meet analytics readiness.
- Implement robust data quality and observability frameworks.
- Collaborate with cross‑functional teams to prioritize and manage the data engineering backlog in an Agile environment.
- Communicate progress, risks, and technical decisions clearly to stakeholders.
- Provide comprehensive documentation for pipelines, models, and platform usage.

**Key Responsibilities:**
- Build and maintain Snowflake data warehouse environments and dbt pipelines.
- Develop and enforce metrics layers or semantic models for consistent reporting.
- Integrate and configure data observability tools (e.g., Monte Carlo, Great Expectations).
- Ensure data integrity through automated quality checks and lineage tracking.
- Support analytics and BI tools (Power BI, Tableau, Looker, etc.) by exposing clean, governed data marts.
- Manage sprint planning, backlog grooming, and issue tracking using JIRA.
- Produce and maintain technical and user documentation.
- Mentor junior engineers and foster best practices in data engineering.

**Required Skills:**
- Deep expertise in dbt (advanced level).
- Strong Snowflake experience (SnowPro level).
- Proficiency in CI/CD with GitHub Actions and AWS infrastructure.
- Knowledge of data observability and quality frameworks.
- Familiarity with metrics layers, semantic modeling, and BI integration.
- Excellent documentation and stakeholder communication skills.
- Agile delivery experience with JIRA and backlog management.

**Required Education & Certifications:**
- 6+ years in data engineering or analytics engineering with senior ownership.
- Bachelor’s or Master’s in Data Engineering, Computer Science, or a related field.
- Certifications: dbt Advanced, SnowPro, GitHub Actions Certification, AWS Certified Solutions Architect – Professional.
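A minimal sketch of the kind of automated quality checks the responsibilities above describe (null, uniqueness, and accepted‑value rules). The table and column names are hypothetical; in practice these would typically be expressed as dbt generic tests or Great Expectations suites rather than hand‑rolled Python.

```python
def check_not_null(rows, column):
    """Pass only if no row is missing a value in `column`."""
    return all(row.get(column) is not None for row in rows)

def check_unique(rows, column):
    """Pass only if every value in `column` is distinct."""
    values = [row[column] for row in rows]
    return len(values) == len(set(values))

def check_accepted_values(rows, column, accepted):
    """Pass only if every value in `column` is in the accepted set."""
    return all(row[column] in accepted for row in rows)

# Hypothetical "orders" mart used only for illustration.
orders = [
    {"order_id": 1, "status": "shipped"},
    {"order_id": 2, "status": "pending"},
]

report = {
    "order_id_not_null": check_not_null(orders, "order_id"),
    "order_id_unique": check_unique(orders, "order_id"),
    "status_accepted": check_accepted_values(
        orders, "status", {"shipped", "pending", "cancelled"}
    ),
}
```

Running the checks as a named report mirrors how observability tools surface per‑test pass/fail results for lineage and alerting.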
Houston, United States
Remote
Mid level
23-01-2026
Company Name
Synergy Technologies
Job Title
Direct Client :: Need Certified Salesforce Technical & Solutions Architect with Copado & MuleSoft Experience :: Durham, NC (Hybrid)
Job Description
**Job Title:** Certified Salesforce Technical & Solutions Architect (Copado & MuleSoft Experience)

**Role Summary:** Lead the design and delivery of secure, scalable Salesforce solutions and API‑based integrations for an Early Education Integration System. Bridge business strategy with enterprise architecture, guide development teams, and ensure alignment with data security, performance, and compliance standards.

**Expectations:**
- Minimum 7 years of experience in Salesforce solution architecture, development, and administration.
- Proven expertise in Apex, LWC, Aura, Flows, SOQL/SOSL, and platform‑wide configuration.
- 5+ years of experience with DevOps, CI/CD, version control, and Agile delivery.
- Strong knowledge of MuleSoft integration patterns, Copado release management, and data migration/ETL strategies.
- Familiarity with Shield, Lightning, mobile solutions, and WCAG 2.1 accessibility compliance.

**Key Responsibilities:**
1. Architect end‑to‑end Salesforce and integration solutions, translating business scope into technical design.
2. Design and lead implementation of complex API‑driven integrations between Salesforce and external systems.
3. Ensure solutions meet data security, performance, backup, and restore standards.
4. Identify and mitigate architectural risks; advise when to use standard Salesforce functionality vs. custom development.
5. Produce high‑quality technical documentation, data models, and solution diagrams.
6. Conduct code reviews, design workshops, and release‑planning sessions.
7. Support Agile teams through backlog refinement, user‑story decomposition, and acceptance‑criteria definition.
8. Assist in Salesforce environment setup, CI/CD pipeline configuration, and release‑management processes.

**Required Skills:**
- Salesforce architecture and advanced development (Apex, LWC, Aura, Flows).
- Object‑oriented design patterns (Apex, Java, JavaScript, C#, Ruby).
- Integration expertise (MuleSoft, API design, ETL tools).
- DevOps: Copado, GitHub/Bitbucket, CI/CD pipelines, version control.
- Data migration, backup, restore, and Shield configuration.
- Agile methodologies and stakeholder collaboration.
- Accessibility standards (WCAG 2.1) and mobile‑first architecture.

**Required Education & Certifications:**
- Salesforce Technical Architect / Application Architect certification.
- Salesforce Integration Architect certification.
- Salesforce Platform Developer II certification.
- Salesforce Certified Platform App Builder certification.
- Copado certification.
- MuleSoft Certified Developer – Integration Architect I & II.
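One recurring concern in the API‑driven integrations described above is handling transient failures between Salesforce and external systems. The sketch below shows a generic retry‑with‑exponential‑backoff wrapper; it is an illustration of the pattern, not Salesforce or MuleSoft code, and the `call` parameter stands in for any endpoint invocation.

```python
import time

def backoff_delays(retries, base=0.5, cap=8.0):
    """Exponential backoff schedule: base * 2**n seconds, capped at `cap`."""
    return [min(base * (2 ** n), cap) for n in range(retries)]

def call_with_retry(call, retries=3, sleep=time.sleep):
    """Invoke `call`; on a transient ConnectionError, wait and retry.

    Re-raises the last error once the retry budget is exhausted.
    """
    last_error = None
    for delay in backoff_delays(retries):
        try:
            return call()
        except ConnectionError as error:  # treated as transient here
            last_error = error
            sleep(delay)
    raise last_error
```

Injecting `sleep` as a parameter keeps the wrapper unit‑testable without real waits, which matters for the CI/CD pipelines the role calls for.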
Durham, United States
Hybrid
Senior
06-02-2026
Company Name
Synergy Technologies
Job Title
Need AI/ML Architect with Agentic AI, MLOps & Python Experience :: Onsite Position :: Locations (GA, TX, IL, NJ, NY)
Job Description
**Job Title:** AI/ML Architect

**Role Summary:** Senior AI/ML Architect with 10–12+ years of experience designing, developing, and deploying production‑grade machine learning solutions across fintech and regulated environments. Leads the end‑to‑end model lifecycle, from data preparation to scalable deployment, while ensuring compliance, explainability, and robust MLOps practices.

**Expectations:**
- Deliver high‑impact AI solutions that integrate classical ML, deep learning, and large language model (LLM) capabilities.
- Own the full model lifecycle: data engineering, feature extraction, model training, evaluation, deployment, monitoring, and retraining.
- Work cross‑functionally with engineering, product, data, and compliance teams to meet business and regulatory objectives.
- Demonstrate leadership in cloud‑native MLOps, container orchestration, and CI/CD automation.
- Champion Responsible AI principles, model explainability, and safety guardrails.

**Key Responsibilities:**
1. Build, train, and evaluate ML and deep learning models for classification, prediction, anomaly detection, and NLP.
2. Implement scalable pipelines: data ingestion, feature engineering, vector embeddings, and inference workflows.
3. Develop and integrate LLM‑based capabilities (embeddings, retrieval‑augmented generation, fine‑tuning).
4. Deploy models with Docker, Kubernetes, and cloud services (Azure, AWS, GCP).
5. Establish MLOps pipelines: CI/CD, experiment tracking, model registry, drift detection, and automated retraining.
6. Create reproducible, versioned data pipelines using Spark or Airflow.
7. Apply explainability techniques (SHAP, LIME) and Responsible AI guidelines to model outputs.
8. Ensure compliance with SOC 2, PCI‑DSS, and other relevant regulatory frameworks.

**Required Skills:**
- Deep expertise in supervised/unsupervised ML, deep learning, and NLP.
- Advanced Python (NumPy, Pandas, scikit‑learn, PyTorch or TensorFlow).
- Proficiency in SQL, Spark, Airflow, and data‑engineering workflows.
- Hands‑on experience with LLMs, vector databases, embeddings, and RAG systems.
- Experience deploying models with Docker, Kubernetes, GPU acceleration, and CI/CD pipelines.
- Cloud AI tooling: Azure ML, Amazon SageMaker, or GCP AI Platform.
- Model monitoring, drift detection, and retraining strategies.
- Familiarity with model explainability (SHAP, LIME) and Responsible AI practices.

**Required Education & Certifications:**
- Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, or a related field.
- 10–12+ years of professional experience in AI/ML engineering.
- Relevant certifications (e.g., AWS Certified Machine Learning, Azure AI Engineer Associate, Google Cloud Professional Machine Learning Engineer) are preferred but not mandatory.
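As a concrete instance of the drift detection mentioned under MLOps, the sketch below computes the Population Stability Index (PSI), a common histogram‑based drift metric comparing a baseline feature distribution against live data. The 10‑bin layout and the conventional 0.2 alert threshold are assumptions for illustration, not details from the posting.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index: sum((a_i - e_i) * ln(a_i / e_i)) over bins.

    `expected` is the baseline sample, `actual` the live sample; larger
    values indicate a bigger shift between the two distributions.
    """
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # guard against a degenerate range

    def bin_fractions(xs):
        counts = [0] * bins
        for x in xs:
            i = min(int((x - lo) / width), bins - 1)
            counts[i] += 1
        # a small floor keeps the log defined for empty bins
        return [max(c / len(xs), 1e-6) for c in counts]

    e, a = bin_fractions(expected), bin_fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [i / 10 for i in range(100)]   # synthetic baseline feature
drifted = [x + 5.0 for x in baseline]     # same shape, shifted mean
```

Identical distributions score (near) zero; a PSI above roughly 0.2 is commonly treated as significant drift warranting the retraining step the role describes.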
Dallas, United States
On site
Senior
05-03-2026