Archigos Solutions


www.ArchigosSolutions.com

2 Jobs

2 Employees

About the Company

At Archigos, our mission is to provide top-notch IT consulting services that empower businesses to leverage technology effectively. We focus on SAP, Data Analytics, Infrastructure & Security, and other global IT solutions to ensure your success.

About Us

Empowering Businesses Through Smart Technology Solutions

At Archigos Solutions we specialize in delivering innovative IT consulting services that drive growth, efficiency, and transformation. With a team of seasoned professionals and a passion for problem-solving, we partner with businesses of all sizes to navigate the complexities of today's digital landscape. Founded on the belief that technology should work for you, not against you, we offer tailored solutions across SAP, data & AI, cloud computing, cybersecurity, IT strategy, infrastructure, software development, and more. Whether you're a startup building from the ground up or an enterprise looking to modernize, our mission is to help you leverage technology to achieve your goals.

Listed Jobs

Company Name
Archigos Solutions
Job Title
Google Cloud Platform- GCP Data Architect
Job Description
**Job Title:** GCP Data Architect

**Role Summary**

Design and implement enterprise data solutions integrating Google Cloud Platform (GCP) and SAP systems. Focus on data architecture, governance, pipeline development, and cloud-based system migrations to deliver scalable, secure, and cost-effective data ecosystems.

**Expectations**

Combine expertise in GCP data platforms, SAP integration, and data engineering to bridge enterprise SAP environments with cloud solutions. Requires proficiency in analytics, system migrations, and DevOps practices for data workflows.

**Key Responsibilities**

1. Architect and optimize data strategies, governance frameworks, and compliance policies (GDPR, HIPAA, PIPEDA).
2. Design data models (star, snowflake, data vault) for analytics and operational workloads, including S/4 CDS view implementation on BigQuery.
3. Build GCP-based solutions using BigQuery, Dataflow, Dataproc, and Cloud Storage; enforce cost-optimization strategies.
4. Develop ETL/ELT pipelines (Airflow, Dataflow) and integrate data from SAP BW, HANA, and BusinessObjects using tools such as SAP SLT, Google Cortex, or Boomi.
5. Execute migrations of on-premise SAP systems (BW, BusinessObjects) to GCP, ensuring data integrity and minimal downtime.
6. Implement CI/CD pipelines for data workflows via GitHub Actions, Cloud Build, and Terraform, using infrastructure-as-code principles.

**Required Skills**

- GCP expertise: BigQuery, Dataflow, Cloud Storage, Pub/Sub.
- SAP: data extraction, modeling, and integration (BW, HANA, ABAP, CDS views).
- Data engineering: SQL/Python for analytics and pipeline automation; ETL/ELT orchestration.
- Integration tools: Apache Airflow, Boomi, SAP SLT, or MuleSoft.
- Data governance: frameworks, security policies, and compliance strategies.

**Required Education & Certifications**

- Bachelor's degree in Computer Science, Engineering, or a related field.
- Relevant Google Cloud certifications (e.g., Google Cloud Data Engineer, Data Architect).
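To illustrate the star-schema modeling responsibility above, here is a minimal, hypothetical Python sketch that splits flat SAP-style extract rows into a fact table plus a deduplicated dimension table; all record and field names are invented for illustration and are not Archigos specifics.

```python
# Minimal sketch: normalize flat SAP-style extract rows into a star schema
# (one fact table + one customer dimension). All field names are hypothetical.

def to_star_schema(rows):
    """Split flat sales rows into a dimension dict and a fact list."""
    dim_customer = {}  # customer_id -> dimension attributes (deduplicated)
    fact_sales = []    # slim fact rows referencing the dimension key
    for r in rows:
        dim_customer[r["customer_id"]] = {
            "name": r["customer_name"],
            "country": r["country"],
        }
        fact_sales.append({
            "customer_id": r["customer_id"],  # foreign key into the dimension
            "amount": r["amount"],
            "doc_date": r["doc_date"],
        })
    return dim_customer, fact_sales

flat = [
    {"customer_id": 1, "customer_name": "Acme", "country": "CA",
     "amount": 100.0, "doc_date": "2025-01-15"},
    {"customer_id": 1, "customer_name": "Acme", "country": "CA",
     "amount": 250.0, "doc_date": "2025-02-02"},
]
dims, facts = to_star_schema(flat)
# The dimension ends up with one row per customer; facts keep one row per document.
```

In a real BigQuery deployment the dimension and fact outputs would land in separate partitioned tables, but the normalization step is the same idea.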
Toronto, Canada
Remote
19-10-2025
Company Name
Archigos Solutions
Job Title
Google Cloud Platform (GCP) Data Lead
Job Description
**Job Title:** Google Cloud Platform (GCP) Data Lead

**Role Summary:**

Lead the design, implementation, and governance of enterprise-grade data solutions on GCP, integrating SAP data from ERP, BW, and HANA systems. Manage a small data engineering team, set technical standards, and collaborate with stakeholders to deliver scalable, cost-effective analytics platforms.

**Expectations:**

- Own end-to-end SAP-to-GCP data migration and orchestration.
- Mentor and grow a data engineering squad while maintaining delivery quality and timelines.

**Key Responsibilities:**

- Mentor and guide a 2–5 member data engineering team.
- Define and enforce engineering standards, coding guidelines, and best practices.
- Design conceptual, logical, and physical data models (star, snowflake, data vault).
- Architect and build ETL/ELT pipelines with Cloud Composer (Airflow), Dataflow, and Dataproc.
- Extract, transform, and load data from SAP BW, HANA, and BusinessObjects using SAP SLT or the Google Cortex Framework; integrate via Boomi as needed.
- Develop complex SQL queries and Python scripts for analytics, automation, and performance tuning.
- Lead on-prem-to-cloud migrations of SAP datasets, ensuring data integrity and minimal downtime.
- Implement CI/CD for data workflows using GitHub Actions, Cloud Build, and Terraform (infrastructure as code).
- Optimize GCP resource utilization and cost structure.
- Conduct code reviews, provide technical guidance, and remove blockers in collaboration with project managers and BI teams.

**Required Skills:**

- 8+ years with GCP (BigQuery, Composer, Cloud Storage, Pub/Sub, Dataflow).
- 2–3 years of team leadership/mentoring.
- Advanced SQL and Python programming.
- SAP data extraction, modeling, and integration expertise.
- Knowledge of data governance frameworks and security best practices.
- Familiarity with DevOps for data (CI/CD, infrastructure as code).
- Experience with the Google Cortex Framework for SAP-GCP integrations.

**Required Education & Certifications:**

- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Google Cloud Certified – Professional Data Engineer (preferred).
- SAP integration or data modeling certifications (an advantage).
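The orchestration work described above relies on dependency-ordered execution: Cloud Composer (managed Airflow) runs pipeline tasks only after their upstream tasks finish. As a rough, hypothetical sketch of that ordering idea in plain Python (the task names below are invented, not an actual Archigos pipeline):

```python
# Sketch of the dependency ordering a Cloud Composer (Airflow) DAG enforces,
# using only the standard library. Task names are hypothetical illustrations.
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks that must finish first.
deps = {
    "extract_sap_bw": set(),
    "land_to_gcs": {"extract_sap_bw"},
    "load_bigquery": {"land_to_gcs"},
    "refresh_marts": {"load_bigquery"},
}

# static_order() yields tasks so that every upstream precedes its dependents.
order = list(TopologicalSorter(deps).static_order())
# For this linear chain: extract -> land -> load -> refresh
```

In an actual Composer deployment the same dependencies would be declared with Airflow operators and `>>` chaining; the scheduling logic is the same topological idea.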
Toronto, Canada
Hybrid
Senior
30-11-2025