Job Specifications
Google Cloud Platform (GCP) Data Lead
Location: Toronto, ON (Hybrid)
Duration: 12 months
Role Overview
We are seeking a highly skilled Google Cloud Platform (GCP) Data Lead with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.
Skills
8+ years of proven experience with GCP BigQuery, Composer, Cloud Storage, Pub/Sub, and Dataflow.
Minimum 2–3 years of leadership experience mentoring small to mid-size data engineering teams.
Strong SQL and Python programming skills.
Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
Knowledge of data governance frameworks and security best practices.
Familiarity with DevOps tools for data workflows, such as GitHub Actions, Cloud Build, and Terraform.
Understanding of the Google Cortex Framework for SAP-GCP integrations.
Key Responsibilities
1. Mentoring
Lead and mentor a team of data engineers in building ETL/ELT pipelines from SAP and other ERP sources into GCP.
Set engineering standards, best practices, and coding guidelines.
Provide technical direction, code reviews, and support for complex data solutions.
Collaborate with project managers to provide estimates, track progress, and remove roadblocks, ensuring timely completion of work.
Collaborate with BI teams and data analysts to enable reporting solutions.
2. Data Architecture & Modeling
Design conceptual, logical, and physical data models to support analytics and operational workloads.
Implement star, snowflake, and data vault models for analytical systems.
3. Google Cloud Platform Expertise
Design data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
Implement cost optimization strategies for GCP workloads.
4. Data Pipelines & Integration
Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
Integrate data from multiple systems, including SAP BW, SAP HANA, and SAP BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
Leverage integration tools such as Boomi for system interoperability.
5. Programming & Analytics
Develop complex SQL queries for analytics, transformations, and performance tuning.
Build automation scripts and utilities in Python.
6. System Migration
Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
Manage migration of SAP datasets to GCP ensuring data integrity and minimal downtime.
7. DevOps for Data
Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
Apply infrastructure-as-code principles for reproducible and scalable deployments.
About the Company
At Archigos, our mission is to provide top-notch IT consulting services that empower businesses to leverage technology effectively. We focus on SAP, Data Analytics, Infrastructure & Security, and other global IT solutions to ensure your success. At Archigos Solutions, we specialize in delivering innovative IT consulting services that drive growth, efficiency, and transformation.