Q1 Technologies, Inc.

GCP Big Data Engineer - 3 Days a Week Onsite in Kitchener, Ontario

Hybrid

Kitchener, Canada

Freelance

05-03-2026


Skills

Big Data, Data Engineering, Google Cloud Platform (GCP), BigQuery, Hadoop, Kafka, CI/CD Pipelines, Terraform, Infrastructure as Code, Git, Monitoring, Architecture, Analytics

Job Specifications

Role: GCP Big Data Engineer

Experience Required: 8–10 years

Primary Skillset: Big Data & Hadoop Ecosystems, Google Data Engineering

Location: Kitchener, Ontario (3 days a week onsite)

Overview

We are seeking an experienced GCP Big Data Engineer to design, build, and optimize scalable, secure, and high‑performance data pipelines on the Google Cloud Platform (GCP). This role requires expertise in modern data engineering tools, including BigQuery, Dataflow, Dataproc, Cloud Composer, and Pub/Sub, as well as close collaboration with cross‑functional teams to deliver end‑to‑end data solutions.

Key Responsibilities:

Data Pipeline Design & Development

Architect and build scalable, secure, and high‑performance data pipelines on GCP.

Design and optimize ETL/ELT workflows using Cloud Composer, Dataflow, Dataproc, and BigQuery.

Implement data ingestion frameworks supporting batch and streaming pipelines via Pub/Sub, Kafka, and Dataflow.

BigQuery Data Modeling & Optimization

Model, partition, cluster, and optimize BigQuery datasets for analytics workloads.

Improve query performance, storage efficiency, and reliability for large‑scale data environments.

Collaboration & Solution Delivery

Work closely with data scientists, solution architects, and business stakeholders to build end‑to‑end data solutions.

Translate business requirements into scalable technical designs.

Data Quality, Monitoring & Automation

Establish monitoring, validation, and automated checks to ensure data quality and reliability.

Build automated workflows to improve pipeline resilience and reduce manual intervention.

CI/CD & Infrastructure Automation

Implement CI/CD pipelines for data workflows using Cloud Build, Git, and Terraform.

Automate infrastructure provisioning and deployment using Infrastructure as Code (IaC) best practices.

Performance, Cost, & Security Management

Optimize cost efficiency, scalability, and performance across GCP data services.

Apply GCP security best practices, enforce IAM policies, and ensure compliance with organizational standards.

Required Skills & Expertise:

Strong experience with Big Data & Hadoop ecosystems.

Hands‑on expertise with GCP tools: BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub.

Deep knowledge of ETL/ELT development, pipeline architecture, and performance optimization.

Experience with real‑time and batch data processing (Pub/Sub, Kafka, Dataflow).

Proficiency with Terraform, Git, and Cloud Build for CI/CD and automation.

Strong understanding of data modeling, partitioning, clustering, and query optimization.

Experience ensuring data security, governance, and compliance on GCP.

About the Company

Q1 is a professional organization that delivers quality products and services. Q1 specializes in Software Development, Business Consulting, and Technology Integration. We provide end-to-end integrated solutions that include professional services, functional and technical support, and ongoing maintenance using our on-site, off-site, and off-shore resources. We offer a comprehensive range of managed services for enterprise business and technology solutions with a team of highly experienced professionals. Q1 is also a Value Added...