TechDoQuest

www.techdoquest.com

4 Jobs

91 Employees

About the Company

TechDoQuest is a modern IT consulting and delivery partner helping businesses scale faster with smarter technology and global execution. With operations across Canada, the U.S., and India, we specialize in:

- IT Consulting & Advisory
- Custom Software Development
- Cloud & DevOps Services
- Building and Managing Global Capability Centers (GCCs)

At TechDoQuest, we combine strategic insight with hands-on execution, delivering lean, cost-conscious, and scalable solutions that drive measurable business outcomes. Whether you're a startup seeking tech acceleration or an enterprise optimizing global delivery, we bring the right talent, tools, and technology to make it happen.

Let's build Smarter. Globally. Together.

Listed Jobs

Company Name
TechDoQuest
Job Title
Technical Lead – Azure Data Engineer
Job Description
Job Title: Technical Lead – Azure Data Engineer

Role Summary:
Lead the design, development, and maintenance of end-to-end data solutions on Microsoft Azure, ensuring high-performance, scalable pipelines and data warehouses while mentoring and guiding a team of data engineers.

Expectations:
- Minimum 5 years of professional data engineering experience, with at least 2 years in a technical leadership role.
- Extensive hands-on experience with Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Data Warehouse, and Synapse Analytics.
- Strong SQL and relational data-modeling skills.
- Proficiency in Python or Scala for data transformation and processing pipelines.
- Demonstrated experience building and maintaining BI reports using Power BI or Tableau.
- Ability to translate business requirements into robust, cost-effective Azure data architectures.

Key Responsibilities:
- Design, build, and optimize Azure-native data pipelines (ADF, Databricks, Data Lake); see the sketch below for the kind of transformation work involved.
- Develop and maintain Synapse workspaces and Azure SQL Data Warehouse schemas.
- Implement, test, and monitor data quality, latency, and reliability.
- Provide technical guidance, code reviews, and mentorship to junior engineers.
- Collaborate with data scientists, analysts, and stakeholders to define KPIs and reporting needs.
- Drive continuous improvement of data engineering processes, tooling, and best practices.
- Manage security, governance, and compliance requirements for data assets.

Required Skills:
- Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Synapse Analytics, Azure SQL Data Warehouse.
- SQL (T-SQL, U-SQL) and relational data modeling.
- Python or Scala for ETL/ELT pipelines.
- BI reporting with Power BI, Tableau, or similar tools.
- Strong understanding of data integration patterns, data governance, and DevOps practices on Azure.
- Excellent problem-solving, communication, and teamwork abilities.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field (Master's preferred).
- Microsoft Certified: Azure Data Engineer Associate (DP-203) or equivalent.
- Preferred: Azure Solutions Architect Expert or Azure DevOps Engineer certifications.
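A minimal illustration of the pipeline work named in the responsibilities above: a PySpark job that reads raw files from a data lake, cleanses them, and writes a curated aggregate. This is a sketch only; it assumes a Databricks-style Spark environment with access to ADLS Gen2, and the storage paths, container names, and column names are hypothetical.

```python
# Sketch only: hypothetical paths, containers, and column names.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

# Read raw CSV files landed in the data lake (hypothetical location).
raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/2025/")
)

# Basic cleansing plus a daily aggregate per region.
daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Write the curated output as partitioned Parquet for downstream Synapse / Power BI use.
(
    daily.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/sales_daily/")
)
```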
Brampton, Canada
On site
Senior
29-10-2025
Company Name
TechDoQuest
Job Title
Data Engineer
Job Description
**Job Title:** Data Engineer – Process Mining

**Role Summary:**
Data Engineer responsible for extracting, transforming, and loading data from cloud and on-premises systems (Azure, Oracle, SAP, Salesforce, etc.) into Celonis Process Mining. Validates data integrity, designs KPI dashboards, and develops automated anomaly-detection workflows to support continuous process improvement.

**Expectations:**
- Act as a technical bridge between source systems and the process mining platform.
- Deliver accurate, high-quality data pipelines on schedule.
- Collaborate with business, analytics, and IT stakeholders to define priorities and translate requirements into data solutions.
- Provide actionable insights that drive process optimization and efficiency.

**Key Responsibilities:**
- Design and maintain ETL/ELT pipelines for multiple data sources.
- Validate source-to-target data consistency (counts, values); see the sketch below.
- Create KPIs, dashboards, and reports (Tableau, Power BI) aligned with business goals.
- Develop workflows in Celonis Action Engine to trigger real-time preventive or corrective actions.
- Map and model process flows, identifying non-value-added steps for lean improvement.
- Monitor processes, detect anomalies, and recommend corrective actions.
- Work with ERP/CRM source systems (SAP, Oracle, Dynamics, Salesforce).
- Participate in cross-functional meetings to gather requirements and deliver technical solutions.

**Required Skills:**
- 1–2 years of Celonis Process Mining experience (Data Engineer/Data Scientist).
- ETL/ELT proficiency; SQL/PQL (joins, unions, window functions).
- BI tools: Tableau, Power BI.
- Basic Python scripting (NumPy, pandas, matplotlib, scikit-learn).
- Process improvement methodology (Lean, Six Sigma basics).
- Strong analytical, problem-solving, and communication skills.
- Self-starter with the ability to learn new technologies and support global rollouts.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Certifications in SQL, ETL, or Process Mining (e.g., Celonis Advanced Data Engineer certification) are a plus.
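A minimal illustration of the source-to-target validation mentioned under Key Responsibilities, written with pandas (listed under Required Skills). The table and column names are hypothetical; real checks would run against source-system extracts and the loaded process mining data model.

```python
# Sketch only: hypothetical case/value columns; real data would come from
# source-system extracts and the loaded process mining data model.
import pandas as pd


def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str, value_col: str) -> dict:
    """Compare row counts, value sums, and missing keys between source and target."""
    return {
        "row_count_delta": len(source) - len(target),
        "value_sum_delta": float(source[value_col].sum() - target[value_col].sum()),
        "missing_keys": sorted(set(source[key]) - set(target[key])),
    }


if __name__ == "__main__":
    src = pd.DataFrame({"case_id": [1, 2, 3], "net_value": [100.0, 250.0, 75.0]})
    tgt = pd.DataFrame({"case_id": [1, 2], "net_value": [100.0, 250.0]})
    print(reconcile(src, tgt, key="case_id", value_col="net_value"))
    # {'row_count_delta': 1, 'value_sum_delta': 75.0, 'missing_keys': [3]}
```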
Canada
Remote
Fresher
08-12-2025
Company Name
TechDoQuest
Job Title
OpenText Exstream Developer
Job Description
Job Title: OpenText Exstream Developer

Role Summary:
Senior consultant responsible for designing, developing, and maintaining OpenText Exstream solutions. Provides best-practice guidance for technical and functional upgrades, migrations, and enhancements, ensuring high-quality, high-volume customer communications in a cloud-native environment.

Expectations:
- Deliver end-to-end Exstream projects on time and within scope.
- Act as a subject-matter expert, driving architecture and design decisions.
- Ensure robust test coverage and efficient production support.
- Collaborate closely with business and technical stakeholders to translate requirements into technical specifications.

Key Responsibilities:
- Advise on best practices for OT Exstream technical and functional changes, upgrades, and migrations.
- Design and develop Exstream templates for high-volume customer communications.
- Implement enhancements and support production documents.
- Gather and analyze requirements from business and technical teams.
- Perform unit testing, troubleshooting, and issue resolution in production environments.
- Deploy and maintain Exstream as a cloud-native solution, leveraging containerization technologies (Docker, Kubernetes).

Required Skills:
- 9+ years of hands-on experience with OpenText Exstream (v16 or higher).
- Proven ability to set up OT Exstream deployments in a cloud-native architecture.
- Strong understanding of containerization concepts, Docker, and Kubernetes.
- Expertise in the Composition Center and Empower modules.
- Proficient in XML, XSLT, and data mapping (see the sketch below).
- Experience with batch and interactive processing workflows.
- Excellent problem-solving and communication skills.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- OpenText Exstream certification preferred (e.g., OpenText Exstream Developer).
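A small illustration of the XML data-mapping skill listed above, using Python's standard library. The payload structure and field names are hypothetical and are not tied to any specific Exstream data model.

```python
# Sketch only: hypothetical statement XML; not an actual Exstream data model.
import xml.etree.ElementTree as ET

SAMPLE = """<statement>
  <customer id="C-1001">
    <name>Jane Doe</name>
    <balance currency="CAD">1520.75</balance>
  </customer>
</statement>"""


def map_statement(xml_text: str) -> dict:
    """Flatten a statement payload into a flat record for template data mapping."""
    root = ET.fromstring(xml_text)
    customer = root.find("customer")
    balance = customer.find("balance")
    return {
        "customer_id": customer.get("id"),
        "name": customer.findtext("name"),
        "balance": float(balance.text),
        "currency": balance.get("currency"),
    }


if __name__ == "__main__":
    print(map_statement(SAMPLE))
    # {'customer_id': 'C-1001', 'name': 'Jane Doe', 'balance': 1520.75, 'currency': 'CAD'}
```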
Toronto, Canada
On site
Senior
20-01-2026
Company Name
TechDoQuest
Job Title
Site Reliability Engineer (SRE)
Job Description
**Job Title:** Site Reliability Engineer (SRE)

**Role Summary:**
Responsible for ensuring the reliability, scalability, and performance of production systems. Owns end-to-end automation, monitoring, incident response, and continuous improvement across cloud and container platforms.

**Expectations:**
- 3+ years in SRE, DevOps, or a related reliability role.
- Proven experience with multi-cloud (AWS, Azure, GCP) and Kubernetes-based environments.
- Demonstrated ownership of incident management, root-cause analysis, and post-mortem processes.

**Key Responsibilities:**
- Design, implement, and maintain infrastructure automation using Ansible, Terraform, or equivalent.
- Build and operate end-to-end monitoring, observability, and alerting with Dynatrace, Moogsoft, and the Elastic Stack.
- Enhance CI/CD pipelines (Jenkins, GitHub Actions, UrbanCode Deploy) and deployment automation via Helm, Docker, or Kubernetes manifests.
- Lead incident response, conduct root-cause analysis, and drive corrective-action cycles.
- Manage cloud resources, container orchestration, and distributed systems at scale; perform capacity planning and load testing.
- Enforce security, compliance, and governance (IAM, encryption, Vault, SOC 2, etc.).
- Document runbooks, SLOs/SLAs, and operational playbooks (see the error-budget sketch below).

**Required Skills:**
- **Automation & Scripting:** Ansible, Python, PowerShell.
- **Monitoring & Observability:** Dynatrace, Moogsoft, Elastic Stack (Elasticsearch, Logstash, Kibana).
- **Incident Management:** ServiceNow ticketing, CMDB.
- **Cloud Platforms:** AWS, Azure, or GCP (compute, storage, serverless, networking).
- **Containers & Orchestration:** Kubernetes/OpenShift, Docker, Helm.
- **Databases & Storage:** SQL Server, NoSQL (Cassandra, Redis) with replication/HA.
- **Security & Compliance:** IAM, encryption, Vault, vulnerability scanning, SOC 2.
- **CI/CD & DevOps:** Jenkins, GitHub Actions, UrbanCode Deploy, Artifactory/Nexus, Git branching strategies.
- **Performance Engineering:** JMeter load testing, capacity planning, SLI/SLO definition.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Engineering, or equivalent.
- Optional certifications: AWS Certified Solutions Architect, Azure Solutions Architect, GCP Professional Cloud Architect, Certified Kubernetes Administrator (CKA), Ansible Tower Certified, HashiCorp Certified: Terraform Associate, ITIL Foundation.
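A minimal illustration of the SLO/error-budget concept referenced in the responsibilities above: computing how much of an error budget remains for a given availability target. The request counts are sample values, not production data.

```python
# Sketch only: sample request counts; the SLO target and window are illustrative.

def error_budget_remaining(total_requests: int, failed_requests: int, slo_target: float) -> float:
    """Return the fraction of the error budget still unspent for the window."""
    allowed_failures = total_requests * (1.0 - slo_target)
    if allowed_failures == 0:
        return 0.0
    return max(0.0, 1.0 - failed_requests / allowed_failures)


if __name__ == "__main__":
    # 99.9% availability SLO, 1,000,000 requests in the window, 400 failed.
    remaining = error_budget_remaining(1_000_000, 400, 0.999)
    print(f"Error budget remaining: {remaining:.1%}")  # 60.0%
```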
Toronto, Canada
On site
22-01-2026