ShyftLabs

shyftlabs.io

1 Job

141 Employees

About the Company

At ShyftLabs, we build data products that help enterprises deliver real impact through tailored data analytics, data science, and AI solutions. From consulting to operations, we guide our customers through their data journey and ensure they are empowered by data and AI.

Listed Jobs

Company Name
ShyftLabs
Job Title
AI Automation Engineer
Job Description
**Job Title:** AI Automation Engineer

**Role Summary:** Design, develop, and deploy AI‑driven automation solutions that improve operational efficiency and decision‑making. Leverage large language models (LLMs), OCR, and NLP to process unstructured data, automate workflows, and provide actionable insights across business functions.

**Expectations:**
- Deliver scalable AI pipelines using Python (FastAPI, LangChain, spaCy) or Node.js.
- Build and maintain OCR‑based extraction and semantic search systems.
- Deploy reliable inference services on GCP, AWS, or Docker/Kubernetes.
- Monitor model performance and trigger retraining as needed.
- Collaborate with data, DevOps, and product teams to meet business and compliance goals.

**Key Responsibilities:**
- Design and implement end‑to‑end AI automation pipelines.
- Develop OCR extraction workflows for invoices, receipts, and other documents.
- Create custom AI agents for reconciliation, tagging, and anomaly detection.
- Fine‑tune LLMs for document understanding, summarization, and entity matching.
- Deploy and orchestrate services on cloud platforms (Vertex AI, AWS Lambda) or containerized environments.
- Integrate AI outputs into dashboards, APIs, or automation tools for user consumption.
- Set up automated evaluation frameworks to monitor data drift and model degradation.
- Work cross‑functionally to align solutions with business objectives and regulatory standards.

**Required Skills:**
- Proficiency in Python or Node.js; experience with FastAPI, LangChain, and spaCy.
- Strong knowledge of OCR, NLP, semantic search, and LLM‑based automation.
- Hands‑on experience deploying on AWS, GCP, Docker, or Kubernetes.
- Familiarity with vector databases (e.g., Pinecone, FAISS, Weaviate) and prompt orchestration.
- Ability to translate business requirements into scalable technical designs.
- Excellent problem‑solving and communication skills.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Relevant certifications (e.g., AWS Certified Solutions Architect, Google Cloud Professional Data Engineer) are a plus but not mandatory.
Toronto, Canada
Hybrid
06-03-2026