Job Specifications
Platform Architect
Toronto, ON
Certifications in cloud platforms (AWS/Azure/GCP Data Architect) or data technologies (Snowflake, Databricks). Experience in machine learning platforms, data science workflows, or IoT data architectures. Background in data security and compliance implementations. Exposure to business intelligence tools (e.g., Power BI, Tableau, Looker).
Skills Required: Data Architecture and Modelling
Job Description: Design scalable, secure, and high-performance data architectures to support enterprise data platforms, data lakes, data warehouses, and real-time data processing.
Lead the design and delivery of data integration and analytics solutions, ensuring alignment with business needs and enterprise architecture.
Define data modelling standards, data flows, and metadata frameworks that ensure data quality, consistency, and compliance.
Collaborate with data engineers, analysts, business stakeholders, and IT teams to translate business requirements into robust data solutions.
Architect solutions across structured and unstructured data sources, integrating with on-premises, cloud, and hybrid environments.
Develop and enforce best practices for data security, privacy, and compliance (e.g., GDPR, HIPAA).
Work with DevOps and engineering teams to implement data pipelines, APIs, and automation frameworks.
Evaluate emerging data technologies and tools, and make recommendations for adoption.
Document data architecture decisions, roadmaps, and governance models.
Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
7+ years of experience in data architecture, data engineering, or BI/analytics, including 2+ years in a data solution architect role.
Strong experience with data platforms such as:
Cloud: AWS (e.g., Redshift, Glue), Azure (e.g., Synapse, Data Factory), or GCP (e.g., BigQuery, Dataflow)
Warehousing: Snowflake, Databricks, or traditional EDWs (e.g., Teradata, Oracle)
ETL/ELT Tools: Informatica, Talend, Apache NiFi, dbt
Proficiency in SQL, data modelling (star/snowflake schemas), and data pipeline orchestration.
Familiarity with data governance tools and frameworks (e.g., Collibra, Alation, Apache Atlas).
Knowledge of data API integration, streaming technologies (Kafka, Spark Streaming), and batch/real-time processing.