Job Specifications
Choose a workplace that empowers your impact.
Join a global workplace where employees thrive. One that embraces diversity of thought, expertise and experience. A place where you can personalize your employee journey to be -- and deliver -- your best.
We are a purpose-driven, dynamic and sustainable pension plan. An industry-leading global investor with teams from Toronto to London, New York, Singapore, Sydney and other major cities across North America and Europe. We embody the values of our 600,000+ members, placing their best interests at the heart of everything we do.
Join us to accelerate your growth & development, prioritize wellness, build connections, and support the communities where we live and work.
Don't just work anywhere -- come build tomorrow together with us.
Know someone at OMERS or Oxford Properties? Great! If you're referred, have them submit your name through Workday first. Then, watch for a unique link in your email to apply.
We are seeking a highly skilled and motivated BI Engineer (Power BI Developer) to join our BI Platform team in Toronto. The ideal candidate will have at least 5 years of hands-on experience in data analysis, leveraging tools such as Power BI, Microsoft Fabric Notebooks (Python), Lakehouse architecture, data warehousing concepts, and Gen2 Dataflows to deliver advanced analytics, develop insightful dashboards, and create impactful reports that drive data-informed decisions. Beyond this toolset, the ideal candidate will have experience using GitHub CI/CD pipelines to promote Power BI reports, along with ALM Toolkit and Tabular Editor.
This is a fantastic opportunity for a BI Engineer to work with cutting-edge data analytical tools while being part of a dynamic BI Platform team that fosters collaboration, encourages innovation, and supports professional growth.
As a member of this team, you will be responsible for:
Data Collection & Integration
Ingest data from databases, APIs, and external systems into the Microsoft Fabric Lakehouse using Gen2 Dataflows (Power Query) for standard connectors and Fabric Notebooks (Python) for API orchestration and complex logic; a minimal notebook sketch follows this list.
Design and maintain medallion (bronze/silver/gold) layers in the Lakehouse with Delta/Parquet storage; model conformed dimensions and facts per data warehousing best practices.
Implement incremental refresh/CDC where applicable and wire up Power BI semantic models (Import/Direct Lake) to curated Lakehouse/DWH tables for high-performance analytics.
Ensure lineage and discoverability by publishing certified datasets and documenting data contracts and dependencies across dataflows, notebooks, and models.
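For illustration only, here is a minimal sketch of what such a notebook-based ingestion step might look like in a Fabric notebook. The REST endpoint, table name, and schema are placeholders, not systems referenced in this posting.

    # Minimal Fabric notebook sketch: land a hypothetical REST payload in a bronze Delta table.
    # Endpoint, table name, and schema are placeholders for illustration only.
    import requests
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # provided automatically in Fabric notebooks

    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()
    records = resp.json()  # list of JSON records from the source system

    # The bronze layer keeps the payload as-is; cleansing and conforming happen in silver/gold.
    raw_df = spark.createDataFrame(records)
    raw_df.write.format("delta").mode("append").saveAsTable("bronze_orders")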
Data Cleaning and Preprocessing
Clean and preprocess raw data to remove errors, inconsistencies, or duplicate entries.
Transform data into a structured format suitable for analysis, applying necessary formatting or aggregation techniques.
Validate data to ensure its accuracy and integrity before performing analysis; a brief illustrative sketch of these steps follows.
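As an illustration of these cleaning and validation steps, the short sketch below works over a hypothetical raw extract; the file and column names are assumptions, not actual sources.

    # Illustrative cleaning and validation pass over a hypothetical raw extract.
    import pandas as pd

    df = pd.read_parquet("sales_raw.parquet")  # placeholder raw extract

    # Remove exact duplicates and rows missing the business key.
    df = df.drop_duplicates().dropna(subset=["order_id"])

    # Normalize types and formats before any aggregation.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

    # Simple integrity gate before the data moves on to analysis.
    assert df["order_id"].is_unique, "Duplicate order_id values remain"
    assert df["amount"].notna().all(), "Non-numeric amounts found"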
Data Analysis & Interpretation
Analyze data to identify trends, patterns, and relationships that can provide insights.
Use statistical methods and predictive models to address business problems and support decision-making.
Perform descriptive and exploratory analysis to understand key metrics and KPIs; a short exploratory sketch follows this list.
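For example, a descriptive and exploratory pass of this kind might look like the following; the table and column names are illustrative assumptions only.

    # Illustrative exploratory pass over a hypothetical curated sales table.
    import pandas as pd

    sales = pd.read_parquet("gold_sales.parquet")  # placeholder gold-layer extract
    sales["order_date"] = pd.to_datetime(sales["order_date"])

    # Headline metric: distribution of the KPI.
    print(sales["amount"].describe())

    # Trend view: monthly totals to surface seasonality or level shifts.
    monthly = sales.groupby(sales["order_date"].dt.to_period("M"))["amount"].sum()
    print(monthly.tail(12))

    # Relationship check between the KPI and a candidate driver.
    print(sales[["amount", "discount_pct"]].corr())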
Reporting & Visualization
Create reports, dashboards, and visualizations to communicate data insights clearly and effectively using Power BI.
Automate recurring reports to improve efficiency and ensure timely delivery of insights.
Present findings to stakeholders with clear, actionable insights through reports and presentations.
Collaboration & Communication
Partner with business, product, and engineering teams to translate requirements into Power BI semantic models, curated datasets, and well-defined KPIs.
Use GitHub CI/CD pipelines to version and promote Power BI reports, datasets, dataflows, and notebooks; participate in PR reviews and enforce branching standards (a minimal promotion sketch follows this list).
Leverage ALM Toolkit for model diff/deployment and Tabular Editor for scripting, calculation groups, and documentation to keep environments consistent across dev/test/prod.
Present insights and trade-offs clearly to non-technical stakeholders; maintain concise runbooks and wikis for self-serve analytics and operational support.
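As a sketch of the promotion step such a pipeline might run, the example below publishes a report through the Power BI REST Imports API. The workspace ID, file path, and token handling are placeholders; a real pipeline would authenticate with a service principal.

    # Illustrative CI/CD promotion step: publish a .pbix to a target workspace via the
    # Power BI REST Imports API. IDs, paths, and token handling are placeholders.
    import os
    import requests

    workspace_id = os.environ["PBI_WORKSPACE_ID"]   # target (test/prod) workspace
    access_token = os.environ["PBI_ACCESS_TOKEN"]   # assumed to be injected by the pipeline
    pbix_path = "reports/finance_dashboard.pbix"    # hypothetical report tracked in the repo

    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/imports"
           "?datasetDisplayName=finance_dashboard&nameConflict=CreateOrOverwrite")

    with open(pbix_path, "rb") as pbix:
        response = requests.post(url,
                                 headers={"Authorization": f"Bearer {access_token}"},
                                 files={"file": pbix})
    response.raise_for_status()
    print("Import accepted:", response.json().get("id"))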
Continuous Learning and Improvement
Identify opportunities to improve existing data processes and analytical techniques.
Keep up with the latest trends and best practices in data analysis, visualization, and tools.
Experiment with new methods and tools to enhance the quality and speed of data analysis.
Data Governance & Quality Assurance
Ensure data security, privacy, and compliance with regulations during the collection and analysis process.
Implement processes for quality control to ensure the accuracy and reliability of data.
Monitor and audit data sources for consistency and potential issues over time.
Ad-Hoc SQL Query Development & Analysis
Develop SQL queries and conduct ad-hoc analysis to answer business questions and support data-driven decisions; a brief query sketch follows.
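A brief sketch of such an ad-hoc query, run from a Fabric notebook against a hypothetical gold-layer table; the table and column names are illustrative only.

    # Illustrative ad-hoc query from a Fabric notebook; table and columns are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # provided automatically in Fabric notebooks

    result = spark.sql("""
        SELECT region,
               SUM(amount)              AS total_sales,
               COUNT(DISTINCT order_id) AS orders
        FROM gold_sales
        WHERE order_date >= DATE '2024-01-01'
        GROUP BY region
        ORDER BY total_sales DESC
    """)
    result.show()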
About the Company
Founded in 1962, OMERS is one of Canada's largest defined benefit pension plans, with $138.2 billion (CAD) in net assets as at December 31, 2024. With employees in our offices in Toronto, London, New York, Amsterdam, Luxembourg, Singapore, Sydney and other major cities across North America and Europe, OMERS invests and administers pensions for over half a million active, deferred and retired employees of 1,000 municipalities, school boards, libraries, police and fire departments, and other local agencies in communities across Ontario.