Next Horizon is here. Fueled by investments in talent and technology, our bold strategy to transform is nearly complete.
At Gore Mutual, we've always set ourselves apart as a modern mutual that does good. Now, we're proudly building on that legacy to transform our company—and our industry—for the better.
Our path forward sharpens our focus on business performance, driven by leading technology, innovation and an agile, high-performing culture. With Gore Mutual and Beneva announcing their intent to merge in 2026, we'll be uniting two well-established, financially strong, and trusted brands to become the strongest mutual insurer in Canada, ensuring Canadians have purpose-driven insurance options for generations to come. Come join us.
Our Data Science practice at Gore Mutual Insurance is expanding to accelerate our move towards becoming a truly data-driven and digitally led company. To continue our journey forward, we are looking for a Senior Data Scientist to help us drive value from data through the development of ML and AI algorithms, models and pipelines. This role will be responsible for developing efficient, reusable pipelines to ingest data, construct features, develop algorithms, and deploy models that drive value from data.
What will you do?
Generate value through algorithm / model driven insight from data
- Understanding of different algorithm types and their applications (supervised classification, regression, unsupervised learning, reinforcement learning, etc.)
- Understanding of different modelling architectures and techniques, and their strengths and weaknesses (e.g., gradient boosting, clustering, SHAP, LLMs, autoencoders)
- Experience in the design and practical application of AI algorithms in a business setting, working with business users to ensure that models are built appropriately and applied effectively
- Understanding of different optimization techniques (linear optimization, integer optimization, dynamic programming, etc.)
- Experience in the design and practical application of optimization solutions in a business setting under given constraints (a brief illustrative sketch follows this list)
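To give a flavour of this kind of optimization work, here is a minimal sketch of a small linear program solved with SciPy. The budget-allocation scenario, constraints and coefficients are purely illustrative assumptions, not an actual Gore Mutual model.

```python
# Minimal linear-programming sketch: allocate spend across two channels to
# maximize expected conversions subject to a total budget and a channel cap.
# All numbers below are made-up illustrative assumptions.
from scipy.optimize import linprog

# Maximize 0.03*x1 + 0.05*x2  ->  linprog minimizes, so negate the objective
objective = [-0.03, -0.05]

# Constraints: x1 + x2 <= 100_000 (total budget), x2 <= 40_000 (channel cap)
A_ub = [[1, 1], [0, 1]]
b_ub = [100_000, 40_000]

result = linprog(c=objective, A_ub=A_ub, b_ub=b_ub,
                 bounds=[(0, None), (0, None)])
print(result.x, -result.fun)  # optimal spend per channel, expected conversions
```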
Generate value through data ingestion, model deployment and validation
- Pull data from various systems using SQL, PySpark and other standard languages for relational and distributed databases, and set up data ingestion pipelines to third-party sources via APIs, etc.
- Work with engineering partners to construct automated data pipelines for continuous delivery of data to models
- Deploy machine learning models into production environments (e.g., implementing continuous integration and delivery (CI/CD) pipelines for automated model deployment, and applying MLOps practices to maintain the lifecycle of machine learning models through testing and validation); see the sketch after this list
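As an illustration of this deployment workflow, below is a minimal sketch of logging and registering a trained model with MLflow so it can be promoted through a CI/CD pipeline. The dataset, experiment name and registered model name are hypothetical placeholders, and a configured MLflow tracking server with a model registry is assumed.

```python
# Minimal MLflow sketch: train a toy model, then log and register it so a
# CI/CD pipeline can pick it up for validation and promotion.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy training data in place of a real feature pipeline
X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
model = GradientBoostingClassifier().fit(X, y)

mlflow.set_experiment("claims-propensity-demo")  # hypothetical experiment name
with mlflow.start_run():
    mlflow.log_param("n_estimators", model.n_estimators)
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="claims_propensity",  # hypothetical registry name
    )
```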
Generate value through enhanced feature engineering of data
- Work with Ops-based systems for feature stores and automation of feature development (MLflow, Databricks, etc.)
- Understand feature transformation techniques to extract maximal value from data with respect to different algorithm types (a short example follows this list)
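As a short example of feature transformation tailored to an algorithm type, here is a minimal scikit-learn sketch that scales numeric fields and one-hot encodes categoricals ahead of a linear model. The column names are hypothetical placeholders rather than actual data fields.

```python
# Minimal feature-engineering sketch: preprocess numeric and categorical
# columns in one reusable pipeline before model fitting.
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

numeric_cols = ["vehicle_age", "annual_mileage"]   # hypothetical fields
categorical_cols = ["province", "vehicle_use"]     # hypothetical fields

preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric_cols),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_cols),
])

pipeline = Pipeline([
    ("features", preprocess),
    ("model", LogisticRegression(max_iter=1000)),
])
# pipeline.fit(training_df[numeric_cols + categorical_cols], training_df["had_claim"])
```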
Generate value through the integration of machine learning and AI algorithms into core pricing and underwriting functions
- Understand approaches in actuarial science for loss and pricing estimation (e.g., GLMs, GAMs, Tweedie distributions, link functions), as illustrated in the sketch after this list
- Understand pricing, reserving and regulatory functions in property and casualty insurance
- Understand and have experience with actuarial software tools that support decision-making processes within insurance (e.g., Earnix and Guidewire)
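As an illustration of the actuarial modelling concepts above, here is a minimal sketch of a pure-premium GLM with a Tweedie error distribution and log link, fitted with statsmodels on synthetic data. The rating factors and the generated losses are made up for illustration only.

```python
# Minimal Tweedie GLM sketch: model a loss-cost target that has a point mass
# at zero (no claim) and a continuous right tail (claim severity).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "driver_age": rng.integers(18, 80, size=500),    # hypothetical rating factor
    "vehicle_age": rng.integers(0, 20, size=500),    # hypothetical rating factor
})
# Synthetic loss cost: ~70% zeros, gamma-distributed severity otherwise
df["loss_cost"] = np.where(rng.random(500) < 0.7, 0.0,
                           rng.gamma(2.0, 500.0, size=500))

X = sm.add_constant(df[["driver_age", "vehicle_age"]])
tweedie = sm.families.Tweedie(var_power=1.5)  # log link is the default
glm = sm.GLM(df["loss_cost"], X, family=tweedie).fit()
print(glm.summary())
```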
Interact with business stakeholders to ensure value and validity of proposed solutions
- Communication with business stakeholders to understand requirements
- Understand the validity of algorithmic and optimization solutions with respect to business constraints and value
- Clear communication, both written and oral, for dissemination of results to business stakeholders
What will you need to succeed in this role?
- Master's degree in a technical discipline or equivalent (PhD preferred but not required)
- 3 to 5 years in an ML/AI development role
- Strong understanding of the business context for advanced models, with experience deploying models that are used to generate value; deployment and utilization are key experience
- Strong coding experience with Python and familiarity with machine learning packages and libraries (e.g., TensorFlow/Keras, Scikit-Learn, PyTorch/FastAI)
- Familiarity with cloud technologies (e.g., Azure, MLflow, AWS)
- Broad understanding of ML architectures (e.g., GANs, LLMs, reinforcement learning)
- Strong communication skills to effectively collaborate and present insights with other team members
- Experience leveraging visualization technologies to interpret complex data, create insightful dashboards, and present findings in a clear and impactful manner (e.g., Power BI, matplotlib, seaborn, plotly, ggplot, geoplotlib)
- Experience with the deployment of machine learning models into production environments (e.g., implementing continuous integration and delivery (CI/CD) pipelines for automated model deployment, applying MLOps practices to maintain the lifecycle of machine learning models, model monitoring, and model performance metrics)
- Strong understanding of software engineering and computer science concepts
- Hands-on experience with pricing and underwriting model development and implementation an asset
- Literacy in modern financial theory (e.g., risk, pricing, portfolio construction) and/or insurance modelling an asset
#LI-Hybrid
#IndHP
Gore Mutual Insurance is committed to providing accommodations for people with disabilities during all phases of the recruiting process, including the application process. If you require accommodation because of a disability, we will work with you to meet your needs. If you are selected for an interview and require accommodation, please advise the HR representative who will consult with you to determine an appropriate accommodation.