CCAIM

Novel AI to transform healthcare


Interpretability Suite

24 April 2023 by Andreas Bedorf

A comprehensive collection of machine learning (ML) interpretability methods that helps users select the method best suited to their needs. It provides insight into ML model predictions through a curated organisation of methods, a common Python interface, and a visual user interface.

[Figure: Interpretability Method selection flowchart]

How is it unique?

Interpretability Suite stands out by organising ML interpretability methods, both from the van der Schaar Lab and third-party sources, into tables and flow diagrams according to their use case and the type of explanation they produce. It supports tabular datasets, time-series data, unsupervised model explainers, and individualised treatment effect explainers, and provides both a Python interface and a visual user interface for implementation and analysis.
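Interpretability Suite's own API is documented on its GitHub page; as an illustrative stand-in, the sketch below shows the kind of tabular feature-importance analysis such methods produce, using scikit-learn's `permutation_importance` on a synthetic dataset. Neither the dataset nor the choice of method comes from the Suite itself.

```python
# Illustrative tabular interpretability example. This uses
# scikit-learn's permutation_importance as a stand-in for the kinds
# of explainers the Suite catalogues; Interpretability Suite's own
# API may differ.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Small synthetic tabular dataset: 2 informative features out of 5.
X, y = make_classification(n_samples=200, n_features=5,
                           n_informative=2, n_redundant=0,
                           random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

# Permutation importance: how much does shuffling each feature
# degrade the model's score?
result = permutation_importance(model, X, y, n_repeats=10,
                                random_state=0)
for i, imp in enumerate(result.importances_mean):
    print(f"feature {i}: {imp:.3f}")
```

A global, model-agnostic score like this is one end of the spectrum the Suite covers; other catalogued methods explain individual predictions instead.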

How is it useful?

Interpretability Suite can, for instance:

1. Enhance transparency and trust in ML models by providing clear explanations for model predictions, allowing stakeholders to better understand the reasoning behind critical decisions.

2. Improve collaboration between teams by offering a visual user interface that enables non-technical team members to view and analyse explanations without Python knowledge.

3. Accelerate the implementation of interpretability methods by providing a standardised Python interface, streamlining the integration of explainers into existing projects.

4. Foster informed decision-making by guiding users to the most appropriate interpretability methods for their specific use case and data type.
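The "standardised Python interface" idea in point 3 can be sketched as a minimal abstract base class: every explainer exposes the same method, so callers can swap techniques without changing surrounding code. The class and method names below are hypothetical illustrations, not Interpretability Suite's actual API, and the two "methods" are deliberately trivial.

```python
# Hypothetical sketch of a common explainer interface. Names are
# illustrative only; Interpretability Suite's real classes differ.
from abc import ABC, abstractmethod
import numpy as np

class Explainer(ABC):
    @abstractmethod
    def explain(self, X: np.ndarray) -> np.ndarray:
        """Return one importance score per feature."""

class VarianceExplainer(Explainer):
    # Toy method: score each feature by its variance.
    def explain(self, X):
        return X.var(axis=0)

class RangeExplainer(Explainer):
    # Toy method: score each feature by its value range.
    def explain(self, X):
        return X.max(axis=0) - X.min(axis=0)

def report(explainer: Explainer, X: np.ndarray) -> np.ndarray:
    # Caller code is identical whichever concrete method is plugged in.
    return explainer.explain(X)

X = np.array([[0.0, 1.0], [2.0, 3.0], [4.0, 5.0]])
print(report(VarianceExplainer(), X))  # per-feature variance
print(report(RangeExplainer(), X))     # per-feature range
```

This is the design choice that lets a visual front end sit on top of many explainers: the UI only ever calls the shared interface.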

With Interpretability Suite, users and teams can ensure that ML models are more transparent, understandable, and trustworthy, fostering collaboration and informed decision-making across the industry.

GitHub
User Interface
Documentation
Introductory Video
Tutorials
Category: Impact, News, Research, Software

Copyright © 2023 CCAIM

