Responsible AI integration for Jupyter Notebooks

Jupyter Notebooks can be extended in many ways, as a wide range of tools and widgets is available. Since the fields of Explainable AI and Responsible AI are growing fast, several tools focusing specifically on these areas have already been developed. In the following sections we introduce three of the more popular ones.

In 2024 we investigated how popular these tools are, measured by research interest on Google Scholar, and found the following:

  • explainerdashboard – 50 papers (mainly 2022–23; comparatively less researched and cited)
  • shapash – 230 papers (appears to be widely used)
  • dalex – 340 papers (appears to be widely used)

1) Explainerdashboard

Link: https://explainerdashboard.readthedocs.io/en/latest/

Explainerdashboard is a Python library that can be used to build interactive dashboards with which AI predictions can be analyzed and explained. It works with scikit-learn machine learning models and scikit-learn-compatible libraries such as XGBoost. Popular Explainable AI methods such as SHAP values can be computed and displayed. A complete dashboard can be created; alternatively, InlineExplainer can be used to integrate individual dashboard components directly into the notebook as output, as in the sketch below.
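The following is a minimal sketch of how such a dashboard might be launched from a notebook. The class names ClassifierExplainer, ExplainerDashboard, and InlineExplainer come from the explainerdashboard documentation linked above; the dataset and model here are illustrative assumptions, and details may vary between library versions.

    # Minimal sketch: explainerdashboard around a scikit-learn classifier
    # (assumes explainerdashboard is installed; dataset/model are examples).
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from explainerdashboard import ClassifierExplainer, ExplainerDashboard, InlineExplainer

    # Train a scikit-learn-compatible model on an example dataset.
    data = load_breast_cancer(as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(
        data.data, data.target, random_state=0)
    model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

    # Wrap the model and test data in an explainer object
    # (this computes SHAP values and other explanations).
    explainer = ClassifierExplainer(model, X_test, y_test)

    # Option 1: show individual components inline as notebook output,
    # e.g. the feature importances tab.
    InlineExplainer(explainer).tab.importances()

    # Option 2: launch the complete interactive dashboard.
    ExplainerDashboard(explainer).run()

The InlineExplainer variant keeps the analysis inside the notebook, while ExplainerDashboard(...).run() serves the full dashboard as a separate web app.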

2) Shapash

Link: https://github.com/MAIF/shapash

With shapash, interactive dashboards and plots can be created to make machine learning models more interpretable, with a focus on both local and global explainability. A short sketch follows below.
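The sketch below shows one plausible way to use shapash with a scikit-learn regressor. The SmartExplainer class and its compile(), plot, and run_app() members follow the shapash README linked above; the dataset and model are illustrative assumptions, and the exact import path may differ between shapash versions.

    # Minimal sketch: local and global explanations with shapash
    # (assumes shapash is installed; dataset/model are examples).
    from sklearn.datasets import load_diabetes
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from shapash import SmartExplainer

    data = load_diabetes(as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(
        data.data, data.target, random_state=0)
    model = RandomForestRegressor(n_estimators=100).fit(X_train, y_train)

    # Wrap the trained model; compile() computes the explanations for X_test.
    xpl = SmartExplainer(model=model)
    xpl.compile(x=X_test)

    # Global explainability: overall feature importances.
    xpl.plot.features_importance()

    # Local explainability: contributions for a single prediction.
    xpl.plot.local_plot(index=X_test.index[0])

    # Interactive web app (dashboard) served from the notebook.
    app = xpl.run_app()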

3) Dalex

Link: https://pypi.org/project/dalex/

Dalex can be used to explain a model's behavior and thereby better understand black-box models. An Explainer object is wrapped around a fitted predictive model, which can then be investigated. Interactive dashboards can be created and fairness methods applied, for example (see https://dalex.drwhy.ai/python-dalex-fairness.html and the sketch below).
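Below is a minimal sketch of wrapping a model in a dalex Explainer and running a fairness check. The dataset loader dx.datasets.load_german(), the column names "risk" and "sex", and the model_fairness() parameters follow the fairness tutorial linked above, but they are assumptions here and may differ between dalex versions.

    # Minimal sketch: dalex Explainer plus a fairness check
    # (assumes dalex is installed; dataset and column names are assumptions).
    import dalex as dx
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    # German credit data shipped with dalex; 'risk' is the target,
    # 'sex' serves as the protected attribute.
    data = dx.datasets.load_german()
    X = data.drop(columns="risk")
    y = data.risk

    # One-hot encode categorical features so a plain random forest can handle them.
    X_encoded = pd.get_dummies(X)
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_encoded, y)

    # Wrap the fitted model in a dalex Explainer.
    exp = dx.Explainer(model, X_encoded, y, label="random forest")

    # Global behavior: permutation-based variable importance.
    exp.model_parts().plot()

    # Fairness check with respect to the protected attribute 'sex'.
    fobject = exp.model_fairness(protected=X.sex, privileged="male")
    fobject.fairness_check()
    fobject.plot()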
