
Summary of Amortized SHAP Values via Sparse Fourier Function Approximation, by Ali Gorji et al.


Amortized SHAP values via sparse Fourier function approximation

by Ali Gorji, Andisheh Amrollahi, Andreas Krause

First submitted to arXiv on: 8 Oct 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (paper authors)
The high difficulty version is the paper's original abstract, available on arXiv.

Medium Difficulty Summary (GrooveSquid.com, original content)
The paper tackles the challenge of efficiently computing SHAP values, a popular method for local feature attribution in interpretable AI. The authors propose a two-stage approach to estimate SHAP values in both model-agnostic (black-box) and tree-based settings. For black-box models, they first leverage recent results on spectral bias to approximate the model with a compact, sparse Fourier representation. In the second stage, they use this representation to compute SHAP values exactly, which is efficient thanks to a closed-form expression for the SHAP values of Fourier basis functions. This amortizes the cost of computing SHAP values across queries and exposes a continuous trade-off between computation and accuracy through the sparsity of the representation. The authors demonstrate speedups over baseline methods at equal accuracy in both the tree-based and black-box settings.
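The closed-form step can be illustrated with a small sketch. Under assumptions that go beyond what this summary states — binary features, an independent uniform background distribution, and the standard interventional value function — the SHAP value of a single parity (Walsh-Hadamard) basis function splits equally among the features it involves, and SHAP's linearity in the model lets a sparse Fourier approximation be handled term by term. The function names and the toy model below are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical sparse Fourier (Walsh-Hadamard) representation of a model on
# binary inputs: f(x) = sum_S coeffs[S] * (-1)^(sum_{i in S} x_i).
# Assumption (not from the paper text): interventional SHAP with an
# independent, uniform background over {0,1}^n. Under that assumption the
# SHAP value of a single parity term chi_S splits equally among the features
# in S, and linearity of SHAP in the model makes each term's contribution
# additive, so a sparse representation yields exact values in O(#terms).

def parity(x, S):
    """chi_S(x) = (-1)^(sum of x_i for i in S); the empty set gives 1."""
    return (-1) ** int(sum(x[i] for i in S))

def shap_from_fourier(coeffs, x):
    """coeffs: dict mapping frozenset S -> Fourier coefficient a_S.
    Returns the SHAP value of every feature for the input x."""
    n = len(x)
    phi = np.zeros(n)
    for S, a in coeffs.items():
        if not S:                              # constant term attributes nothing
            continue
        contrib = a * parity(x, S) / len(S)    # equal split within S
        for i in S:
            phi[i] += contrib
    return phi

# Toy model with 3 binary features: f(x) = 0.5 - chi_{0}(x) + 2 * chi_{0,2}(x)
coeffs = {frozenset(): 0.5, frozenset({0}): -1.0, frozenset({0, 2}): 2.0}
x = np.array([1, 0, 1])
phi = shap_from_fourier(coeffs, x)

# Efficiency check: SHAP values sum to f(x) - E[f] = f(x) - a_emptyset.
f_x = sum(a * parity(x, S) for S, a in coeffs.items())
assert np.isclose(phi.sum(), f_x - coeffs[frozenset()])
print(phi)
```

Because each query only loops over the recovered Fourier terms, the expensive work (finding the sparse representation) is paid once and amortized over all subsequent SHAP queries, which is the trade-off the summary describes.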
Low Difficulty Summary (GrooveSquid.com, original content)
Imagine you have a machine learning model that can make predictions, but you want to know which features or characteristics matter most for those predictions. SHAP values help with this by telling you how much each feature contributed to a specific prediction. The problem is that computing these values exactly can be slow and difficult. This paper proposes a new way to compute SHAP values that is faster and more efficient. The authors use a combination of mathematical techniques and specialized algorithms to compute these values quickly while keeping them accurate. They show that their method works well for both tree-based models and complex black-box models, and that it lets users adjust the trade-off between speed and accuracy.

Keywords

  • Artificial intelligence
  • Machine learning