Interpreting Hugging Face transformers with SHAP
In this section, we will interpret Hugging Face transformers with SHAP. The Hugging Face platform provides an interface for an impressive list of transformer models. The section is divided into two parts:
- Introducing SHAP
- Explaining Hugging Face outputs with SHAP
Introducing SHAP
In game theory, a Shapley value expresses how the total value of a game is distributed among its "players" through their marginal contributions. When explaining a sentence, the words are the "players." Each word receives a score, and the total score is the value of the game. The value of each word is computed over all the permutations of the sentence. The goal is to see how each word changes the meaning of the sentence.

For example, there are seven words in the following sentence:

"I love playing chess with my friends"

The total number of permutations is 7! = 7x6x5x4x3x2x1 = 5,040.

The immediate conclusion is that SHAP will be challenging for a long text. However...
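The permutation idea above can be sketched in a few lines of plain Python. The value function `v()` below is a toy stand-in for a model's output score (it is an assumption for illustration, not how the SHAP library scores text): it rewards the word "love" and the pair "chess"/"friends" appearing together. Each word's Shapley value is its marginal contribution averaged over all 7! orderings:

```python
# A minimal sketch of the Shapley value computation described above.
# v() is a hypothetical value function standing in for a model's score.
from itertools import permutations

words = "I love playing chess with my friends".split()

def v(coalition):
    score = 0.0
    if "love" in coalition:
        score += 1.0  # "love" carries sentiment on its own
    if "chess" in coalition and "friends" in coalition:
        score += 0.5  # a toy interaction between two words
    return score

# Average each word's marginal contribution over all 7! = 5,040 orderings.
totals = {w: 0.0 for w in words}
orderings = list(permutations(words))
for order in orderings:
    seen = set()
    for w in order:
        before = v(seen)
        seen.add(w)
        totals[w] += v(seen) - before

shapley = {w: totals[w] / len(orderings) for w in words}
print(len(orderings))  # 5040
for w, s in shapley.items():
    print(f"{w}: {s:.3f}")
```

Note that the interaction value of 0.5 is split equally between "chess" and "friends" (0.25 each), and that the Shapley values sum to the value of the full sentence, 1.5. Enumerating all 5,040 orderings is already noticeable for seven words, which is exactly why SHAP becomes challenging for long texts.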