Summary of Graph Neural Networks on Discriminative Graphs of Words, by Yassine Abbahaddou et al.
Graph Neural Networks on Discriminative Graphs of Words
by Yassine Abbahaddou, Johannes F. Lutzeyer, Michalis Vazirgiannis
First submitted to arXiv on: 27 Oct 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Computation and Language (cs.CL)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary
---|---|---
High | Paper authors | Read the paper's original abstract on arXiv.
Medium | GrooveSquid.com (original content) | The proposed Discriminative Graph of Words Graph Neural Network (DGoW-GNN) approach introduces a novel discriminative graph construction and model for text classification. Unlike traditional methods that build heterogeneous graphs mixing word and document nodes, DGoW-GNN constructs a graph containing only word nodes, splitting the training corpus into disconnected subgraphs according to their labels and weighting edges by the pointwise mutual information of the represented words (sketched in code below the table). This reformulates text classification as walk classification. The model combines a GNN with a sequence model and is evaluated on seven benchmark datasets, where it is outperformed by several state-of-the-art baseline models.
Low | GrooveSquid.com (original content) | A new way to classify text uses a graph neural network called DGoW-GNN. Unlike other methods that mix words and documents in one graph, DGoW-GNN only looks at word nodes. It splits the training data into separate groups based on their labels, then connects words with edges that show how strongly they appear together. This lets it reformulate text classification as a new problem called walk classification. The approach combines two types of models: one for graphs and one for sequences. It is tested on seven datasets, where several state-of-the-art models still do better, leaving room for improvement.
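The graph construction summarized in the table lends itself to a short illustration. The snippet below is a minimal sketch, assuming a sliding-window definition of word co-occurrence and a positive-PMI edge filter; the function name `build_dgow`, the window size, and that filter are illustrative choices, not the authors' actual code or hyperparameters.

```python
# Minimal sketch: one disconnected word-cooccurrence subgraph per class label,
# with edges weighted by pointwise mutual information (PMI).
from collections import Counter
from itertools import combinations
import math
import networkx as nx

def build_dgow(corpus, labels, window_size=3):
    """corpus: list of token lists; labels: class label per document."""
    graph = nx.Graph()
    for label in set(labels):
        docs = [doc for doc, y in zip(corpus, labels) if y == label]
        word_counts, pair_counts, n_windows = Counter(), Counter(), 0
        for tokens in docs:
            # Slide a fixed-size window over each document of this class.
            for i in range(max(1, len(tokens) - window_size + 1)):
                window = set(tokens[i:i + window_size])
                n_windows += 1
                word_counts.update(window)
                pair_counts.update(frozenset(p) for p in combinations(sorted(window), 2))
        for pair, c_ij in pair_counts.items():
            w1, w2 = tuple(pair)
            # PMI = log( p(w1, w2) / (p(w1) * p(w2)) ) estimated from window counts.
            pmi = math.log((c_ij * n_windows) / (word_counts[w1] * word_counts[w2]))
            if pmi > 0:  # keep positively associated pairs (an assumption; the paper may differ)
                # Nodes are (label, word) pairs so subgraphs of different classes stay disconnected.
                graph.add_edge((label, w1), (label, w2), weight=pmi)
    return graph

# Usage: each document then corresponds to a walk over the word nodes of its class's
# subgraph, so classifying a text amounts to deciding which subgraph its walk belongs to.
corpus = [["the", "team", "won", "the", "match"], ["stocks", "fell", "on", "monday"]]
labels = ["sports", "finance"]
G = build_dgow(corpus, labels)
print(G.number_of_nodes(), G.number_of_edges())
```

In this sketch the per-label node keys are what keep the subgraphs disconnected; the paper's combination of a GNN with a sequence model then operates on walks over these subgraphs, which is not shown here.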
Keywords
» Artificial intelligence » Classification » GNN » Graph neural network » Sequence model » Text classification