
Summary of Enabling Tensor Decomposition for Time-Series Classification via a Simple Pseudo-Laplacian Contrast, by Man Li et al.


Enabling Tensor Decomposition for Time-Series Classification via A Simple Pseudo-Laplacian Contrast

by Man Li, Ziyue Li, Lijun Sun, Fugee Tsung

First submitted to arXiv on: 23 Sep 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)

The high difficulty version is the paper's original abstract, which can be read on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
In this paper, the researchers propose Pseudo-Laplacian Contrast (PLC) tensor decomposition, a novel framework that extracts class-aware representations from data while preserving the intrinsic low-rank structure. The authors argue that traditional tensor decomposition methods are ill-suited to classification because of their non-uniqueness and rotation invariance. They instead introduce a simple graph-Laplacian-based approach to identify the directions with the largest class variability. The PLC framework integrates data augmentation and a cross-view Laplacian to extract class-aware representations while minimizing reconstruction error. An unsupervised alternating optimization algorithm iteratively estimates the pseudo graph and minimizes the loss using Alternating Least Squares (ALS). Experimental results on various datasets demonstrate the effectiveness of the proposed approach.

Low Difficulty Summary (written by GrooveSquid.com, original content)
Tensor decomposition helps learn low-dimensional representations, which can be useful for tasks like filling in (imputing) missing data. But it's not very good at helping with classification, where we want to know what category something belongs to. The researchers think they can do better by using a special kind of graph Laplacian that helps identify important features. They came up with a new way to combine this with some other techniques to get class-aware representations. This means the computer will understand more about how different categories are related. The results show that this approach works well on different datasets.
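
To make the alternating scheme in the medium difficulty summary more concrete, here is a minimal, illustrative sketch in Python/NumPy. It is not the authors' implementation: it only shows a CP decomposition fitted with ALS in which the sample-mode factor is regularized by a graph Laplacian re-estimated from the current embeddings at each outer iteration (playing the role of the "pseudo" graph). The data-augmentation and cross-view contrastive parts of PLC are omitted, and all names (plc_style_als, rank R, weight lam, the k-NN graph) are assumptions made for illustration only.

```python
# Illustrative sketch only (not the paper's code): CP/ALS on a 3-way tensor X of
# shape (samples, time, features) with a graph-Laplacian penalty on the sample-mode
# factor A. The similarity graph is re-estimated from A each outer iteration.
import numpy as np
from scipy.linalg import solve_sylvester

def unfold(T, mode):
    """Mode-n unfolding (matricization) of a 3-way tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product of two factor matrices."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def pseudo_laplacian(A, k=5):
    """Unnormalized Laplacian of a k-NN graph built from the current embeddings A."""
    D = np.linalg.norm(A[:, None, :] - A[None, :, :], axis=-1)
    W = np.zeros_like(D)
    nn = np.argsort(D, axis=1)[:, 1:k + 1]          # skip column 0 (self)
    rows = np.repeat(np.arange(A.shape[0]), k)
    W[rows, nn.ravel()] = 1.0
    W = np.maximum(W, W.T)                           # symmetrize
    return np.diag(W.sum(axis=1)) - W                # L = D - W

def plc_style_als(X, R=8, lam=0.1, iters=25, seed=0):
    """Alternate between estimating the pseudo graph and ALS updates of A, B, C."""
    rng = np.random.default_rng(seed)
    n, t, f = X.shape
    A = rng.standard_normal((n, R))                  # sample-mode (class-aware) factor
    B = rng.standard_normal((t, R))                  # time-mode factor
    C = rng.standard_normal((f, R))                  # feature-mode factor
    X0, X1, X2 = unfold(X, 0), unfold(X, 1), unfold(X, 2)
    for _ in range(iters):
        L = pseudo_laplacian(A)                      # re-estimate the pseudo graph
        # A-update: minimize ||X0 - A M^T||^2 + lam * tr(A^T L A),
        # i.e. solve the Sylvester equation  lam*L A + A (M^T M) = X0 M
        M = khatri_rao(B, C)
        A = solve_sylvester(lam * L, M.T @ M, X0 @ M)
        # B- and C-updates: ordinary least-squares ALS steps
        M = khatri_rao(A, C)
        B = np.linalg.lstsq(M, X1.T, rcond=None)[0].T
        M = khatri_rao(A, B)
        C = np.linalg.lstsq(M, X2.T, rcond=None)[0].T
    return A, B, C

# Usage on synthetic data: 60 series, 30 time steps, 4 features.
X = np.random.default_rng(1).standard_normal((60, 30, 4))
A, B, C = plc_style_als(X)
print(A.shape)  # (60, 8)
```

In this sketch the sample-mode factor A is the low-dimensional, class-aware representation referred to in the summaries; it can be passed to any off-the-shelf classifier after the decomposition is fitted.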

Keywords

» Artificial intelligence  » Classification  » Data augmentation  » Optimization  » Unsupervised