Non-Convex Tensor Recovery from Local Measurements

by Tongle Wu, Ying Sun, Jicong Fan

First submitted to arXiv on: 23 Dec 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: None



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper proposes a novel tensor compressed sensing model that leverages low-tubal-rank structure to recover unknown tensors from limited measurements. The model reparameterizes the unknown tensor using two compact factors and formulates the recovery problem as a nonconvex minimization problem. To solve it, the authors propose an alternating minimization algorithm, termed Alt-PGD-Min, which iteratively optimizes the two factors: one via a projected gradient descent step and the other via an exact minimization step (a minimal illustrative sketch of this alternating scheme appears after the summaries below). Alt-PGD-Min achieves ε-accuracy recovery with O(κ^2 log(1/ε)) iteration complexity and O(κ^6 r n_3 log n_3 (κ^2 r (n_1 + n_2) + n_1 log(1/ε))) sample complexity, where κ denotes the tensor condition number. To further accelerate convergence for ill-conditioned tensors, an accelerated version, Alt-ScalePGD-Min, preconditions the gradient update using an approximate Hessian that can be computed efficiently. This variant achieves O(log(1/ε)) iteration complexity and improves the sample complexity to O(κ^4 r n_3 log n_3 (κ^4 r (n_1 + n_2) + n_1 log(1/ε))). The proposed methods are experimentally validated.

Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper is about a new way to recover unknown tensors from limited measurements. It’s like trying to piece together a puzzle with only some of the pieces. The researchers propose an algorithm that uses two compact factors to represent the unknown tensor and then tries to find the best combination of these factors. They show that this approach can be very efficient, especially when the tensor is not too complicated. They also propose a way to make their algorithm work even better for really tricky tensors. This paper could help people in many fields, like computer vision or medical imaging, where they need to recover information from incomplete data.
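
The Medium Difficulty Summary above describes a factorize-then-alternate recipe: the unknown object is written as a product of two small factors, one factor is updated with a projected gradient step, and the other is solved for exactly. The sketch below illustrates that general recipe on a simpler matrix-sensing problem; it is not the authors' tubal-rank Alt-PGD-Min, and the function name alt_pgd_min_sketch, the spectral-style initialization, and the step-size rule are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch (assumed, not from the paper) of the "factorize + alternate" idea:
# a projected gradient step on one factor, exact least squares on the other.
# For simplicity this uses matrix sensing y_i = <A_i, X>; the paper's Alt-PGD-Min
# operates on third-order tensors under the tubal-rank (t-product) model.
import numpy as np


def alt_pgd_min_sketch(y, A, r, step=0.75, iters=300):
    """Recover a rank-r matrix X from y_i = <A_i, X> via factors U (orthonormal) and V."""
    m, n1, n2 = A.shape
    # Spectral-style initialization from the back-projected measurements.
    X0 = np.einsum("i,ijk->jk", y, A) / m
    U = np.linalg.svd(X0, full_matrices=False)[0][:, :r]
    for _ in range(iters):
        # Exact minimization over V given U: a least-squares problem in vec(V),
        # using <A_i, U V^T> = <A_i^T U, V>.
        M = np.einsum("ijk,jl->ikl", A, U).reshape(m, n2 * r)
        V = np.linalg.lstsq(M, y, rcond=None)[0].reshape(n2, r)
        # Gradient step on U, then project back to orthonormal columns via QR.
        resid = np.einsum("ijk,jk->i", A, U @ V.T) - y
        grad_U = np.einsum("i,ijk,kl->jl", resid, A, V) / m
        eta = step / (np.linalg.norm(V, 2) ** 2 + 1e-12)  # rough 1/Lipschitz scaling
        U = np.linalg.qr(U - eta * grad_U)[0]
    return U @ V.T


if __name__ == "__main__":
    # Tiny synthetic check: recover a rank-2 matrix from Gaussian measurements.
    rng = np.random.default_rng(0)
    n1, n2, r, m = 20, 15, 2, 800
    X_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
    A = rng.standard_normal((m, n1, n2))
    y = np.einsum("ijk,jk->i", A, X_true)
    X_hat = alt_pgd_min_sketch(y, A, r)
    print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```

In this sketch, the QR factorization after each gradient step keeps the left factor orthonormal, playing the role of the projection in projected gradient descent, while the right factor is recovered exactly by least squares; the demo at the bottom recovers a random rank-2 matrix from Gaussian measurements and prints the relative recovery error.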

Keywords

» Artificial intelligence  » Gradient descent