Summary of GhostRNN: Reducing State Redundancy in RNN with Cheap Operations, by Hang Zhou et al.
GhostRNN: Reducing State Redundancy in RNN with Cheap Operations
by Hang Zhou, Xiaoxu Zheng, Yunhe Wang, Michael Bi Mi, Deyi Xiong, Kai Han
First submitted to arXiv on: 20 Nov 2024
Categories
- Main: Computation and Language (cs.CL)
- Secondary: Artificial Intelligence (cs.AI); Sound (cs.SD); Audio and Speech Processing (eess.AS)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | High Difficulty Summary Read the original abstract here |
Medium | GrooveSquid.com (original content) | Medium Difficulty Summary In this paper, the researchers propose GhostRNN, an efficient Recurrent Neural Network (RNN) architecture designed for low-resource devices. The model reduces redundancy in the hidden state using cheap operations, which is essential for real-world deployment. By first generating a few intrinsic states and then deriving ghost states from them through cheap operations, GhostRNN significantly reduces memory usage (~40%) and computation cost while maintaining performance comparable to existing RNN models. |
Low | GrooveSquid.com (original content) | Low Difficulty Summary GhostRNN is an efficient RNN model suited to low-resource devices and applicable to speech tasks such as keyword spotting (KWS) and speech enhancement (SE). It reduces hidden state redundancy using cheap operations, cutting memory usage by roughly 40% and lowering computation cost while maintaining performance similar to existing RNN models, making it a useful contribution to the field of efficient RNN architectures. |
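The idea described in the summaries above can be sketched in a few lines: a small recurrent cell updates only a few "intrinsic" states, and a cheap linear map expands them into additional "ghost" states, so the full hidden state is obtained at a fraction of the recurrent cost. This is an illustrative sketch only; all names, dimensions, and the choice of a vanilla tanh recurrence are assumptions, not the paper's actual implementation.

```python
import math
import random

random.seed(0)

# Hypothetical GhostRNN-style cell (illustrative, not the paper's code):
# only INTRINSIC_DIM states go through the recurrence; GHOST_DIM extra
# states are produced from them by a cheap linear projection.
INPUT_DIM = 4
INTRINSIC_DIM = 3
GHOST_DIM = 3
HIDDEN_DIM = INTRINSIC_DIM + GHOST_DIM  # effective hidden size: 6

def rand_matrix(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

W_x = rand_matrix(INTRINSIC_DIM, INPUT_DIM)      # input -> intrinsic state
W_h = rand_matrix(INTRINSIC_DIM, INTRINSIC_DIM)  # intrinsic recurrence
W_g = rand_matrix(GHOST_DIM, INTRINSIC_DIM)      # cheap ghost projection

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def ghost_rnn_step(x, h):
    """One timestep: update the intrinsic state, then derive ghost states."""
    h_new = [math.tanh(a + b) for a, b in zip(matvec(W_x, x), matvec(W_h, h))]
    ghost = [math.tanh(g) for g in matvec(W_g, h_new)]  # the cheap operation
    return h_new + ghost, h_new  # full hidden state, next intrinsic state

x = [0.5] * INPUT_DIM
h = [0.0] * INTRINSIC_DIM
full, h = ghost_rnn_step(x, h)
print(len(full))  # 6: full hidden state built from only a 3x3 recurrence
```

The saving comes from the recurrent matrix: it is INTRINSIC_DIM × INTRINSIC_DIM instead of HIDDEN_DIM × HIDDEN_DIM, while the ghost projection adds only a small non-recurrent cost, which is consistent with the ~40% memory reduction the summaries report.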
Keywords
» Artificial intelligence » Neural network » RNN