Data Distribution Distilled Generative Model for Generalized Zero-Shot Recognition

by Yijie Wang, Mingjian Hong, Luwen Huangfu, Sheng Huang

First submitted to arxiv on: 18 Feb 2024

Categories

  • Main: Computer Vision and Pattern Recognition (cs.CV)
  • Secondary: Artificial Intelligence (cs.AI)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here

Medium Difficulty Summary (written by GrooveSquid.com; original content)
Within Zero-Shot Learning (ZSL), we introduce D^3GZSL, an end-to-end generative framework that counters biases in Generalized Zero-Shot Learning (GZSL) models. The framework treats seen data as in-distribution and synthesized unseen data as out-of-distribution, and consists of two core modules: in-distribution dual space distillation (ID^2SD) and out-of-distribution batch distillation (O^2DBD). ID^2SD aligns teacher and student outputs in both the embedding and label spaces, enhancing learning coherence. O^2DBD introduces a low-dimensional out-of-distribution representation per batch sample, capturing shared structure between seen and unseen categories. Our approach integrates seamlessly into mainstream generative frameworks, and extensive experiments on established GZSL benchmarks consistently show that D^3GZSL improves the performance of existing generative GZSL methods.
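The paper does not include code in this summary, but the dual-space distillation idea in ID^2SD — aligning a teacher and a student both in the embedding space and in the label space — can be sketched as a combined loss. The following is a minimal illustrative sketch, not the authors' implementation; the function names, the use of mean-squared error for the embedding space, and KL divergence with a softmax temperature for the label space are assumptions chosen as standard distillation choices:

```python
import numpy as np

def softmax(logits, temp=1.0):
    # temperature-scaled softmax, numerically stabilized
    z = logits / temp
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def id2sd_loss(teacher_emb, student_emb, teacher_logits, student_logits, temp=2.0):
    # embedding-space alignment: MSE between teacher and student features
    emb_loss = np.mean((teacher_emb - student_emb) ** 2)
    # label-space alignment: KL(teacher || student) on temperature-softened logits
    p = softmax(teacher_logits, temp)
    q = softmax(student_logits, temp)
    label_loss = np.mean(np.sum(p * (np.log(p + 1e-8) - np.log(q + 1e-8)), axis=-1))
    # total dual-space distillation loss (equal weighting assumed here)
    return emb_loss + label_loss
```

When teacher and student agree exactly, both terms vanish; any disagreement in either space increases the loss, which is what drives the student toward teacher-consistent outputs in both spaces.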
Low Difficulty Summary (written by GrooveSquid.com; original content)
This paper is about helping computers recognize things they have never seen before, a setting called Zero-Shot Learning (ZSL). The problem is that such systems tend to favor what they already know, so they do poorly on new categories. To fix this, the authors created a new training method called D^3GZSL. It works like having two teachers: one for things the computer has seen before and another for things it hasn't. This helps the computer learn more evenly across both. The authors tested their method on several standard benchmarks and found that it improves on existing methods.

Keywords

» Artificial intelligence  » Distillation  » Embedding  » Zero shot