
Summary of MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence, by Hongduan Tian et al.


MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence

by Hongduan Tian, Feng Liu, Tongliang Liu, Bo Du, Yiu-ming Cheung, Bo Han

First submitted to arXiv on: 29 May 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Computer Vision and Pattern Recognition (cs.CV)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, written at different levels of difficulty. The medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (original content by GrooveSquid.com)
In cross-domain few-shot classification, the nearest centroid classifier (NCC) is commonly used to learn representations by constructing a metric space in which similarities between samples and class prototypes can be measured. However, NCC-learned representations often exhibit high similarity between samples from different classes. To address this issue, the authors introduce a bi-level optimization framework, maximizing optimized kernel dependence (MOKD), which learns class-specific representations that match the cluster structures of the labeled support data. MOKD first optimizes the kernel used in the Hilbert-Schmidt independence criterion (HSIC) so that dependence is captured more precisely, then solves an optimization problem with the optimized HSIC (opt-HSIC): maximizing the dependence between representations and labels while minimizing the dependence among all samples. Experiments on Meta-Dataset show that MOKD achieves better generalization performance on unseen domains and learns representations with clearer cluster structure.
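For readers who want a concrete picture, here is a minimal Python sketch of an HSIC-based objective in the spirit of MOKD. It assumes NumPy, a Gaussian kernel on representations, a linear kernel on one-hot labels, and a hypothetical trade-off weight gamma; the paper's bi-level optimization of the kernel itself is omitted, so this is an illustration of the objective shape, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(X, bandwidth):
    # Pairwise squared Euclidean distances, then a Gaussian (RBF) kernel.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * bandwidth ** 2))

def hsic(K, L):
    # Biased HSIC estimator: trace(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the kernel matrices.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy usage: embeddings Z and one-hot labels Y for a 5-way, 5-shot task.
rng = np.random.default_rng(0)
Z = rng.normal(size=(25, 64))              # 25 support embeddings
Y = np.eye(5)[np.repeat(np.arange(5), 5)]  # one-hot labels

K = gaussian_kernel(Z, bandwidth=1.0)      # kernel on representations
L = Y @ Y.T                                # linear kernel on labels

gamma = 0.5  # hypothetical trade-off weight, not from the paper
# Maximize dependence on labels, penalize dependence among all samples.
objective = hsic(K, L) - gamma * hsic(K, K)
print(objective)
```

In MOKD, the kernel bandwidth would itself be optimized first (the inner level of the bi-level problem) before this outer objective is maximized over the representations.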
Low Difficulty Summary (original content by GrooveSquid.com)
Cross-domain few-shot classification often relies on a nearest centroid classifier (NCC) to learn representations. However, NCC-learned representations can be highly similar even for samples from different classes. To fix this, the researchers propose a new approach called maximizing optimized kernel dependence (MOKD). MOKD learns class-specific representations that match the cluster structures of the labeled data. It does this by first optimizing the Hilbert-Schmidt independence criterion (HSIC) kernel so that it better captures relationships, and then solving an optimization problem that maximizes how strongly representations depend on labels while minimizing how similar samples are to one another. This helps MOKD learn better-clustered representations and achieve stronger generalization on new, unseen domains.
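To illustrate the NCC baseline described above, here is a minimal sketch (in Python with NumPy; the variable names and the use of cosine similarity are assumptions for illustration, not details taken from the paper) of nearest-centroid classification for a few-shot task.

```python
import numpy as np

def ncc_predict(support_z, support_y, query_z, num_classes):
    # Class prototypes: mean embedding of each class's support samples.
    prototypes = np.stack([support_z[support_y == c].mean(axis=0)
                           for c in range(num_classes)])
    # Cosine similarity between each query embedding and each prototype;
    # each query is assigned to its most similar prototype.
    q = query_z / np.linalg.norm(query_z, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    return np.argmax(q @ p.T, axis=1)

# Toy usage for a 5-way task with 64-dimensional embeddings.
rng = np.random.default_rng(0)
support_z = rng.normal(size=(25, 64))       # 5 classes x 5 shots
support_y = np.repeat(np.arange(5), 5)
query_z = rng.normal(size=(10, 64))
print(ncc_predict(support_z, support_y, query_z, num_classes=5))
```

The problem MOKD targets is visible here: if embeddings from different classes lie close together, their prototypes are hard to separate, which is why MOKD reshapes the representations before this classification step.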

Keywords

» Artificial intelligence  » Classification  » Few shot  » Generalization  » Optimization