Adaptive Conditional Expert Selection Network for Multi-domain Recommendation

by Kuiyao Dong, Xingyu Lou, Feng Liu, Ruian Wang, Wenyi Yu, Ping Wang, Jun Wang

First submitted to arXiv on: 11 Nov 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Information Retrieval (cs.IR)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper proposes a novel approach to multi-domain recommendation (MDR) built on Mixture-of-Experts (MoE), which has recently become the de facto standard architecture for MDR thanks to its expressive power. The proposed method, CESAA, consists of a Conditional Expert Selection (CES) module and an Adaptive Expert Aggregation (AEA) module that together address scalability issues and improve model performance. The CES module combines a sparse gating strategy with domain-shared experts, so that only the domain-shared experts and a few selected domain-specific experts are activated for each instance; the AEA module uses a mutual information loss to strengthen the correlation between experts and specific domains. This design strikes a balance between computational efficiency and model performance. Experimental results on public ranking datasets and an industrial retrieval dataset verify the effectiveness of CESAA on MDR tasks.
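
To make the mechanism concrete, the sketch below shows the two ideas in PyTorch: a top-k sparse gate over domain-specific experts combined with always-active domain-shared experts (the CES idea), and a mutual-information-style auxiliary loss between the gate distribution and the domain label (in the spirit of AEA). The names ConditionalExpertSelection and expert_domain_mi_loss, the expert architectures, and the exact loss formulation are illustrative assumptions, not the paper's code.

```python
# Illustrative sketch only; module names, expert shapes, and the MI
# estimator are assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalExpertSelection(nn.Module):
    """Sparse MoE layer: domain-shared experts are always active,
    and a top-k gate selects a few domain-specific experts per instance."""

    def __init__(self, dim, n_shared=2, n_specific=8, k=2):
        super().__init__()
        make = lambda: nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.shared = nn.ModuleList(make() for _ in range(n_shared))
        self.specific = nn.ModuleList(make() for _ in range(n_specific))
        self.gate = nn.Linear(dim, n_specific)  # scores domain-specific experts
        self.k = k

    def forward(self, x):
        # Domain-shared experts run on every instance.
        out = torch.stack([e(x) for e in self.shared]).mean(dim=0)
        logits = self.gate(x)                        # (B, n_specific)
        topv, topi = logits.topk(self.k, dim=-1)
        # Dense gate probabilities are kept for the auxiliary loss below.
        gate_probs = F.softmax(logits, dim=-1)
        # Sparse mixture weights: zero everywhere except the top-k experts.
        weights = torch.zeros_like(logits).scatter(
            -1, topi, F.softmax(topv, dim=-1))
        # For clarity this sketch evaluates all experts and masks the rest;
        # a real implementation would dispatch only the selected experts.
        specific_out = torch.stack([e(x) for e in self.specific], dim=1)
        out = out + (weights.unsqueeze(-1) * specific_out).sum(dim=1)
        return out, gate_probs


def expert_domain_mi_loss(gate_probs, domain_ids, n_domains, eps=1e-9):
    """Auxiliary loss in the spirit of AEA: maximize an in-batch estimate
    of the mutual information I(expert; domain), so that each domain
    concentrates on a consistent subset of experts."""
    one_hot = F.one_hot(domain_ids, n_domains).float()    # (B, D)
    counts = one_hot.sum(dim=0).clamp(min=1.0)            # instances per domain
    p_e_given_d = (one_hot.t() @ gate_probs) / counts.unsqueeze(1)  # (D, E)
    p_d = counts / counts.sum()                           # (D,)
    p_e = (p_d.unsqueeze(1) * p_e_given_d).sum(dim=0)     # (E,)
    mi = (p_d.unsqueeze(1) * p_e_given_d
          * (torch.log(p_e_given_d + eps) - torch.log(p_e + eps))).sum()
    return -mi  # negate: maximizing MI == minimizing the loss


# Usage: combine the task loss with the weighted auxiliary MI loss.
layer = ConditionalExpertSelection(dim=64)
x = torch.randn(32, 64)
domain_ids = torch.randint(0, 3, (32,))
y, gate_probs = layer(x)
aux_loss = expert_domain_mi_loss(gate_probs, domain_ids, n_domains=3)
```

Note that the sketch evaluates every domain-specific expert and masks the unselected ones for readability; a production implementation would route each instance only through its top-k experts, which is where the computational savings of sparse activation come from.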

Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper is about making recommendations that work well across many different areas, or domains. It uses a type of artificial intelligence called Mixture-of-Experts (MoE), where many small "expert" models each learn part of the job. MoE is good at learning lots of things, but it can be slow and not very good at telling different areas apart. The researchers came up with a new way to use MoE, called CESAA, that makes it faster and better at recommending things for specific areas. It has two parts: one decides which experts are best for each item, and the other makes sure the experts stay matched to the areas they know best. The result is a system that can make recommendations quickly without sacrificing how good they are.

Keywords

» Artificial intelligence
» Mixture of experts