
Summary of Hyperparameter Optimization for SecureBoost via Constrained Multi-Objective Federated Learning, by Yan Kang et al.


Hyperparameter Optimization for SecureBoost via Constrained Multi-Objective Federated Learning

by Yan Kang, Ziyao Ren, Lixin Fan, Linghua Yang, Yongxin Tong, Qiang Yang

First submitted to arXiv on: 6 Apr 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Cryptography and Security (cs.CR)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com; original content)
SecureBoost is a tree-boosting algorithm that uses homomorphic encryption (HE) to safeguard data privacy in vertical federated learning. Widely used in fields such as finance and healthcare, SecureBoost typically has its hyperparameters set heuristically to maximize model performance, under the assumption that data privacy is already secured. However, researchers discovered that some variants of SecureBoost remain vulnerable to label leakage, which can lead to suboptimal trade-offs among utility, privacy, and efficiency in federated learning systems. To address this issue, the authors propose Constrained Multi-Objective SecureBoost (CMOSB), which aims to find Pareto-optimal hyperparameter solutions that balance utility loss, training cost, and privacy leakage. The study introduces novel measurements of these three objectives, including an instance clustering attack (ICA) for assessing SecureBoost's privacy leakage, along with two countermeasures against ICA. Experimental results demonstrate that CMOSB outperforms grid search and Bayesian optimization in achieving a superior trade-off between utility loss, training cost, and privacy leakage.
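To illustrate the multi-objective idea behind CMOSB, here is a minimal sketch of a Pareto-front selection step over candidate hyperparameter settings. All names, hyperparameters, and objective values below are illustrative assumptions, not the paper's actual algorithm or data; CMOSB itself involves a full constrained multi-objective search, of which non-dominated filtering is only one ingredient.

```python
# Hypothetical sketch: select Pareto-optimal hyperparameter candidates when
# minimizing three objectives (utility loss, training cost, privacy leakage).
# Candidate settings and objective values are made up for illustration.

def dominates(a, b):
    """True if objective vector a is at least as good as b on every
    objective (all minimized) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the candidates not dominated by any other candidate."""
    return [
        c for c in candidates
        if not any(dominates(o["objectives"], c["objectives"])
                   for o in candidates if o is not c)
    ]

# Each candidate pairs a hyperparameter setting with its measured objectives:
# (utility_loss, training_cost, privacy_leakage), all to be minimized.
candidates = [
    {"params": {"max_depth": 3, "num_trees": 50},  "objectives": (0.12, 30.0, 0.40)},
    {"params": {"max_depth": 5, "num_trees": 100}, "objectives": (0.08, 80.0, 0.55)},
    {"params": {"max_depth": 4, "num_trees": 80},  "objectives": (0.10, 60.0, 0.40)},
    {"params": {"max_depth": 6, "num_trees": 120}, "objectives": (0.09, 90.0, 0.60)},
]

front = pareto_front(candidates)
# The last candidate is dominated (worse on all three objectives than the
# second), so only the first three settings remain on the Pareto front.
```

A grid search would score each setting on a single metric and pick one winner; the multi-objective view instead keeps every non-dominated trade-off, leaving the final utility/cost/privacy choice to the deployer.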
Low Difficulty Summary (written by GrooveSquid.com; original content)
SecureBoost is an algorithm that helps protect data privacy in machine learning. It's used in fields like finance and healthcare, but it has a problem: its settings can be chosen poorly. This makes the model not very good at predicting things or at keeping information private. To fix this, researchers created a new version called Constrained Multi-Objective SecureBoost (CMOSB). CMOSB tries to find the best balance between making good predictions, being efficient, and keeping data safe. The study includes some new ways to measure how well these goals are met, and it shows that CMOSB does a better job than other methods.

Keywords

* Artificial intelligence  * Boosting  * Clustering  * Federated learning  * Grid search  * Optimization