An Autotuning-based Optimization Framework for Mixed-kernel SVM Classifications in Smart Pixel Datasets and Heterojunction Transistors

by Xingfu Wu, Tupendra Oli, Justin H. Qian, Valerie Taylor, Mark C. Hersam, Vinod K. Sangwan

First submitted to arXiv on: 26 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Performance (cs.PF)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The high difficulty version is the paper's original abstract; read it on arXiv.

Medium Difficulty Summary (written by GrooveSquid.com, original content)
This paper proposes an autotuning-based optimization framework for tuning the hyperparameters of Support Vector Machines (SVMs) on high-dimensional data. The framework is applied to two SVMs that use a mixed-kernel combination of Sigmoid and Gaussian kernels: one for smart pixel datasets in high-energy physics (HEP) and one for mixed-kernel heterojunction transistors (MKH). Results show that the optimal hyperparameter choices vary greatly across applications and datasets, underscoring the importance of proper tuning for high classification accuracy. The framework also quantifies suitable hyperparameter ranges, achieving accuracies of 94.6% for HEP and 97.2% for MKH with reduced tuning time (see the illustrative sketch below).

Low Difficulty Summary (written by GrooveSquid.com, original content)
This paper helps us find the best settings for a special kind of machine learning model called a Support Vector Machine (SVM). SVMs are great at classifying things, but they can be tricky to set up just right. The authors came up with a new way to figure out the best settings for two types of SVMs that combine different kinds of kernels. They tested their method on real-world datasets and found that it really works! By using this method, we can make sure our SVMs are as accurate as possible.

Keywords

» Artificial intelligence  » Classification  » Hyperparameter  » Machine learning  » Optimization  » Sigmoid