
Summary of Compact Model Parameter Extraction via Derivative-Free Optimization, by Rafael Perez Martinez et al.


Compact Model Parameter Extraction via Derivative-Free Optimization

by Rafael Perez Martinez, Masaya Iwamoto, Kelly Woo, Zhengliang Bian, Roberto Tinti, Stephen Boyd, Srabanti Chowdhury

First submitted to arXiv on: 24 Jun 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Systems and Control (eess.SY)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (original content by GrooveSquid.com)
This paper tackles compact model parameter extraction with derivative-free optimization, extracting tens of parameters simultaneously. The traditional manual process, which splits the complete parameter set into smaller subsets for different operational regions, can take days or weeks. The proposed approach streamlines this by using derivative-free optimization to identify a good parameter set that fits the compact model well without exhaustive simulations. The optimization is strengthened by a carefully chosen loss function that focuses on relative errors, prioritizes accuracy above a specific threshold, and reduces sensitivity to outliers; a train-test split is used to assess model fit and avoid overfitting (an illustrative code sketch of this recipe follows the summaries below). The approach successfully models a diamond Schottky diode with SPICE and a GaN-on-SiC HEMT with ASM-HEMT, extracting 35 parameters in under 6,000 trials for the latter. Additional examples demonstrate robustness to outliers, achieving an excellent fit even with over 25% corrupted data.

Low Difficulty Summary (original content by GrooveSquid.com)
This paper makes it easier to model how devices behave by introducing a faster way of finding the right settings. Normally, this process takes a long time because it involves trying many different combinations of settings and testing which one works best. The researchers created a method that finds good settings much faster. They tested it on two types of devices, diamond Schottky diodes and GaN-on-SiC HEMTs, and it modeled them accurately in under 6,000 trials. This matters because better device models help people understand how devices work and design better ones.
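
The medium difficulty summary outlines a recipe concrete enough to illustrate with code: a derivative-free optimizer searches a bounded parameter space, the loss compares modeled and measured currents on a relative scale and caps large residuals so outliers cannot dominate, and a train-test split guards against overfitting. The sketch below is an illustration only: the ideal diode equation, the synthetic noisy data, SciPy's differential_evolution optimizer, and the specific residual cap are assumptions standing in for the paper's actual compact models, measurements, optimizer, and loss details.

```python
# Minimal sketch (not the paper's implementation): derivative-free extraction of
# two diode parameters from noisy I-V data. Model, data, optimizer, and loss
# details are illustrative assumptions.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.model_selection import train_test_split

VT = 0.02585  # thermal voltage at room temperature (V)

def diode_current(v, params):
    """Ideal diode I-V curve: I = Is * (exp(V / (n * VT)) - 1)."""
    i_s, n = params
    return i_s * (np.exp(v / (n * VT)) - 1.0)

def relative_error_loss(params, v, i_meas, floor=1e-12, cap=10.0):
    """Mean relative error between model and measurement, with each point's
    contribution capped so corrupted measurements cannot dominate the loss."""
    i_model = diode_current(v, params)
    rel = np.abs(i_model - i_meas) / np.maximum(np.abs(i_meas), floor)
    return float(np.mean(np.minimum(rel, cap)))

# Synthetic "measurements": true curve, multiplicative noise, some corrupted points.
rng = np.random.default_rng(0)
v = np.linspace(0.1, 0.8, 200)
i_meas = diode_current(v, (1e-12, 1.8)) * (1.0 + 0.05 * rng.standard_normal(v.size))
bad = rng.choice(v.size, 20, replace=False)
i_meas[bad[:10]] *= 10.0   # inflated outliers
i_meas[bad[10:]] *= 0.01   # deflated outliers (large relative errors hit the cap)

# Train-test split to assess fit quality and catch overfitting.
v_tr, v_te, i_tr, i_te = train_test_split(v, i_meas, test_size=0.3, random_state=0)

# Derivative-free global search over bounded parameters.
bounds = [(1e-14, 1e-9), (1.0, 3.0)]  # (saturation current Is in A, ideality factor n)
result = differential_evolution(relative_error_loss, bounds,
                                args=(v_tr, i_tr), seed=0, maxiter=300)

print("fitted (Is, n):", result.x)
print("train loss:", relative_error_loss(result.x, v_tr, i_tr))
print("test loss: ", relative_error_loss(result.x, v_te, i_te))
```

In practice one would likely search the saturation current on a log scale and weight different measurement regions, but those refinements are omitted to keep the sketch short.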

Keywords

» Artificial intelligence  » GaN  » Loss function  » Optimization  » Overfitting