Summary of "Self-Consistency Training for Density-Functional-Theory Hamiltonian Prediction," by He Zhang et al.


Self-Consistency Training for Density-Functional-Theory Hamiltonian Prediction

by He Zhang, Chang Liu, Zun Wang, Xinran Wei, Siyuan Liu, Nanning Zheng, Bin Shao, Tie-Yan Liu

First submitted to arXiv on: 14 Mar 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Chemical Physics (physics.chem-ph); Biomolecules (q-bio.BM)


GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
The paper's original abstract, available on its arXiv listing.

Medium Difficulty Summary (original content by GrooveSquid.com)
This research proposes self-consistency training for predicting the mean-field Hamiltonian matrix in density functional theory (DFT), a fundamental quantity in molecular science. The task is challenging because labeled data are scarce, but the proposed method requires no labels and can therefore leverage large amounts of unlabeled data, which improves generalization and addresses the data-scarcity challenge.

Low Difficulty Summary (original content by GrooveSquid.com)
Predicting the mean-field Hamiltonian matrix in density functional theory is an important step in solving problems in molecular science. However, progress is limited by insufficient labeled data. The researchers propose a new training method, called self-consistency training, that doesn't need labels. This helps solve the data challenge and improves generalization.

Keywords

  • Artificial intelligence
  • Generalization