Summary of Large Scale Transfer Learning for Tabular Data via Language Modeling, by Josh Gardner et al.
Large Scale Transfer Learning for Tabular Data via Language Modeling
by Josh Gardner, Juan C. Perdomo, Ludwig Schmidt
First submitted to arXiv on: 17 Jun 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI); Computation and Language (cs.CL)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, each written at a different level of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
| Summary difficulty | Written by | Summary |
|---|---|---|
| High | Paper authors | Read the original abstract here |
| Medium | GrooveSquid.com (original content) | This research presents TabuLa-8B, a language model for tabular prediction. The authors extract a large, high-quality training dataset from the TabLib corpus through filtering and quality control of tabular data, then fine-tune a Llama 3-8B large language model (LLM) using a packing and attention scheme designed specifically for tabular prediction. Evaluated across a suite of 329 datasets, TabuLa-8B’s zero-shot accuracy on unseen tables is more than 15 percentage points above random guessing, a feat that existing state-of-the-art tabular models such as XGBoost and TabPFN cannot match, since they require training examples. In the few-shot setting (1-32 shots), without any fine-tuning on the target datasets, TabuLa-8B is 5-15 percentage points more accurate than XGBoost and TabPFN models trained on equal or even up to 16x more data. The authors release their model, code, and data alongside the paper (a minimal sketch of the row-serialization idea follows this table). |
| Low | GrooveSquid.com (original content) | Imagine a tool that can look at a table of numbers and instantly make a good guess about a missing piece of information. That’s what scientists built with TabuLa-8B. They trained it on lots of tables from the internet to make computers better at understanding spreadsheets. The new tool is good at predicting what belongs in a table even when it has never seen that kind of table before, and it is more accurate than the tools people currently use. |
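For readers curious how a language model can make tabular predictions at all, the sketch below illustrates the general recipe the paper builds on: serialize each table row as text, prepend a few labeled rows as in-context examples, and score candidate labels by their likelihood under the model. This is a minimal illustration only, not the authors’ released code; the model ID, the `serialize`/`predict` helpers, and the prompt format are assumptions for demonstration, and TabuLa-8B’s actual training additionally relies on the paper’s specialized packing and attention scheme.

```python
# Minimal sketch (not the authors' released code): serialize table rows as
# text, build a few-shot prompt, and score each candidate label by its
# token log-likelihood under a causal language model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "meta-llama/Meta-Llama-3-8B"  # assumption: gated model; any causal LM works here
tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL, torch_dtype=torch.bfloat16)
model.eval()

def serialize(row: dict) -> str:
    """Flatten one table row into 'The <column> is <value>' text (illustrative format)."""
    return ". ".join(f"The {k} is {v}" for k, v in row.items())

def predict(shots: list, target: dict, labels: list) -> str:
    """Return the candidate label with the highest log-likelihood after a few-shot prompt."""
    prompt = "".join(f"{serialize(r)}. Answer: {y}\n" for r, y in shots)
    prompt += f"{serialize(target)}. Answer:"
    scores = {}
    for label in labels:
        ids = tok(prompt + " " + label, return_tensors="pt").input_ids
        # Token count assumes the label tokenizes the same standalone as in context
        # (true for common single-word labels with BPE tokenizers).
        n = len(tok(" " + label, add_special_tokens=False).input_ids)
        with torch.no_grad():
            logits = model(ids).logits
        # Log-probability assigned to the label's tokens given the prompt.
        logprobs = torch.log_softmax(logits[0, :-1], dim=-1)
        targets = ids[0, 1:]
        scores[label] = logprobs[-n:].gather(1, targets[-n:, None]).sum().item()
    return max(scores, key=scores.get)

# Zero-shot is the same call with shots=[]; few-shot passes labeled rows:
shots = [({"age": 41, "income": 52000}, "yes"), ({"age": 23, "income": 18000}, "no")]
print(predict(shots, {"age": 37, "income": 61000}, labels=["yes", "no"]))
```

Scoring labels by likelihood rather than free-form generation keeps the prediction constrained to the known label set, which is one common way to evaluate LLMs on classification-style tabular tasks.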
Keywords
» Artificial intelligence » Attention » Few shot » Language model » Large language model » Llama » Xgboost » Zero shot