Transfer Learning on Transformers for Building Energy Consumption Forecasting – A Comparative Study
by Robert Spencer, Surangika Ranathunga, Mikael Boulic, Andries van Heerden, Teo Susnjak
First submitted to arXiv on: 18 Oct 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: None
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same paper, written at different levels of difficulty. The medium- and low-difficulty versions are original summaries written by GrooveSquid.com, while the high-difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract on arXiv. |
Medium | GrooveSquid.com (original content) | The study explores the application of Transfer Learning (TL) to Transformer architectures for building energy consumption forecasting. The authors experiment with six data-centric TL strategies and analyze their performance across different feature spaces, using 16 datasets from the Building Data Genome Project 2 to build forecasting models. Results show that while TL is generally beneficial, the exact TL strategy must be chosen carefully to maximize its benefits, and feature-space properties, such as the recorded weather features, play a significant role in that decision. Among the Transformer variants tested, PatchTST outperforms the vanilla Transformer and Informer. This work advances building energy consumption forecasting using modern approaches such as TL and Transformer architectures. |
Low | GrooveSquid.com (original content) | The study looks at how to use a special kind of artificial intelligence, called a Transformer, to forecast how much energy a building will use. This matters because it can help us save energy and reduce our impact on the environment. The authors tested different ways of doing this and found that one model, called PatchTST, works better than the others. They also found that a technique called Transfer Learning can make the predictions more accurate. This research helps us understand how to use advanced AI techniques like Transformers and Transfer Learning to forecast energy usage. |
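The transfer-learning idea described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: a tiny linear autoregressive model stands in for a Transformer such as PatchTST, and the building load series are synthetic. The workflow it shows is pretraining on data pooled from several "source" buildings, then fine-tuning on a "target" building that has little data of its own.

```python
# Hypothetical sketch of data-centric transfer learning for load forecasting.
# A linear autoregressive model (fit by gradient descent) stands in for a
# Transformer; all series are synthetic sinusoidal "daily load" profiles.
import numpy as np

def make_series(rng, n=200, scale=1.0):
    """Synthetic hourly load: a 24-step cycle plus noise."""
    t = np.arange(n)
    return scale * np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(n)

def windows(series, lag=24):
    """Turn a series into (lagged-window, next-value) training pairs."""
    X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
    return X, series[lag:]

def fit(X, y, w=None, lr=0.01, epochs=200):
    """Least-squares fit by gradient descent, optionally warm-started at w."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def mse(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

rng = np.random.default_rng(0)
# Source domain: pooled data from several buildings with similar load shape.
src = np.concatenate([make_series(rng) for _ in range(4)])
Xs, ys = windows(src)
# Target domain: one building with only a short recorded history.
Xt, yt = windows(make_series(rng, n=60))

w_scratch = fit(Xt, yt)             # baseline: train on target data only
w_pre = fit(Xs, ys)                 # pretrain on the source buildings
w_tl = fit(Xt, yt, w=w_pre.copy())  # fine-tune pretrained weights on target

print(f"target-only MSE: {mse(Xt, yt, w_scratch):.4f}")
print(f"transfer MSE:    {mse(Xt, yt, w_tl):.4f}")
```

The paper's actual strategies differ in *which* source data is transferred and how; this sketch only captures the common pretrain-then-fine-tune skeleton.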
Keywords
- Artificial intelligence
- Transfer learning
- Transformer