A new paper titled "Performance evaluation of deep learning architectures for load and temperature forecasting under dataset size constraints and seasonality" has been published.
Paper link
Summary
Buildings and their energy systems are characterized by unique, complex features that can evolve over time. Moreover, building data is highly seasonal, varying with the time of year. Accurately forecasting and controlling a building's energy systems therefore requires an effective deep learning model with exceptional adaptability (i.e., high performance with little data).
Given the intricate nature of the building-energy field, we conducted a benchmark study comparing the performance of six deep learning architectures: the multilayer perceptron (MLP), simple recurrent neural network (RNN), long short-term memory (LSTM), gated recurrent unit (GRU), dilated convolutional neural network (DCNN), and transformer. To analyze the effect of data seasonality on forecasting performance, we also developed a data similarity analysis method.
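As a rough illustration of what is being compared, the sketch below defines minimal versions of the six model families in plain PyTorch. This is not the authors' implementation (the paper uses DeepTimeSeries); the layer sizes, the 48-step input history, and the 24-step forecast head are all illustrative assumptions.

```python
# Minimal sketch of the six benchmarked architecture families in plain
# PyTorch. NOT the paper's implementation; all sizes are assumptions.
import torch.nn as nn

N_FEATURES, HIDDEN, HISTORY, HORIZON = 8, 64, 48, 24  # assumed dimensions

class SeqToHorizon(nn.Module):
    """Recurrent core whose last hidden state feeds a 24-step forecast head."""
    def __init__(self, core):
        super().__init__()
        self.core, self.head = core, nn.Linear(HIDDEN, HORIZON)
    def forward(self, x):                 # x: (batch, time, features)
        out, _ = self.core(x)             # out: (batch, time, HIDDEN)
        return self.head(out[:, -1])      # forecast from the last time step

class DilatedCNN(nn.Module):
    """1-D convolutions with exponentially growing dilation (WaveNet-style)."""
    def __init__(self):
        super().__init__()
        layers, ch = [], N_FEATURES
        for d in (1, 2, 4, 8):
            layers += [nn.Conv1d(ch, HIDDEN, 3, dilation=d, padding=d), nn.ReLU()]
            ch = HIDDEN
        self.conv, self.head = nn.Sequential(*layers), nn.Linear(HIDDEN, HORIZON)
    def forward(self, x):
        h = self.conv(x.transpose(1, 2))  # Conv1d wants (batch, channels, time)
        return self.head(h[:, :, -1])

class TransformerForecaster(nn.Module):
    """Encoder-only transformer; forecasts from the last encoded step."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(N_FEATURES, HIDDEN)
        layer = nn.TransformerEncoderLayer(HIDDEN, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(HIDDEN, HORIZON)
    def forward(self, x):
        return self.head(self.encoder(self.proj(x))[:, -1])

models = {
    "MLP": nn.Sequential(nn.Flatten(), nn.Linear(HISTORY * N_FEATURES, HIDDEN),
                         nn.ReLU(), nn.Linear(HIDDEN, HORIZON)),
    "RNN":  SeqToHorizon(nn.RNN(N_FEATURES, HIDDEN, batch_first=True)),
    "LSTM": SeqToHorizon(nn.LSTM(N_FEATURES, HIDDEN, batch_first=True)),
    "GRU":  SeqToHorizon(nn.GRU(N_FEATURES, HIDDEN, batch_first=True)),
    "DCNN": DilatedCNN(),
    "Transformer": TransformerForecaster(),
}
```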
To ensure the reproducibility and accessibility of our benchmark, we used a publicly accessible data generator and the open-source Python library DeepTimeSeries. Our forecasting targets were the zone temperatures and thermal loads over a future 24-hour period. The benchmark results, with training dataset sizes varying from 0.3 to 0.9 years, showed that the transformer architecture performed best, especially on small training datasets. The GRU and RNN came in second and third place, respectively, while the rankings of the other architectures varied with the training dataset size and forecasting target.
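To make the evaluation protocol concrete, here is a hypothetical sketch of how an hourly series can be sliced into (input history, next-24-hours) windows while the amount of training data is varied. The 48-hour history, the feature count, and the specific 0.3/0.6/0.9-year splits are assumptions for illustration, not values taken from the paper.

```python
# Hypothetical sketch: build supervised (history -> 24 h ahead) windows
# and vary the training dataset size within the 0.3-0.9 year range.
import numpy as np

HISTORY, HORIZON = 48, 24          # assumed: 48 h of inputs -> 24 h forecast
HOURS_PER_YEAR = 8760

def make_windows(series, history=HISTORY, horizon=HORIZON):
    """Turn an hourly series of shape (time, features) into supervised pairs."""
    X, y = [], []
    for t in range(history, len(series) - horizon):
        X.append(series[t - history:t])          # past inputs
        y.append(series[t:t + horizon, 0])       # e.g., a zone-temperature column
    return np.stack(X), np.stack(y)

series = np.random.rand(2 * HOURS_PER_YEAR, 8)   # stand-in for generated building data

for frac in (0.3, 0.6, 0.9):                     # assumed training sizes in years
    n_train = int(frac * HOURS_PER_YEAR)
    X_train, y_train = make_windows(series[:n_train])
    print(f"{frac} y of training data -> {len(X_train)} windows")
```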
Additionally, the data similarity analysis revealed that simply increasing the training dataset size does not necessarily improve model performance. This highlights the importance of training the model on data that is highly similar to that of the forecasting period.
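The paper's similarity analysis method is not reproduced here; as a deliberately simple stand-in for the idea, the sketch below compares the mean daily profile of a training period with that of the forecasting period, which is enough to show why a season-matched subset of data can beat a larger but dissimilar one.

```python
# Hypothetical stand-in for a data similarity analysis: compare the
# mean-centered average daily profiles of two periods. Not the paper's method.
import numpy as np

def mean_daily_profile(hourly):
    """Average value at each hour of day over an hourly 1-D series."""
    n = len(hourly) // 24 * 24            # trim to whole days
    return hourly[:n].reshape(-1, 24).mean(axis=0)

def profile_similarity(period_a, period_b):
    """Cosine similarity between mean-centered daily profiles (-1 to 1)."""
    a = mean_daily_profile(period_a); a -= a.mean()
    b = mean_daily_profile(period_b); b -= b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
hours = np.arange(24 * 90)                                 # 90 days, hourly
summer = 20 + 5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, hours.size)
winter = 20 - 5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, hours.size)

print(profile_similarity(summer, summer[:24 * 30]))  # season-matched: near +1
print(profile_similarity(summer, winter))            # opposite season: near -1
```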
Featured figure
Matrix plot summarizing the performance rank of six deep learning architectures