EconPapers    
Smart buildings energy consumption forecasting using adaptive evolutionary bagging extra tree learning models

Mehdi Neshat, Menasha Thilakaratne, Mohammed El-Abd, Seyedali Mirjalili, Amir H. Gandomi and John Boland

Energy, 2025, vol. 333, issue C

Abstract: Smart buildings are gaining popularity because they can enhance energy efficiency, lower costs, improve security, and provide a more comfortable and convenient environment for occupants. A considerable share of the global energy supply is consumed by the building sector, which therefore plays a pivotal role in future decarbonisation pathways. To manage energy consumption and improve energy efficiency in smart buildings, developing reliable and accurate energy demand forecasts is crucial. However, building an effective predictive model for the total energy use of appliances at the building level is challenging due to temporal oscillations and complex linear and non-linear patterns. This paper proposes three hybrid ensemble predictive models, incorporating Bagging, Stacking, and Voting mechanisms combined with a fast and effective evolutionary hyper-parameter tuner. The performance of the proposed energy forecasting model was evaluated using a hybrid dataset of meteorological parameters, appliance energy use, temperature, humidity, and lighting energy consumption, collected by 18 sensors from different sections of a building located in Stambruges, Mons, Belgium. To provide a comparative framework and investigate the efficiency of the proposed predictive model, 15 popular machine learning (ML) models were compared, including two classic ML models, three Neural Networks (NN), a Decision Tree (DT), a Random Forest (RF), two Deep Learning (DL) models, and six Ensemble models. The prediction results indicate that the adaptive evolutionary bagging model surpassed the other predictive models in both accuracy and learning error, delivering accuracy gains of 12.6%, 13.7%, 12.9%, 27.04%, and 17.4% over Extreme Gradient Boosting (XGB), Categorical Boosting (CatBoost), Gradient Boosting Machine (GBM), Light Gradient Boosting Machine (LGBM), and RF, respectively.

Keywords: Smart building; Energy forecasting; Deep learning; Ensemble learning; Extra tree; Optimisation; Hyper-parameter tuning (search for similar items in EconPapers)
Date: 2025

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0360544225027720
Full text for ScienceDirect subscribers only

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:eee:energy:v:333:y:2025:i:c:s0360544225027720

DOI: 10.1016/j.energy.2025.137130


Energy is currently edited by Henrik Lund and Mark J. Kaiser

More articles in Energy from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Page updated 2025-07-29
Handle: RePEc:eee:energy:v:333:y:2025:i:c:s0360544225027720