Enhancing Neural Architecture Search Using Transfer Learning and Dynamic Search Spaces for Global Horizontal Irradiance Prediction
Inoussa Legrene,
Tony Wong and
Louis-A. Dessaint
Additional contact information
Inoussa Legrene: Systems Engineering Department, École de Technologie Supérieure, Montréal, QC H3C 1K3, Canada
Tony Wong: Systems Engineering Department, École de Technologie Supérieure, Montréal, QC H3C 1K3, Canada
Louis-A. Dessaint: Electrical Engineering Department, École de Technologie Supérieure, Montréal, QC H3C 1K3, Canada
Forecasting, 2025, vol. 7, issue 3, 1-23
Abstract:
Neural architecture search (NAS) automates the engineering of neural network models. It has been applied mainly in image processing and natural language processing, but it generally requires very long computing times before converging on an optimal architecture. This study proposes a hybrid approach that combines transfer learning and dynamic search-space adaptation (TL-DSS) to reduce the architecture search time. To validate the approach, Long Short-Term Memory (LSTM) models were designed with four evolutionary algorithms, namely artificial bee colony (ABC), genetic algorithm (GA), differential evolution (DE), and particle swarm optimization (PSO), to predict trends in global horizontal irradiance data. Performance was measured by the RMSE of the resulting models over a 24-h prediction window of the solar irradiance trend on one hand, and by the CPU time consumed by the search on the other. The results show that, depending on the search algorithm, the proposed approach reduces the search time by up to 89.09% while producing models that are up to 99% more accurate than the non-enhanced approach. This study demonstrates that neural architecture search time can be reduced while still ensuring that the resulting models perform well.
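The two ideas in the abstract can be illustrated with a toy search loop: an evolutionary population samples LSTM hyperparameters from bounds that contract toward the current best candidate each generation (dynamic search-space adaptation), and the search can be warm-started with an architecture found on a previous task (transfer learning). This is a minimal sketch, not the paper's method: the function `surrogate_rmse` is a hypothetical stand-in for the RMSE obtained by actually training each candidate LSTM on global horizontal irradiance data, and the bound-contraction rule and hyperparameter ranges are illustrative assumptions.

```python
import random

# Hypothetical surrogate standing in for the RMSE of a trained LSTM candidate;
# in the actual study, fitness would come from training each architecture on
# irradiance data. This toy landscape has its minimum at (layers=2, units=64).
def surrogate_rmse(layers, units):
    return abs(layers - 2) * 0.5 + abs(units - 64) / 64.0

def tl_dss_search(generations=10, pop_size=8, warm_start=None, seed=0):
    """Toy evolutionary search whose hyperparameter bounds contract toward the
    best candidate each generation (dynamic search space), optionally seeded
    with an architecture from a previous search (transfer learning)."""
    rng = random.Random(seed)
    lo_l, hi_l = 1, 6        # number of LSTM layers (initial wide bounds)
    lo_u, hi_u = 8, 256      # units per layer (initial wide bounds)
    best = warm_start        # transfer learning: reuse a previously found model
    for _ in range(generations):
        pop = [(rng.randint(lo_l, hi_l), rng.randint(lo_u, hi_u))
               for _ in range(pop_size)]
        if best is not None:
            pop.append(best)             # keep the incumbent in the population
        pop.sort(key=lambda c: surrogate_rmse(*c))
        best = pop[0]
        bl, bu = best
        # Dynamic search-space adaptation: move each bound halfway toward the
        # best candidate, shrinking the region sampled in the next generation.
        lo_l, hi_l = (lo_l + bl) // 2, (hi_l + bl + 1) // 2
        lo_u, hi_u = (lo_u + bu) // 2, (hi_u + bu + 1) // 2
    return best, (lo_l, hi_l), (lo_u, hi_u)
```

Because the bounds halve toward the incumbent each generation, later generations spend their evaluation budget near promising architectures instead of re-sampling the full space, which is the mechanism the abstract credits for the reduced search time; warm-starting plays the same role as transferring an architecture learned on related data.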
Keywords: dynamic search space; evolutionary algorithms; long short-term memory; neural architecture search; transfer learning
JEL-codes: A1 B4 C0 C1 C2 C3 C4 C5 C8 M0 Q2 Q3 Q4
Date: 2025
Downloads: (external link)
https://www.mdpi.com/2571-9394/7/3/43/pdf (application/pdf)
https://www.mdpi.com/2571-9394/7/3/43/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jforec:v:7:y:2025:i:3:p:43-:d:1722788
Forecasting is currently edited by Ms. Joss Chen
Bibliographic data for series maintained by MDPI Indexing Manager.