EnergAI: A Large Language Model-Driven Generative Design Method for Early-Stage Building Energy Optimization

Jing Zhong, Peilin Li, Ran Luo, Jun Yin, Yizhen Ding, Junjie Bai, Chuxiang Hong, Xiang Deng, Xintong Ma and Shuai Lu
Additional contact information
Jing Zhong: Shenzhen International Graduate School, Tsinghua University, Shenzhen 518055, China
Peilin Li: College of Design and Engineering, National University of Singapore, Singapore 117566, Singapore
Ran Luo: School of Architecture, South China University of Technology, Guangzhou 510641, China
Jun Yin: Shenzhen International Graduate School, Tsinghua University, Shenzhen 518055, China
Yizhen Ding: School of Architecture, Southeast University, Nanjing 210096, China
Junjie Bai: School of Architecture, South China University of Technology, Guangzhou 510641, China
Chuxiang Hong: College of Architecture and Urban Planning, Tongji University, Shanghai 200092, China
Xiang Deng: Shenzhen International Graduate School, Tsinghua University, Shenzhen 518055, China
Xintong Ma: Shenzhen International Graduate School, Tsinghua University, Shenzhen 518055, China
Shuai Lu: Shenzhen International Graduate School, Tsinghua University, Shenzhen 518055, China

Energies, 2025, vol. 18, issue 22, 1-27

Abstract: The early stage of architectural design plays a decisive role in determining building energy performance, yet conventional evaluation is typically deferred to later phases, limiting timely, data-informed feedback. This paper proposes EnergAI, a generative design framework that incorporates energy optimization objectives directly into the scheme generation process through large language models (e.g., GPT-4o, DeepSeek-V3.1-Think, Qwen-Max, and Gemini-2.5 Pro). A dedicated dataset, LowEnergy-FormNet, comprising 2160 cases with site parameters, massing descriptors, and simulation outputs, was constructed to model the relationships among site, form, and energy. The framework encodes building massing into a parametric vector representation and employs hierarchical prompt strategies to establish a closed simulation loop with ClimateStudio. Experimental evaluations show that geometry-oriented and fuzzy-goal prompts achieve average annual reductions of approximately 16–17% in energy use intensity (EUI) and 3–4% in energy cost compared with human designs, while performance-oriented structured prompts deliver the most reliable improvements, eliminating high-energy outliers and yielding an average EUI-saving rate above 50%. In cross-model comparisons under an identical toolchain, GPT-4o delivered the strongest and most stable optimization, achieving 63.3% mean EUI savings, nearly 13% higher than the DeepSeek-V3.1-Think, Qwen-Max, and Gemini-2.5 Pro baselines. These results demonstrate the feasibility, and suggest the robustness, of embedding performance constraints at the generation stage, offering a practical path toward proactive, data-informed early design.
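To make the described workflow concrete, the Python sketch below illustrates a generate-simulate-evaluate loop of the kind the abstract outlines. It is a minimal illustration under stated assumptions: the MassingVector fields, build_prompt, propose_massing, and simulate_eui are hypothetical placeholders standing in for the actual LLM call (e.g., GPT-4o) and the ClimateStudio simulation, and the EUI-saving metric is assumed to be the standard relative reduction against a baseline design; none of this is the authors' implementation.

    # Hypothetical sketch of the generate -> simulate -> evaluate loop.
    # All fields and helper functions are illustrative assumptions, not
    # the EnergAI implementation; in the paper the generation step is an
    # LLM (e.g., GPT-4o) and the simulation step is ClimateStudio.
    from dataclasses import dataclass


    @dataclass
    class MassingVector:
        """Parametric encoding of a building massing (assumed fields)."""
        footprint_width_m: float
        footprint_depth_m: float
        num_floors: int
        orientation_deg: float
        window_to_wall_ratio: float


    def eui_saving_rate(baseline_eui: float, candidate_eui: float) -> float:
        """Relative EUI saving vs. a baseline (assumed definition of the metric)."""
        return (baseline_eui - candidate_eui) / baseline_eui


    def build_prompt(site: dict, massing: MassingVector) -> str:
        """Simplified performance-oriented structured prompt."""
        return (
            f"Site: {site}. Current massing: {massing}. "
            "Propose a revised massing vector that reduces annual EUI "
            "while respecting the site constraints."
        )


    def propose_massing(prompt: str) -> MassingVector:
        """Placeholder for the LLM call; returns a fixed adjusted massing
        so the loop runs end to end without network access."""
        return MassingVector(30.0, 18.0, 6, 15.0, 0.35)


    def simulate_eui(massing: MassingVector) -> float:
        """Placeholder for the ClimateStudio run; returns kWh/m^2/yr.
        Toy surrogate penalizing large glazing and deep floor plates."""
        return (80.0 + 60.0 * massing.window_to_wall_ratio
                + 0.5 * massing.footprint_depth_m)


    def optimize(site: dict, baseline: MassingVector,
                 iterations: int = 5) -> MassingVector:
        """Closed loop: prompt the model, simulate the proposal, keep the best."""
        best, best_eui = baseline, simulate_eui(baseline)
        for _ in range(iterations):
            candidate = propose_massing(build_prompt(site, best))
            eui = simulate_eui(candidate)
            if eui < best_eui:
                best, best_eui = candidate, eui
        return best


    if __name__ == "__main__":
        site = {"location": "Shenzhen", "plot_ratio": 2.5}
        baseline = MassingVector(40.0, 25.0, 8, 0.0, 0.6)
        best = optimize(site, baseline)
        print("EUI saving rate:",
              round(eui_saving_rate(simulate_eui(baseline), simulate_eui(best)), 3))

In this reading, the reported EUI-saving rates (e.g., the 63.3% mean for GPT-4o) would correspond to the eui_saving_rate value averaged over test cases, with the human or reference scheme as the baseline.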

Keywords: early-stage architectural design; building energy optimization; large language models (LLMs); performance-driven design
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49
Date: 2025

Downloads: (external link)
https://www.mdpi.com/1996-1073/18/22/5921/pdf (application/pdf)
https://www.mdpi.com/1996-1073/18/22/5921/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:18:y:2025:i:22:p:5921-:d:1791662

Energies is currently edited by Ms. Cassie Shen

More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jeners:v:18:y:2025:i:22:p:5921-:d:1791662