CISMN: A Chaos-Integrated Synaptic-Memory Network with Multi-Compartment Chaotic Dynamics for Robust Nonlinear Regression

Yaser Shahbazi, Mohsen Mokhtari Kashavar, Abbas Ghaffari, Mohammad Fotouhi and Siamak Pedrammehr
Additional contact information
Yaser Shahbazi: Faculty of Architecture and Urbanism, Tabriz Islamic Art University, Tabriz 5164736931, Iran
Mohsen Mokhtari Kashavar: Faculty of Architecture and Urbanism, Tabriz Islamic Art University, Tabriz 5164736931, Iran
Abbas Ghaffari: Faculty of Architecture and Urbanism, Tabriz Islamic Art University, Tabriz 5164736931, Iran
Mohammad Fotouhi: Faculty of Civil Engineering and Geosciences, Delft University of Technology, 2628 CN Delft, The Netherlands
Siamak Pedrammehr: Faculty of Design, Tabriz Islamic Art University, Tabriz 5164736931, Iran

Mathematics, 2025, vol. 13, issue 9, 1-37

Abstract: Modeling complex, non-stationary dynamics remains challenging for deterministic neural networks. We present the Chaos-Integrated Synaptic-Memory Network (CISMN), which embeds controlled chaos across four modules—Chaotic Memory Cells, Chaotic Plasticity Layers, Chaotic Synapse Layers, and a Chaotic Attention Mechanism—supplemented by a logistic-map learning-rate schedule. Rigorous stability analyses (Lyapunov exponents, boundedness proofs) and gradient-preservation guarantees underpin our design. In experiments, CISMN-1 on a synthetic acoustical regression dataset (541 samples, 22 features) achieved R² = 0.791 and RMSE = 0.059, outpacing physics-informed and attention-augmented baselines. CISMN-4 on the PMLB sonar benchmark (208 samples, 60 bands) attained R² = 0.424 and RMSE = 0.380, surpassing LSTM, memristive, and reservoir models. Across seven standard regression tasks with 5-fold cross-validation, CISMN led on diabetes (R² = 0.483 ± 0.073) and excelled in high-dimensional, low-sample regimes. Ablations reveal a scalability–efficiency trade-off: lightweight variants train in <10 s with >95% peak accuracy, while deeper configurations yield marginal gains. CISMN sustains gradient norms (~2300) versus LSTM collapse (<3), and fixed-seed protocols ensure <1.2% MAE variation. Interpretability remains challenging (feature-attribution entropy ≈ 2.58 bits), motivating future hybrid explanation methods. CISMN recasts chaos as a computational asset for robust, generalizable modeling across scientific, financial, and engineering domains.
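The logistic-map learning-rate schedule mentioned in the abstract can be illustrated with a minimal Python sketch. This is an assumption about the general idea only: the paper's exact update rule, constants (r, x0), and function names are not given in this record and are chosen here for illustration.

def logistic_map_lr_schedule(base_lr, num_epochs, r=3.9, x0=0.5):
    """Modulate a base learning rate with logistic-map iterates
    x_{n+1} = r * x_n * (1 - x_n), which behave chaotically for r near 4."""
    x = x0
    lrs = []
    for _ in range(num_epochs):
        x = r * x * (1.0 - x)      # chaotic iterate, remains in (0, 1)
        lrs.append(base_lr * x)    # scale the base rate by the chaotic state
    return lrs

# Example: five chaotically modulated rates around a 1e-3 base rate
print(logistic_map_lr_schedule(1e-3, 5))

For r beyond roughly 3.57 the iterates are aperiodic, so a schedule of this kind perturbs the step size irregularly rather than decaying it monotonically, which is one way controlled chaos could be injected into training.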

Keywords: Chaos-Integrated Synaptic-Memory Network (CISMN); chaos theory; artificial neural networks; dynamic learning; machine learning; complex systems; nonlinear dynamics
JEL-codes: C
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/9/1513/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/9/1513/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:9:p:1513-:d:1649142

Mathematics is currently edited by Ms. Emma He

Handle: RePEc:gam:jmathe:v:13:y:2025:i:9:p:1513-:d:1649142