Least squares support vector regression for differential equations on unbounded domains
A. Pakniyat, K. Parand and M. Jani
Chaos, Solitons & Fractals, 2021, vol. 151, issue C
Abstract:
In this paper, a numerical method based on least-squares support vector regression (LS-SVR) and spectral methods is developed for solving differential equations on unbounded domains. In the proposed method, Hermite functions are used as the orthogonal kernel of the support vector regression. The resulting optimization problem is reduced to a linear system in both the collocation and Galerkin versions of the method. These systems are then analyzed, along with a discussion of the sparsity of the matrices involved. Through several numerical examples, including fractional differential equations, the accuracy and efficiency of the method are illustrated and compared with existing methods.
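The collocation idea summarized in the abstract can be sketched in a minimal form: expand the solution in Hermite functions (which decay on the whole real line), enforce the differential equation at Hermite-Gauss nodes, and solve the resulting linear system in a least-squares sense. This is a plain least-squares collocation sketch, not the paper's LS-SVR dual formulation; the model problem u'(x) + x u(x) = 0, u(0) = 1, with exact solution exp(-x^2/2), is chosen here only for illustration.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss, hermval, hermder

# Basis: Hermite functions psi_n(x) = H_n(x) * exp(-x^2 / 2),
# well suited to unbounded domains because they decay as |x| -> infinity.
N = 8
nodes, _ = hermgauss(N)  # collocation at Hermite-Gauss nodes

def basis(x, n):
    """psi_n(x) = H_n(x) exp(-x^2/2)."""
    c = np.zeros(n + 1)
    c[n] = 1.0
    return hermval(x, c) * np.exp(-x**2 / 2)

def dbasis(x, n):
    """psi_n'(x) = (H_n'(x) - x H_n(x)) exp(-x^2/2)."""
    c = np.zeros(n + 1)
    c[n] = 1.0
    return (hermval(x, hermder(c)) - x * hermval(x, c)) * np.exp(-x**2 / 2)

# Model problem (illustrative assumption): u' + x u = 0, u(0) = 1,
# with exact solution u(x) = exp(-x^2 / 2).
# Rows 0..N-1: residual of the ODE at the collocation nodes;
# last row: the condition u(0) = 1.
A = np.zeros((N + 1, N))
b = np.zeros(N + 1)
for j in range(N):
    A[:N, j] = dbasis(nodes, j) + nodes * basis(nodes, j)
    A[N, j] = basis(0.0, j)
b[N] = 1.0

# Least-squares solve of the (overdetermined) linear system.
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)

def u(x):
    return sum(coeffs[j] * basis(x, j) for j in range(N))

print(u(1.0), np.exp(-0.5))
```

Because the exact solution equals the degree-0 Hermite function, the least-squares solution recovers it to machine precision; for genuinely infinite expansions the same construction converges spectrally as N grows.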
Keywords: Least squares support vector regression; Unbounded domain; Fractional differential equations; Hermite kernel; Galerkin LS-SVR; Collocation LS-SVR
Date: 2021
Citations: 2
Downloads: http://www.sciencedirect.com/science/article/pii/S0960077921005865 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:chsofr:v:151:y:2021:i:c:s0960077921005865
DOI: 10.1016/j.chaos.2021.111232
Chaos, Solitons & Fractals is currently edited by Stefano Boccaletti and Stelios Bekiros