Learning from dependent observations
Ingo Steinwart, Don Hush and Clint Scovel
Journal of Multivariate Analysis, 2009, vol. 100, issue 1, 175-194
Abstract:
In most papers establishing consistency for learning algorithms it is assumed that the observations used for training are realizations of an i.i.d. process. In this paper we go far beyond this classical framework by showing that support vector machines (SVMs) only require that the data-generating process satisfies a certain law of large numbers. We then consider the learnability of SVMs for α-mixing (not necessarily stationary) processes for both classification and regression, where for the latter we explicitly allow unbounded noise.
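The abstract above concerns SVM consistency when training observations are dependent, e.g. drawn from an α-mixing process. As a minimal illustration (not the paper's construction), the sketch below generates an AR(1) sequence, a standard example of a geometrically α-mixing, non-i.i.d. process, and fits a linear SVM to it via Pegasos-style subgradient descent on the hinge loss. The labeling rule, regularization parameter, and step-size schedule are illustrative assumptions only.

```python
import random

def ar1_sequence(n, rho=0.5, seed=0):
    # AR(1) process x_t = rho * x_{t-1} + eps_t with Gaussian noise.
    # For |rho| < 1 this process is geometrically alpha-mixing,
    # so observations are dependent but "forget" the past quickly.
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = rho * x + rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

def train_linear_svm(xs, ys, lam=0.01, epochs=20):
    # Pegasos-style stochastic subgradient descent on the
    # L2-regularized hinge loss for a 1-D linear SVM (w, b).
    w, b, t = 0.0, 0.0, 0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            w *= (1.0 - eta * lam)          # regularization shrinkage
            if y * (w * x + b) < 1.0:       # hinge-loss subgradient step
                w += eta * y * x
                b += eta * y
    return w, b

# Dependent covariates; labels are a deterministic (hence learnable)
# function of the covariate -- an assumption made purely for illustration.
xs = ar1_sequence(2000, rho=0.7)
ys = [1 if x > 0 else -1 for x in xs]
w, b = train_linear_svm(xs, ys)
acc = sum(1 for x, y in zip(xs, ys)
          if (1 if w * x + b > 0 else -1) == y) / len(xs)
```

Despite the serial dependence in `xs`, the empirical risk still concentrates, which is the kind of law-of-large-numbers behavior the paper shows suffices for SVM consistency.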
Keywords: Support vector machine; Consistency; Non-stationary mixing process; Classification; Regression
MSC: primary 68T05; secondary 62G08, 62H30, 62M45, 68Q32
Date: 2009
Citations: 6 (in EconPapers)
Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0047-259X(08)00109-7
Full text for ScienceDirect subscribers only
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:eee:jmvana:v:100:y:2009:i:1:p:175-194
Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/supportfaq.cws_home/regional
https://shop.elsevie ... _01_ooc_1&version=01
Journal of Multivariate Analysis is currently edited by de Leeuw, J.
More articles in Journal of Multivariate Analysis from Elsevier
Bibliographic data for series maintained by Catherine Liu.