EconPapers

A comparative study of multi‐class support vector machines in the unifying framework of large margin classifiers

Yann Guermeur, André Elisseeff and Dominique Zelus

Applied Stochastic Models in Business and Industry, 2005, vol. 21, issue 2, 199-214

Abstract: Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real‐valued functions). Only in recent years has multi‐class discriminant analysis been studied independently. Extending several standard results, among which a famous theorem by Bartlett, we have derived distribution‐free uniform strong laws of large numbers devoted to multi‐class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, the aforementioned theorems are applied to the architecture shared by all the multi‐class SVMs proposed so far, which provides us with a simple theoretical framework to study them, compare their performance and design new machines. Copyright © 2005 John Wiley & Sons, Ltd.
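The abstract refers to multi-class large margin classifiers, in which each class is assigned a real-valued score function and an example's margin is the gap between the score of its true class and the best rival score. As a minimal illustration of that margin notion only (not the authors' construction, and the function and variable names below are hypothetical):

```python
import numpy as np

def multiclass_margin(scores: np.ndarray, y: int) -> float:
    """Margin of one example: score of its true class y minus the
    largest score among the other classes. A positive margin means
    the example is correctly classified, with room to spare."""
    rival = np.delete(scores, y)           # scores of all classes except y
    return float(scores[y] - rival.max())  # gap to the strongest rival

# Hypothetical scores for a 3-class problem, true class 0:
scores = np.array([2.0, 0.5, -1.0])
print(multiclass_margin(scores, 0))  # 2.0 - 0.5 = 1.5
```

Bounds of the kind described in the abstract control generalization error in terms of how many training examples achieve a margin below a chosen threshold, with a confidence term measured by a covering number.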

Date: 2005

Downloads: https://doi.org/10.1002/asmb.534

Persistent link: https://EconPapers.repec.org/RePEc:wly:apsmbi:v:21:y:2005:i:2:p:199-214

More articles in Applied Stochastic Models in Business and Industry from John Wiley & Sons
Page updated 2025-03-20
Handle: RePEc:wly:apsmbi:v:21:y:2005:i:2:p:199-214