Elements of Computational Learning Theory

Ke-Lin Du and M. N. S. Swamy
Ke-Lin Du: Concordia University, Department of Electrical and Computer Engineering
M. N. S. Swamy: Concordia University, Department of Electrical and Computer Engineering

Chapter 3 in Neural Networks and Statistical Learning, 2019, pp. 65-79, from Springer

Abstract: PAC learning theory is the foundation of computational learning theory. The VC-dimension, Rademacher complexity, and the empirical risk-minimization principle are three concepts used to derive a generalization error bound for a trained machine. The fundamental theorem of learning theory relates PAC learnability, the VC-dimension, and the empirical risk-minimization principle. Another basic theorem of computational learning theory is the no-free-lunch theorem. These topics are addressed in this chapter.
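For orientation, two standard bounds of this kind read as follows (these are common textbook forms, not quoted from the chapter itself, and the exact constants vary across sources). For a hypothesis class H with VC-dimension d and an i.i.d. sample of size n, with probability at least 1 - delta, every h in H satisfies, in LaTeX notation:

  R(h) \le \hat{R}_n(h) + \sqrt{\frac{d\left(\ln(2n/d) + 1\right) + \ln(4/\delta)}{n}}

and, in terms of the Rademacher complexity \mathfrak{R}_n(H) of the class, for losses bounded in [0, 1]:

  R(h) \le \hat{R}_n(h) + 2\,\mathfrak{R}_n(H) + \sqrt{\frac{\ln(1/\delta)}{2n}}

where R(h) denotes the true risk and \hat{R}_n(h) the empirical risk on the sample.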

Date: 2019

Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-1-4471-7452-3_3

Ordering information: This item can be ordered from
http://www.springer.com/9781447174523

DOI: 10.1007/978-1-4471-7452-3_3


 
Handle: RePEc:spr:sprchp:978-1-4471-7452-3_3