EconPapers    
Bilevel Optimization of Regularization Hyperparameters in Machine Learning

Takayuki Okuno and Akiko Takeda
Additional contact information
Takayuki Okuno: RIKEN AIP
Akiko Takeda: The University of Tokyo

Chapter 6 in Bilevel Optimization, 2020, pp 169–194, from Springer

Abstract: Most mainstream machine learning (ML) models involve parameters that must be fixed in advance; such parameters are commonly called hyperparameters. The prediction performance of ML models depends significantly on the choice of hyperparameters, so establishing methodology for tuning them properly is recognized as one of the most important problems in ML. In this chapter, we introduce the role of bilevel optimization in selecting hyperparameters for regression and classification problems.
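The bilevel structure described in the abstract can be illustrated with a minimal sketch (this is an assumption-laden toy example, not the chapter's algorithm): the lower-level problem fits a ridge-regression model on training data for a given penalty λ, and the upper-level problem selects the λ whose lower-level solution minimizes validation error.

```python
import numpy as np

# Hypothetical illustration of bilevel hyperparameter tuning (not the
# chapter's method). Lower level: fit ridge weights on a training split,
# in closed form. Upper level: choose lambda to minimize validation error.

rng = np.random.default_rng(0)
n, d = 40, 5
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=n)

Xtr, ytr = X[:30], y[:30]   # training split (lower-level data)
Xva, yva = X[30:], y[30:]   # validation split (upper-level data)

def lower_level(lam):
    """Lower-level problem: argmin_w ||Xtr w - ytr||^2 + lam * ||w||^2."""
    return np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(d), Xtr.T @ ytr)

def upper_level(lam):
    """Upper-level objective: validation error at the lower-level solution."""
    w = lower_level(lam)
    return np.mean((Xva @ w - yva) ** 2)

# Simple upper-level search over a log-spaced grid of candidate lambdas.
lams = np.logspace(-4, 2, 25)
best_lam = min(lams, key=upper_level)
print(best_lam, upper_level(best_lam))
```

Grid search is used here only to keep the upper level trivial; the chapter's point is that gradient-based bilevel methods can replace such exhaustive search, which is essential when the regularizer (e.g. a nonsmooth ℓq penalty) has no closed-form lower-level solution.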

Keywords: Machine learning; Hyperparameter optimization; Nonsmooth bilevel optimization; Sparse regularizer; ℓq-regularizer (search for similar items in EconPapers)
Date: 2020

There are no downloads for this item; see the EconPapers FAQ for hints about obtaining it.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Export reference: BibTeX RIS (EndNote, ProCite, RefMan) HTML/Text

Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-030-52119-6_6

Ordering information: This item can be ordered from
http://www.springer.com/9783030521196

DOI: 10.1007/978-3-030-52119-6_6

Access Statistics for this chapter

More chapters in Springer Optimization and Its Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

Page updated 2025-04-01
Handle: RePEc:spr:spochp:978-3-030-52119-6_6