Treatment learning with Gini constraints by Heaviside composite optimization and a progressive method
Yue Fang (The Chinese University of Hong Kong),
Junyi Liu (Tsinghua University) and
Jong-Shi Pang (University of Southern California)
Computational Optimization and Applications, 2025, vol. 92, issue 2, No 3, 513 pages
Abstract:
This paper proposes a Heaviside composite optimization approach and presents a progressive method for solving multi-treatment learning problems with non-convex constraints. A Heaviside composite function is a composite of a Heaviside function (i.e., the indicator function of either the open interval $$(0,\infty)$$ or the closed interval $$[0,\infty)$$) with a possibly nondifferentiable function. On the modeling side, we show how Heaviside composite optimization provides a rigorous mathematical formulation for learning multi-treatment rules subject to Gini constraints. A Heaviside composite function admits an equivalent discrete formulation, so the resulting optimization problem can in principle be solved by integer programming (IP) methods. Nevertheless, for constrained treatment learning problems with large datasets, a straightforward application of off-the-shelf IP solvers is usually ineffective in achieving global optimality. To alleviate this computational burden, our major contribution is a progressive method that leverages the effectiveness of state-of-the-art IP solvers on problems of modest size. We establish the theoretical advantage of the progressive method through its connection to continuous optimization and show that the computed solution is locally optimal for a broad class of Heaviside composite optimization problems. The superior numerical performance of the proposed method is demonstrated by extensive computational experiments. An appendix briefly discusses how score-based and tree-based multi-class classification problems can also be formulated as Heaviside composite optimization problems and thus treated by the same progressive method.
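As an illustrative sketch not drawn from the paper itself, the equivalent discrete formulation mentioned in the abstract can be understood through the standard big-M encoding of a single closed Heaviside term. Assuming a bound $$|g(x)| \le M$$ on the feasible region and a small tolerance $$\varepsilon > 0$$ (both constants are assumptions of this sketch, not quantities specified in the abstract), the value $$z = \mathbb{1}_{[0,\infty)}(g(x))$$ can be modeled with a binary variable via
$$ -M(1-z) \;\le\; g(x) \;\le\; -\varepsilon + (M+\varepsilon)\,z, \qquad z \in \{0,1\}. $$
Here $$z=1$$ forces $$g(x) \ge 0$$ and $$z=0$$ forces $$g(x) \le -\varepsilon$$, so replacing each Heaviside composite term by its binary surrogate yields an integer program, up to the $$\varepsilon$$-tolerance around $$g(x)=0$$; this is one generic route to the IP formulation, not necessarily the one used by the authors.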
Keywords: Discontinuous (Heaviside) optimization; Constrained treatment learning problem; Multi-class classification; Progressive method
Date: 2025
Downloads: http://link.springer.com/10.1007/s10589-025-00706-8 Abstract (text/html); access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:coopap:v:92:y:2025:i:2:d:10.1007_s10589-025-00706-8
Ordering information: This journal article can be ordered from http://www.springer.com/math/journal/10589
DOI: 10.1007/s10589-025-00706-8
Computational Optimization and Applications is currently edited by William W. Hager
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.