Mean-Risk Stochastic Linear Programming Methods
Lewis Ntaimo (Texas A&M University)
Chapter 7 in Computational Stochastic Programming, 2024, pp. 277-348, Springer
Abstract:
In this chapter, we study decomposition methods for mean-risk two-stage stochastic linear programming (MR-SLP). We use structural properties of MR-SLP derived in Chap. 2 and decomposition techniques from Chap. 6 to derive solution algorithms for MR-SLP for quantile and deviation risk measures. Definitions of risk measures and deterministic equivalent problem (DEP) formulations are derived in Chap. 2. The risk measures quantile deviation (QDEV), conditional value-at-risk (CVaR), and expected excess (EE) have DEPs with dual block angular structure amenable to Benders decomposition. Therefore, in Sect. 7.2 we derive an L-shaped algorithm, termed the $$\mathbb {D}$$-AGG algorithm for $$\mathbb {D} \in \{\text{QDEV, CVaR, EE}\}$$, involving a single (aggregated) optimality cut at each iteration. This is followed in Sect. 7.3 by the derivation of the $$\mathbb {D}$$-SEP algorithm, which involves two separate optimality cuts: one for the expectation term and the other for the quantile (deviation) term of the DEP objective function. For the remainder of the chapter, we turn to the derivation of two subgradient-based algorithms for the deviation risk measure absolute semideviation (ASD), termed the ASD-AGG and ASD-SEP algorithms. Unlike MR-SLP with QDEV, CVaR, and EE, the DEP for ASD has a block angular structure due to a set of linking constraints. Therefore, the L-shaped method is not applicable in this case, which is why we consider a subgradient-based approach to tackle the problem. We derive the single optimality cut ASD-AGG algorithm in Sect. 7.4 and the separate optimality cut ASD-SEP algorithm in Sect. 7.5. Implementing (coding) algorithms for MR-SLP on a computer is not a trivial matter. Therefore, we include detailed numerical examples to illustrate the algorithms and provide some insights and guidelines for computer implementation.
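As a small illustration of the quantile risk measures the chapter treats (this sketch is not from the chapter itself), CVaR of a discrete scenario loss distribution can be computed via the standard Rockafellar-Uryasev minimization representation, CVaR_alpha(X) = min_eta { eta + E[(X - eta)^+] / (1 - alpha) }, whose minimizer eta* is the alpha-quantile (the value-at-risk). The function name `var_cvar` and its interface are illustrative choices, not notation from the book:

```python
import numpy as np

def var_cvar(losses, probs, alpha):
    """Compute (VaR_alpha, CVaR_alpha) of a discrete loss distribution.

    Uses the Rockafellar-Uryasev representation
        CVaR_alpha(X) = min_eta { eta + E[(X - eta)^+] / (1 - alpha) },
    evaluated at the minimizer eta* = VaR_alpha (the alpha-quantile).
    """
    losses = np.asarray(losses, dtype=float)
    probs = np.asarray(probs, dtype=float)
    # Sort scenarios by loss so cumulative probabilities are monotone.
    order = np.argsort(losses)
    losses, probs = losses[order], probs[order]
    cum = np.cumsum(probs)
    # VaR: smallest loss whose cumulative probability reaches alpha
    # (small tolerance guards against floating-point round-off in cumsum).
    k = int(np.searchsorted(cum, alpha - 1e-12))
    var = losses[k]
    # CVaR: plug eta = VaR into the Rockafellar-Uryasev objective.
    excess = np.maximum(losses - var, 0.0)
    cvar = var + probs @ excess / (1.0 - alpha)
    return var, cvar
```

For example, with ten equiprobable losses 1 through 10 and alpha = 0.8, the VaR is the 0.8-quantile loss 8, and the CVaR is the average of the worst 20% of losses, (9 + 10) / 2 = 9.5. In the chapter's algorithms this quantile term appears in the DEP objective alongside the expectation term, which is what the separate-cut (SEP) variants exploit.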
Date: 2024
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-031-52464-6_7
Ordering information: This item can be ordered from
http://www.springer.com/9783031524646
DOI: 10.1007/978-3-031-52464-6_7
More chapters in Springer Optimization and Its Applications from Springer