A simple approach to discrete-time infinite horizon problems
Jan Brinkhuis
No EI2015-41, Econometric Institute Research Papers from Erasmus University Rotterdam, Erasmus School of Economics (ESE), Econometric Institute
Abstract:
In this note, we consider a type of discrete-time infinite horizon problem that has only one ingredient: a constraint correspondence. The value function of a policy has an intuitive monotonicity property; this is the essence of the four standard theorems on the functional equation (the 'Bellman equation'). Some insight is offered into the boundedness condition on the value function that occurs in the formulation of these results: it can be interpreted as accountability of the loss of value caused by a non-optimal policy or, alternatively, as irrelevance of deviations, in the distant future, from the considered policy. Without the boundedness condition there is a gap, which can be viewed as the persistent potential positive impact of deviations, in the distant future, from the considered policy. The general stationary discrete-time infinite horizon optimization problem considered in Stokey and Lucas (1989) can be mapped to this type of problem, and so the results in the present paper apply to this general class of problems.
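To make the setting concrete, the sketch below runs value iteration in Python on a problem written in the standard Stokey and Lucas (1989) form, with a constraint correspondence Gamma, a one-period return F and a discount factor beta; the abstract notes that this form can be mapped to the paper's one-ingredient setting. The toy state space, the specific return function and all names are illustrative assumptions, not taken from the paper.

# Minimal value-iteration sketch for a stationary discrete-time infinite
# horizon problem in Stokey-Lucas form: state space X, constraint
# correspondence Gamma(x), one-period return F(x, y), discount factor beta.
# The toy problem below is an illustrative assumption, not the paper's example.

import numpy as np

X = np.linspace(0.0, 1.0, 51)   # discretized state space (assumed for illustration)
beta = 0.95                      # discount factor

def Gamma(i):
    """Indices of states feasible from X[i]; here any y <= x is feasible."""
    return np.where(X <= X[i])[0]

def F(i, j):
    """One-period return from choosing next state X[j] at current state X[i]."""
    return np.sqrt(X[i] - X[j])  # payoff from 'consuming' the gap x - y

def bellman_update(v):
    """One application of the Bellman operator:
       (Tv)(x) = max_{y in Gamma(x)} [ F(x, y) + beta * v(y) ]."""
    new_v = np.empty_like(v)
    policy = np.empty(len(X), dtype=int)
    for i in range(len(X)):
        feas = Gamma(i)
        vals = F(i, feas) + beta * v[feas]
        k = int(np.argmax(vals))
        new_v[i] = vals[k]
        policy[i] = feas[k]
    return new_v, policy

# Iterate the Bellman operator to (approximate) fixed point; with bounded
# returns and beta < 1 this is the setting of the standard theorems.
v = np.zeros(len(X))
for _ in range(1000):
    new_v, policy = bellman_update(v)
    if np.max(np.abs(new_v - v)) < 1e-8:
        break
    v = new_v

print("value at x = 1.0:", v[-1])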
Keywords: discrete-time; infinite horizon problem
Pages: 8
Date: 2015-12-04
Downloads: https://repub.eur.nl/pub/79542/EI2015-41.pdf (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:ems:eureir:79542