Algorithms for approximate linear regression design with application to a first order model with heteroscedasticity
N. Gaffke, U. Graßhoff and R. Schwabe
Computational Statistics & Data Analysis, 2014, vol. 71, issue C, 1113-1123
Abstract:
The basic structure of algorithms for the numerical computation of optimal approximate linear regression designs is briefly summarized. First order methods are contrasted with second order methods. A first order method, also called a vertex direction method, uses a local linear approximation of the optimality criterion at the current point. A second order method is a Newton or quasi-Newton method, employing a local quadratic approximation. A specific application is given to a multiple first order regression model on a cube with heteroscedasticity caused by random coefficients with a known dispersion matrix. For a general (positive definite) dispersion matrix the algorithms work for cubes of moderate dimension. If the dispersion matrix is diagonal, a restriction to invariant designs is legitimate by the equivariance of the model, and the algorithms then also work in large dimensions.
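For illustration only, below is a minimal Python sketch (not the authors' code) of a first order vertex direction method of the Fedorov-Wynn type for the D-criterion. It restricts the candidate set to the vertices of the cube [-1, 1]^k for simplicity (the optimal support in the paper's model need not be confined to vertices) and assumes the heteroscedasticity enters as an efficiency weight lam(x) = 1/(1 + f(x)' Delta f(x)), where Delta is the known dispersion matrix of the random coefficients; the exact weighting and stopping rule used in the paper may differ.

import itertools
import numpy as np

def regressors(k):
    """Candidate regressors f(x)' = (1, x')' at the 2^k vertices of the cube."""
    verts = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    return np.hstack([np.ones((verts.shape[0], 1)), verts])

def d_optimal_vdm(F, Delta, n_iter=2000, tol=1e-6):
    """Vertex direction method for the D-criterion with efficiency weights.

    F     : (n, m) matrix of candidate regressors f(x_i)'
    Delta : (m, m) dispersion matrix of the random coefficients (assumed known)
    Returns the weight vector of the approximate design on the candidate set.
    """
    n, m = F.shape
    # assumed efficiency weights lam_i = 1 / (1 + f_i' Delta f_i)
    lam = 1.0 / (1.0 + np.einsum("ij,jk,ik->i", F, Delta, F))
    w = np.full(n, 1.0 / n)                                  # uniform starting design
    for _ in range(n_iter):
        M = F.T @ (w[:, None] * lam[:, None] * F)            # information matrix M(xi)
        Minv = np.linalg.inv(M)
        d = lam * np.einsum("ij,jk,ik->i", F, Minv, F)       # directional derivatives
        j = np.argmax(d)
        if d[j] <= m * (1.0 + tol):                          # equivalence-theorem check
            break
        alpha = (d[j] - m) / (m * (d[j] - 1.0))              # optimal step toward vertex j
        w *= (1.0 - alpha)
        w[j] += alpha
    return w

if __name__ == "__main__":
    F = regressors(3)
    Delta = 0.5 * np.eye(F.shape[1])                         # illustrative diagonal dispersion
    print(np.round(d_optimal_vdm(F, Delta), 4))

Each iteration moves the current design a step toward the one-point design at the vertex with the largest directional derivative, the local linear approximation mentioned in the abstract; a second order method would instead use a quadratic model of the criterion over the weight simplex.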
Keywords: Information matrix; Optimality criterion; Steepest descent; Quasi-Newton method; Efficiency; Invariant design
Date: 2014
Downloads: http://www.sciencedirect.com/science/article/pii/S0167947313002740 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:71:y:2014:i:c:p:1113-1123
DOI: 10.1016/j.csda.2013.07.029