Minimax Estimation of the Mean Matrix of the Matrix Variate Normal Distribution under the Divergence Loss Function
Shokofeh Zinodiny (Amirkabir University of Technology, Iran),
Sadegh Rezaei (Amirkabir University of Technology, Iran) and
Saralees Nadarajah (University of Manchester, UK)
Statistica, 2017, vol. 77, issue 4, 369-384
Abstract:
The problem of estimating the mean matrix of a matrix variate normal distribution with a covariance matrix is considered under two loss functions. We construct a class of empirical Bayes estimators that dominate the maximum likelihood estimator under the first loss function, thereby showing that the maximum likelihood estimator is inadmissible, and we obtain a general class of minimax estimators. We also give a class of estimators that improve on the maximum likelihood estimator under the second loss function, which again establishes the inadmissibility of the maximum likelihood estimator.
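The paper's estimator classes and its divergence loss are not reproduced in this record. As a rough illustration of the kind of shrinkage that can dominate the maximum likelihood estimator of a matrix normal mean, the sketch below compares the MLE with the classical Efron-Morris matrix shrinkage estimator under squared Frobenius loss; the identity covariance structure, the dimensions p and m, and the function names are assumptions made only for this example, and the estimator shown is not the class derived in the paper.

```python
import numpy as np

def mle(X):
    # Maximum likelihood estimator of the mean matrix: the observation itself.
    return X

def efron_morris(X):
    # Efron-Morris matrix shrinkage estimator (illustrative only, not the
    # estimator class constructed in the paper). For X ~ N_{p x m}(M, I_p, I_m)
    # with p >= m + 2, X @ (I_m - (p - m - 1) * inv(X'X)) dominates the MLE
    # under squared Frobenius loss.
    p, m = X.shape
    shrink = (p - m - 1) * np.linalg.inv(X.T @ X)
    return X @ (np.eye(m) - shrink)

# Small Monte Carlo comparison of average squared Frobenius risk.
rng = np.random.default_rng(0)
p, m, reps = 10, 3, 2000
M = np.zeros((p, m))  # true mean matrix; shrinkage gains are largest near zero
risk_mle = risk_em = 0.0
for _ in range(reps):
    X = M + rng.standard_normal((p, m))
    risk_mle += np.sum((mle(X) - M) ** 2) / reps
    risk_em += np.sum((efron_morris(X) - M) ** 2) / reps
print("MLE risk:", risk_mle, "Efron-Morris risk:", risk_em)
```

In this configuration the MLE's risk is about p*m = 30, and the shrinkage estimator's Monte Carlo risk should come out well below that; this risk gap is the inadmissibility phenomenon the abstract refers to, here shown under a different (Frobenius) loss than the one studied in the paper.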
Keywords: Empirical Bayes estimation; Matrix variate normal distribution; Mean matrix; Minimax estimation
Date: 2017
Citations: 2 (tracked in EconPapers)
There are no downloads for this item; see the EconPapers FAQ for hints on obtaining it.
Persistent link: https://EconPapers.repec.org/RePEc:bot:rivsta:v:77:y:2017:i:4:p:369-384
Statistica is currently edited by the Department of Statistics, University of Bologna.
More articles in Statistica from the Department of Statistics, University of Bologna.
Bibliographic data for this series maintained by Giovanna Galatà.