Linear Models
W. D. Brinda
W. D. Brinda: Yale University, Statistics and Data Science
Chapter 4 in Visualizing Linear Models, 2021, pp 81-102 from Springer
Abstract:
In Chap. 2, we learned how to do least-squares regression when the set of possible prediction vectors comprises a subspace: the least-squares prediction vector is the orthogonal projection of the data onto that subspace. Now we will formulate probabilistic models for which the possible expectation vectors of the response variable comprise a subspace. Each type of model we discuss has a counterpart in our earlier study of regression, and we will reanalyze the least-squares predictions and coefficients in the context of the model. To make it easy for the reader to compare the material of the two chapters, their structures parallel each other exactly.
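To make the abstract's central fact concrete, here is a minimal sketch in standard matrix notation; the symbols X (a matrix whose columns span the subspace of possible expectation vectors) and y (the observed response vector) are assumptions made here and need not match the chapter's own notation:

\[
  \hat{y} = X \left( X^{\top} X \right)^{-1} X^{\top} y
\]

Assuming X has full column rank, this \hat{y} is the orthogonal projection of y onto the column space of X, which is exactly the least-squares prediction vector the abstract refers to.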
Date: 2021
Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-030-64167-2_4
Ordering information: This item can be ordered from
http://www.springer.com/9783030641672
DOI: 10.1007/978-3-030-64167-2_4