In [[statistics|statistical data analysis]] the '''total sum of squares''' (TSS or SST) is a quantity that appears as part of a standard way of presenting results of such analyses. It is defined as being the sum, over all observations, of the squared differences of each observation from the overall [[mean]].<ref>Everitt, B.S. (2002) ''The Cambridge Dictionary of Statistics'', CUP, ISBN 0-521-81099-X</ref>
 
In [[statistics|statistical]] [[linear model]]s (particularly in standard [[regression model]]s), the '''TSS''' is the [[sum]] of the [[square (algebra)|square]]s of the differences between the dependent variable and its [[mean]]:
 
:<math>\sum_{i=1}^{n}\left(y_{i}-\bar{y}\right)^2</math>
 
where <math>\bar{y}</math> is the mean of the observed values <math>y_i</math>.
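As a concrete illustration of this definition, the following minimal sketch computes the TSS with NumPy (the observation values are made up for the example):

```python
import numpy as np

# Hypothetical observations (illustrative values only)
y = np.array([2.0, 4.0, 6.0, 8.0])
y_bar = y.mean()  # overall mean = 5.0

# Total sum of squares: sum of squared deviations from the mean
tss = np.sum((y - y_bar) ** 2)
print(tss)  # 20.0
```

Here the deviations from the mean are −3, −1, 1, 3, so the TSS is 9 + 1 + 1 + 9 = 20.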
 
For wide classes of linear models, the total sum of squares equals the [[explained sum of squares]] plus the [[residual sum of squares]]. For a proof of this in the multivariate OLS case, see [[Explained sum of squares#Partitioning in the general OLS model|partitioning in the general OLS model]].
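The partition can be checked numerically for a simple OLS fit with an intercept (a sketch with made-up data; any least-squares fit with an intercept would do):

```python
import numpy as np

# Hypothetical data for a simple OLS regression (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.9])

# Fit y = a*x + b by ordinary least squares
a, b = np.polyfit(x, y, 1)
y_hat = a * x + b

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

# For OLS with an intercept, TSS = ESS + RSS (up to rounding)
print(np.isclose(tss, ess + rss))  # True
```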
 
In [[analysis of variance]] (ANOVA) the total sum of squares is partitioned into the "within-samples" sum of squares and the "between-samples" sum of squares.
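This one-way ANOVA partition can be verified directly (a sketch with two hypothetical samples; the values are illustrative only):

```python
import numpy as np

# Two hypothetical samples (illustrative values only)
groups = [np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])]
all_obs = np.concatenate(groups)
grand_mean = all_obs.mean()

# Within-samples: squared deviations from each group's own mean
ss_within = sum(np.sum((g - g.mean()) ** 2) for g in groups)
# Between-samples: group means' squared deviations from the grand mean,
# weighted by group size
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
# Total: squared deviations from the grand mean
ss_total = np.sum((all_obs - grand_mean) ** 2)

print(np.isclose(ss_total, ss_within + ss_between))  # True
```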
In [[multivariate analysis of variance]] (MANOVA) the following equation applies<ref name="MardiaK1979Multivariate">{{Cite book
| author = [[K. V. Mardia]], J. T. Kent and J. M. Bibby
| title = Multivariate Analysis
| publisher = [[Academic Press]]
| year = 1979
| isbn = 0-12-471252-5
}} Especially chapters 11 and 12.</ref>
:<math>\mathbf{T} = \mathbf{W} + \mathbf{B},</math>
where '''T''' is the total sum of squares and products (SSP) [[Matrix (mathematics)|matrix]], '''W''' is the within-samples SSP matrix and '''B''' is the between-samples SSP matrix.
Similar terminology may also be used in [[linear discriminant analysis]], where '''W''' and '''B''' are respectively referred to as the within-groups and between-groups SSP matrices.<ref name="MardiaK1979Multivariate"/>
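The matrix identity '''T''' = '''W''' + '''B''' can likewise be checked numerically. The sketch below uses two hypothetical samples of two-dimensional observations (values are illustrative only):

```python
import numpy as np

# Two hypothetical samples of 2-dimensional observations (illustrative values)
g1 = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0]])
g2 = np.array([[5.0, 6.0], [6.0, 5.0], [7.0, 7.0]])
groups = [g1, g2]

all_obs = np.vstack(groups)
grand_mean = all_obs.mean(axis=0)

def ssp(x, center):
    """Sum-of-squares-and-products matrix of the rows of x about `center`."""
    d = x - center
    return d.T @ d

T = ssp(all_obs, grand_mean)                        # total SSP matrix
W = sum(ssp(g, g.mean(axis=0)) for g in groups)     # within-samples SSP
B = sum(len(g) * np.outer(g.mean(axis=0) - grand_mean,
                          g.mean(axis=0) - grand_mean)
        for g in groups)                            # between-samples SSP

print(np.allclose(T, W + B))  # True
```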
 
==See also==
*[[Sum of squares (statistics)]]
*[[Lack-of-fit sum of squares]]
 
==References==
{{Reflist}}
 
[[Category:Regression analysis]]
[[Category:Least squares]]

Revision as of 00:57, 26 February 2014
