Wikipedia 10K Redux by Reagle from Starling archive. Bugs abound!!!

Current: 983114670 Dick Beldin at Sun, 25 Feb 2001 15:24:30 +0000.


back to [[Statistical Theory]]

The '''Likelihood Principle''' asserts that all the [[information]] in a [[Sample]] relevant to inference about the parameters is contained in the '''likelihood function''': the function of the [[unknown parameters]] that gives the probability of the sample actually observed.

Suppose, for example, that we have observed ''N'' independent flips of a coin which we regard as having a constant probability ''p'' of falling heads up. The likelihood function is then the product of ''N'' factors, each of which is either ''p'' or ''1-p''. If we observe ''X'' heads and ''N-X'' tails, the likelihood function is

     L(p) ~ p^X (1-p)^(N-X)      i.e., proportional to the product.
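The coin-flip likelihood above can be sketched numerically. The following is an illustrative Python snippet (not part of the original article; the names ''likelihood'' and ''full_probability'' are invented for this sketch). It shows that omitting or including the binomial coefficient ''C(N,X)'' leaves the maximizing value of ''p'' unchanged.

```python
from math import comb

def likelihood(p, N, X):
    # Likelihood up to the constant C(N, X): p^X * (1-p)^(N-X)
    return p**X * (1 - p)**(N - X)

def full_probability(p, N, X):
    # Binomial probability including the constant factor C(N, X)
    return comb(N, X) * likelihood(p, N, X)

# With N = 10 flips and X = 7 heads, both functions peak at the
# same p, since C(N, X) does not depend on p.
grid = [i / 100 for i in range(1, 100)]
mle_partial = max(grid, key=lambda p: likelihood(p, 10, 7))
mle_full = max(grid, key=lambda p: full_probability(p, 10, 7))
print(mle_partial, mle_full)  # both 0.7 = X/N
```

On a grid of candidate values, both versions are maximized at ''p = X/N = 0.7'', which is why only the part of the probability involving ''p'' matters for inference.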
No multiplicative constant ''C(N,X)'' is included, because only the part of the probability that involves the parameter ''p'' is relevant. In particular, the principle implies that it does not matter whether you planned in advance to observe ''N'' trials or simply decided to stop on a whim. The '''likelihood principle''' remains controversial. ---- [[Dick Beldin]]