Mathematical definition
The concept is most general when defined as follows: a statistic T(X) is sufficient for the underlying parameter θ precisely if the conditional probability distribution of the data X, given the statistic T(X), is independent of the parameter θ,[3] i.e.

Pr(X = x | T(X) = t, θ) = Pr(X = x | T(X) = t),

or in shorthand

Pr(x | t, θ) = Pr(x | t).
Fisher-Neyman factorization theorem
Fisher's factorization theorem or factorization criterion provides a convenient characterization of a sufficient statistic. If the probability density function is ƒθ(x), then T is sufficient for θ if and only if functions g and h can be found such that

ƒθ(x) = h(x) gθ(T(x)),
i.e. the density ƒ can be factored into a product such that one factor, h, does not depend on θ and the other factor, which does depend on θ, depends on x only through T(x).
Let X1, X2, ..., Xn denote a random sample from a distribution having the pdf f(x; θ) for γ < θ < δ. Let Y = u(X1, X2, ..., Xn) be a statistic whose pdf is g(y; θ). Then Y = u(X1, X2, ..., Xn) is a sufficient statistic for θ if and only if, for some function H,

f(x1; θ) f(x2; θ) ··· f(xn; θ) = g(u(x1, x2, ..., xn); θ) H(x1, x2, ..., xn).
First, suppose that

f(x1; θ) ··· f(xn; θ) = g1(u1(x1, ..., xn); θ) H(x1, ..., xn),

writing Y1 = u1(X1, ..., Xn) for the statistic Y and g1 for its pdf g.
We shall make the transformation yi = ui(x1, x2, ..., xn), for i = 1, ..., n, having inverse functions xi = wi(y1, y2, ..., yn), for i = 1, ..., n, and Jacobian J = det[∂xi/∂yj]. Thus,

g(y1, y2, ..., yn; θ) = f(w1; θ) ··· f(wn; θ) |J| = g1(y1; θ) H(w1, ..., wn) |J|.
The left-hand member is the joint pdf g(y1, y2, ..., yn; θ) of Y1 = u1(X1, ..., Xn), ..., Yn = un(X1, ..., Xn). In the right-hand member, g1(y1; θ) is the pdf of Y1, so that H(w1, ..., wn)|J| is the quotient of g(y1, ..., yn; θ) and g1(y1; θ); that is, it is the conditional pdf h(y2, ..., yn | y1) of Y2, ..., Yn given Y1 = y1.
But H(x1, ..., xn), and thus H(w1, ..., wn), was given not to depend upon θ. Since θ was not introduced in the transformation and accordingly not in the Jacobian J, it follows that h(y2, ..., yn | y1) does not depend upon θ and that Y1 is a sufficient statistic for θ.
The converse is proven by taking:

g(y1, ..., yn; θ) = g1(y1; θ) h(y2, ..., yn | y1),
where h(y2, ..., yn | y1) does not depend upon θ, because Y2, ..., Yn depend only upon X1, ..., Xn, which are independent of θ when conditioned on Y1, a sufficient statistic by hypothesis. Now divide both members by the absolute value of the non-vanishing Jacobian J, and replace y1, ..., yn by the functions u1(x1, ..., xn), ..., un(x1, ..., xn) in x1, ..., xn. This yields

f(x1; θ) ··· f(xn; θ) = g1(u1(x1, ..., xn); θ) · h(u2, ..., un | u1) / |J*|,
where J* is the Jacobian with y1, ..., yn replaced by their values in terms of x1, ..., xn. The left-hand member is necessarily the joint pdf f(x1; θ) ··· f(xn; θ) of X1, ..., Xn. Since h(y2, ..., yn | y1), and thus h(u2, ..., un | u1), does not depend upon θ, then

H(x1, ..., xn) = h(u2, ..., un | u1) / |J*|

is a function that does not depend upon θ.
We use the shorthand fθ(x, t) to denote the joint probability of (X, T(X)). Since T is a function of X, we have fθ(x, t) = fθ(x) whenever t = T(x), and thus:

fθ(x) = fθ(x, t) = fθ(x | t) fθ(t),

with the last equality being true by the definition of conditional probability distributions. If T is sufficient, the first factor does not depend on θ, so fθ(x) = a(x)bθ(t) with a(x) = f(x | t) and bθ(t) = fθ(t).
Reciprocally, if fθ(x) = a(x)bθ(t), we have

fθ(t) = Σ_{x : T(x) = t} fθ(x, t) = Σ_{x : T(x) = t} fθ(x) = Σ_{x : T(x) = t} a(x)bθ(t) = ( Σ_{x : T(x) = t} a(x) ) bθ(t),

with the first equality by the definition of the pdf for multiple variables, the second by the remark above, the third by hypothesis, and the fourth because the summation is not over t.
Thus, the conditional probability distribution is:

fθ(x | t) = fθ(x, t) / fθ(t) = fθ(x) / fθ(t) = a(x)bθ(t) / ( ( Σ_{x' : T(x') = t} a(x') ) bθ(t) ) = a(x) / Σ_{x' : T(x') = t} a(x'),

with the first equality by definition of conditional probability density, the second by the remark above, the third by the equality proven above, and the fourth by simplification. This expression does not depend on θ and thus T is a sufficient statistic.[4]
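The discrete argument above can be checked numerically. The following Python sketch (illustrative only, anticipating the Bernoulli example below) computes Pr(X = x | T(X) = t) directly and confirms that p cancels:

```python
# A numerical sanity check of the definition of sufficiency (illustrative
# sketch, not part of the original argument): for n Bernoulli(p) trials,
# Pr(X = x | T(X) = t) should be the same for every value of p.
from math import comb

def conditional_prob(x, p):
    """Pr(X = x | T(X) = t) for a Bernoulli(p) sample x with t = sum(x)."""
    n, t = len(x), sum(x)
    pr_x = p**t * (1 - p)**(n - t)               # joint probability of this sample
    pr_t = comb(n, t) * p**t * (1 - p)**(n - t)  # probability that T(X) = t
    return pr_x / pr_t                           # the p-dependence cancels

x = (1, 0, 1, 1)                                 # t = 3 successes out of n = 4
vals = [conditional_prob(x, p) for p in (0.1, 0.5, 0.9)]
# every entry equals 1 / C(4, 3) = 0.25, independently of p
```

The conditional distribution reduces to a(x) = 1/C(n, t), a function of the data alone, exactly as the proof asserts.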
Minimal sufficiency
A sufficient statistic is minimal sufficient if it can be represented as a function of any other sufficient statistic. In other words, S(X) is minimal sufficient if and only if
1-S(X) is sufficient, and
2-if T(X) is sufficient, then there exists a function f such that S(X) = f(T(X)).
Intuitively, a minimal sufficient statistic most efficiently captures all possible information about the parameter θ.
A useful characterization of minimal sufficiency is that when the density fθ exists, S(X) is minimal sufficient if and only if

fθ(x) / fθ(y) is independent of θ if and only if S(x) = S(y).
This follows as a direct consequence from the Fisher's factorization theorem stated above.
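This ratio criterion can be illustrated numerically. A Python sketch (illustrative only) using the Bernoulli model from the Examples section below, where S(x) = Σ xi:

```python
# Illustration of the ratio criterion (a sketch, using the Bernoulli model
# with S(x) = sum(x)): f_p(x) / f_p(y) is free of p exactly when S(x) = S(y).
def likelihood(x, p):
    s = sum(x)
    return p**s * (1 - p)**(len(x) - s)

def ratio_is_constant(x, y, ps=(0.2, 0.5, 0.8)):
    ratios = [likelihood(x, p) / likelihood(y, p) for p in ps]
    return max(ratios) - min(ratios) < 1e-12

same_S = ratio_is_constant((1, 0, 1), (0, 1, 1))  # S = 2 for both: constant ratio
diff_S = ratio_is_constant((1, 1, 1), (0, 1, 1))  # S = 3 vs 2: ratio varies with p
```

When the sums agree the ratio is identically 1; when they differ the ratio p/(1 − p) moves with p, so the sum is minimal sufficient here.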
A sufficient and complete statistic is necessarily minimal sufficient. A complete statistic need not exist, but there is always a minimal sufficient statistic. For example, the collection of likelihood ratios { L(X | θi) / L(X | θ0) : i = 1, 2, ... } is a minimal sufficient statistic if P(X | θ) is discrete or has a density function.
Examples
Bernoulli distribution
If X1, ..., Xn are independent Bernoulli-distributed random variables with expected value p, then the sum T(X) = X1 + ... + Xn is a sufficient statistic for p (here 'success' corresponds to Xi = 1 and 'failure' to Xi = 0; so T is the total number of successes).
This is seen by considering the joint probability distribution:

Pr{X = x} = Pr{X1 = x1, X2 = x2, ..., Xn = xn}.

Because the observations are independent, this can be written as

p^(x1) (1 − p)^(1 − x1) p^(x2) (1 − p)^(1 − x2) ··· p^(xn) (1 − p)^(1 − xn)

and, collecting powers of p and 1 − p, gives

p^(x1 + ... + xn) (1 − p)^(n − (x1 + ... + xn)) = p^(T(x)) (1 − p)^(n − T(x)),
which satisfies the factorization criterion, with h(x)=1 being just a constant.
Note the crucial feature: the unknown parameter p interacts with the data x only via the statistic T(x) = Σ xi.
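The factorization just derived can be checked numerically. This Python sketch (illustrative only) compares the joint probability with g(T(x), p) · h(x), where h(x) = 1:

```python
# The Bernoulli factorization, checked numerically (an illustrative sketch):
# the joint probability equals g(T(x), p) * h(x) with h(x) = 1.
def joint(x, p):
    out = 1.0
    for xi in x:
        out *= p**xi * (1 - p)**(1 - xi)  # independence of the observations
    return out

def g(t, n, p):
    return p**t * (1 - p)**(n - t)        # depends on the data only through t

x = (1, 1, 0, 1, 0)
checks = [abs(joint(x, p) - g(sum(x), len(x), p)) < 1e-15 for p in (0.3, 0.7)]
```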
Uniform distribution
If X1, ..., Xn are independent and uniformly distributed on the interval [0, θ], then T(X) = max(X1, ..., Xn) is sufficient for θ.
To see this, consider the joint probability density:

fθ(x1, ..., xn) = fθ(x1) fθ(x2) ··· fθ(xn).

Because the observations are independent, this can be written as

(1/θ^n) H(x1) H(θ − x1) ··· H(xn) H(θ − xn),

where H(x) is the Heaviside step function. This may be written as

(1/θ^n) H(min_i xi) H(θ − max_i xi),
which can be viewed as a function of only θ and maxi(xi) = T(x). This shows that the factorization criterion is satisfied, where h(x) = H(min_i xi) absorbs the only factor not involving θ, namely the indicator that all observations are non-negative. Note that the parameter θ interacts with the data only through the data's maximum.
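The claim can be illustrated numerically. This Python sketch (illustrative only) shows that two same-size samples sharing a maximum have identical likelihoods under every θ:

```python
# Sketch for the uniform case: the joint density theta**(-n) times the
# indicator {all x_i >= 0 and max(x) <= theta} depends on the sample only
# through its maximum, so two same-size samples sharing a maximum are
# equally likely under every theta.
def likelihood(x, theta):
    if min(x) < 0 or max(x) > theta:
        return 0.0
    return theta ** (-len(x))

x = (0.2, 0.9, 0.4)
y = (0.9, 0.1, 0.5)  # a different sample with the same maximum, 0.9
agree = all(likelihood(x, t) == likelihood(y, t) for t in (0.5, 1.0, 2.0))
```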
Poisson distribution
If X1, ..., Xn are independent and have a Poisson distribution with parameter λ, then the sum T(X) = X1 + ... + Xn is a sufficient statistic for λ.
To see this, consider the joint probability distribution:

Pr(X = x) = P(X1 = x1, X2 = x2, ..., Xn = xn).

Because the observations are independent, this can be written as

(e^(−λ) λ^(x1) / x1!) · (e^(−λ) λ^(x2) / x2!) ··· (e^(−λ) λ^(xn) / xn!),

which may be written as

e^(−nλ) λ^(x1 + x2 + ... + xn) · (1 / (x1! x2! ··· xn!)) = e^(−nλ) λ^(T(x)) · (1 / (x1! x2! ··· xn!)),
which shows that the factorization criterion is satisfied, where h(x) is the reciprocal of the product of the factorials. Note the parameter λ interacts with the data only through its sum T(X).
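Sufficiency of the sum can also be checked against the definition directly. This Python sketch (illustrative only) uses the standard fact that a sum of n independent Poisson(λ) variables is Poisson(nλ):

```python
# Sketch: for a Poisson(lam) sample, Pr(X = x | sum(X) = t) reduces to the
# multinomial probability t! / (x1! ... xn!) * n**(-t), which is free of lam
# (using the fact that a sum of n independent Poisson(lam) variables is
# Poisson(n * lam)).
from math import exp, factorial, prod

def pois(k, lam):
    return exp(-lam) * lam**k / factorial(k)

def conditional(x, lam):
    n, t = len(x), sum(x)
    joint = prod(pois(xi, lam) for xi in x)  # independence of the observations
    return joint / pois(t, n * lam)          # divide by Pr(T(X) = t)

x = (2, 0, 3)
vals = [conditional(x, lam) for lam in (0.5, 1.5, 4.0)]
# every entry equals 5!/(2! 0! 3!) * 3**(-5) = 10/243, independently of lam
```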