Reading Notes 1 of Statistics: Moments and Moment Generating Functions (cf. Statistical Inference by George Casella and Roger L. Berger)

Part 1: Moments

Definition 1 For each integer $n$, the $n$th moment of $X$, $\mu_n^{'}$, is

\[\mu_{n}^{'} = EX^n.\]

The $n$th central moment of $X$, $\mu_n$, is

\[ \mu_n = E(X-\mu)^n,\]

where $\mu=\mu_{1}^{'}=EX$.
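
To make the definition concrete, here is a minimal sympy sketch that computes a few raw and central moments by direct integration. The exponential($\lambda$) density used below is only an illustrative choice, not something fixed by the text.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
lam = sp.symbols('lam', positive=True)

# Illustrative choice: the exponential(lam) density f(x) = lam*exp(-lam*x), x > 0.
pdf = lam * sp.exp(-lam * x)

# Raw moments mu'_n = E X^n, here for n = 1, 2, 3.
raw = [sp.integrate(x**n * pdf, (x, 0, sp.oo)) for n in (1, 2, 3)]
print(raw)  # [1/lam, 2/lam**2, 6/lam**3], i.e. E X^n = n!/lam^n

# Central moments mu_n = E (X - mu)^n, with mu = E X.
mu = raw[0]
central = [sp.simplify(sp.integrate((x - mu)**n * pdf, (x, 0, sp.oo)))
           for n in (2, 3)]
print(central)  # [lam**(-2), 2/lam**3]
```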

 

Definition 2 The variance of a random variable $X$ is $Var\,X = E(X-EX)^2$.

The standard deviation of $X$ is $\sqrt{Var\,X}$.
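
Expanding the square gives the familiar shortcut relating the variance to the first two moments; it is worth recording here since it is used constantly:

\[ Var\,X = E(X-EX)^2 = EX^2 - 2(EX)^2 + (EX)^2 = EX^2 - (EX)^2 = \mu_2^{'} - \mu^2. \]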

 

 

Part 2: Moment Generating Function (mgf)

The mgf can be used to generate moments. In practice, however, it is often easier to compute moments directly than via the mgf; the main value of the mgf lies in helping to characterize a distribution.

 

Definition 3 Let $X$ be a random variable with cdf $F_X$. The moment generating function (mgf) of $X$, denoted by $M_X(t)$, is

\[M_{X}(t) = E e^{tX}, \]

provided that the expectation exists for $t$ in some neighborhood of $0$. If the expectation does not exist in a neighborhood of $0$, we say that the moment generating function does not exist.
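
As a standard concrete case, take $X \sim \mathrm{exponential}(\lambda)$ with density $f(x) = \lambda e^{-\lambda x}$ for $x > 0$. Then

\[ M_X(t) = \int_{0}^{\infty} e^{tx}\, \lambda e^{-\lambda x}\, dx = \frac{\lambda}{\lambda - t}, \qquad t < \lambda, \]

so the expectation exists on the neighborhood $(-\lambda, \lambda)$ of $0$ (indeed on all of $(-\infty, \lambda)$), and the mgf exists.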

 

Theorem 4 (mgf generates moments)

If $X$ has mgf $M_X(t)$, then 

\[E X^{n} = M_{X}^{(n)}(0),\]

where we define 

\[M_{X}^{(n)}(0) = \frac{d^n}{d t^{n}}M_X(t) |_{t=0}.\]

That is, the $n$th moment is equal to the $n$th derivative of $M_X(t)$ evaluated at $t=0$.
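
A quick sympy check of Theorem 4 on the exponential mgf computed above (the same illustrative parameterization as before):

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lam', positive=True)

# mgf of the exponential(lam) example: M_X(t) = lam/(lam - t) for t < lam.
M = lam / (lam - t)

# Theorem 4: E X^n is the nth derivative of M_X evaluated at t = 0.
for n in (1, 2, 3):
    print(n, sp.simplify(sp.diff(M, t, n).subs(t, 0)))
# 1 1/lam
# 2 2/lam**2
# 3 6/lam**3  -- matching E X^n = n!/lam^n from the direct integration.
```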

 

Remark 5 If the mgf exists, it characterizes an infinite set of moments. However, an infinite set of moments does not by itself uniquely determine a distribution function. If we impose an additional condition on the random variable, say that it has bounded support, then the infinite set of moments does uniquely determine the distribution function.

Remark 6 Existence of moments of all orders is not equivalent to existence of the moment generating function: the mgf may fail to exist even when every moment is finite. On the other hand, if the mgf exists in a neighborhood of $0$, then the distribution is uniquely determined. A useful analogy: having derivatives of all orders at a point does not make a function analytic in a neighborhood of that point.
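
The classic example behind both remarks is the lognormal distribution: with $Z \sim N(0,1)$ and $X = e^Z$,

\[ E X^r = E e^{rZ} = e^{r^2/2} < \infty \ \text{for all } r, \qquad \text{yet} \qquad E e^{tX} = \infty \ \text{for every } t > 0, \]

so all moments exist while the mgf does not; moreover, there are distributions other than the lognormal with exactly this moment sequence.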

 

Theorem 7 

Let $F_X(x)$ and $F_Y(y)$ be two cdfs all of whose moments exist.

a. If $X$ and $Y$ have bounded support, then $F_X(u)=F_Y(u)$ for all $u$ if and only if $E X^{r} = E Y^{r}$ for all integers $r = 0, 1, 2, \cdots$.

b. If the moment generating functions exist and $M_X(t) = M_Y(t)$ for all $t$ in some neighborhood of $0$, then $F_X(u) = F_Y(u)$ for all $u$.

 

Theorem 8 

Suppose $\{X_i,\ i=1,2,3,\cdots\}$ is a sequence of random variables, each with mgf $M_{X_i}(t)$.

Furthermore, suppose that

\[\lim_{i\to \infty}M_{X_{i}}(t) = M_{X}(t), \]

for all $t$ in a neighborhood of $0$, say for $|t|<h$, and $M_X(t)$ is an mgf.

Then there is a unique cdf $F_X$ whose moments are determined by $M_X(t)$ and, for all $x$ at which $F_X(x)$ is continuous, we have

\[\lim_{i\to \infty}F_{X_{i}}(x) = F_{X}(x).\]

That is, convergence, for $|t|<h$, of mgfs to an mgf implies convergence of cdfs.
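
A minimal numerical illustration of Theorem 8 uses the familiar binomial-to-Poisson limit (the choice $\lambda = 2$ and the grid of $t$ values below are arbitrary): with $p_n = \lambda/n$, the binomial$(n, p_n)$ mgfs $(1 - p_n + p_n e^t)^n$ converge to the Poisson$(\lambda)$ mgf $e^{\lambda(e^t - 1)}$ on a neighborhood of $0$.

```python
import numpy as np

lam = 2.0
ts = np.linspace(-0.5, 0.5, 5)   # a small neighborhood of t = 0

# Poisson(lam) mgf: exp(lam * (e^t - 1)).
poisson_mgf = np.exp(lam * (np.exp(ts) - 1.0))

# Binomial(n, lam/n) mgf: (1 - p + p*e^t)^n with p = lam/n.
for n in (10, 100, 1000, 10000):
    p = lam / n
    binom_mgf = (1.0 - p + p * np.exp(ts)) ** n
    print(n, np.max(np.abs(binom_mgf - poisson_mgf)))
# The largest gap over this t-grid shrinks toward 0 as n grows, and
# correspondingly the binomial cdfs converge to the Poisson cdf.
```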