Moment Generating Function & Chebyshev's Theorem
BY:
GROUP 4
NOOR AZIZAH
RATNAH KURNIATI
ZULFAHMI SOFYAN
JALIL SETIAWAN
MATHEMATICS DEPARTMENT
MATHEMATICS AND SCIENCE FACULTY
MAKASSAR STATE UNIVERSITY
MOMENT GENERATING FUNCTION AND CHEBYSHEV'S THEOREM
MATHEMATICAL STATISTICS GROUP TASK
PREFACE
The author gives thanks to Almighty God, whose blessing and guidance made it
possible to finish this paper, entitled "MOMENT GENERATING FUNCTION AND
CHEBYSHEV'S THEOREM".
In writing this paper the author encountered more than a few obstacles, but
they were eventually overcome with the help of various parties. The author
therefore does not forget to thank everyone who contributed greatly to
completing this paper.
The author is fully aware that this paper still has many flaws and is far
from perfect. The author therefore welcomes criticism and constructive
suggestions toward its improvement. Finally, with humility, the author hopes
this paper can benefit its readers, and the writer himself in particular.
Makassar, November 4, 2010
Writer
CONTENTS
TITLE PAGE
PREFACE
CONTENTS
CHAPTER I INTRODUCTION
A. Background
B. Problem Statement
C. Objective of Paper
D. Significance of Paper
CHAPTER II MOMENT GENERATING FUNCTION AND CHEBYSHEV'S THEOREM
A. Moment Generating Function
B. Chebyshev's Theorem
CHAPTER III CLOSING
A. Conclusion
B. Suggestion
REFERENCES
CHAPTER I
INTRODUCTION
A. Background
The concept of mathematical expectation (expected value) is very useful in
statistics. Besides its use in the development of advanced and applied
statistics, it also serves as a basic concept for defining or constructing
statistical measures such as the mean, variance, covariance, and correlation
coefficient.
Examples of concepts built on expectation are the mean of a random variable,
the variance and covariance, Chebyshev's theorem, Bienaymé's theorem, and the
moment generating function. Of these concepts, this paper is restricted to
the moment generating function and Chebyshev's theorem, which are explained
in the next chapter.
B. Problem Statement
Based on the background above, the problems addressed in this paper are:
1. What is the moment generating function?
2. What is Chebyshev's Theorem?
C. Objective of Paper
Based on the problem statement above, the objectives of this paper are:
1. To understand the moment generating function.
2. To understand Chebyshev's Theorem.
CHAPTER II
MOMENT GENERATING FUNCTION AND CHEBYSHEV’S
THEOREM
A. Moment Generating Function
The moment generating function of X is defined by
    M_X(t) = E(e^{tX})
That is,
    M_X(t) = ∑_{i=1}^{n} e^{tx_i} f(x_i) = ∑ e^{tx} f(x)   (discrete variable)
    M_X(t) = ∫_{−∞}^{∞} e^{tx} f(x) dx   (continuous variable)
We can show that the Taylor series expansion is [Problem 3.15(a)]
    M_X(t) = 1 + μt + μ′_2 t²/2! + … + μ′_r t^r/r! + …
Since the coefficients in this expansion enable us to find the moments, the
reason for the name moment generating function is apparent. From the
expansion we can show that [Problem 3.15(b)]
    μ′_r = (d^r/dt^r) M_X(t) |_{t=0}
i.e. μ′_r is the rth derivative of M_X(t) evaluated at t = 0. When no
confusion can result we often write M(t) instead of M_X(t).
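As a quick numerical illustration (not part of the source text), the moments μ′_r can be recovered by differentiating an MGF at t = 0. The sketch below assumes an exponential random variable with rate λ = 2, whose MGF M(t) = λ/(λ − t) for t < λ is standard, and approximates the derivatives with finite differences:

```python
# A minimal sketch: moments from an MGF by numerical differentiation at t = 0.
# Assumed example: exponential distribution with rate lam = 2, so that
# M(t) = lam / (lam - t), mu'_1 = 1/lam = 0.5 and mu'_2 = 2/lam^2 = 0.5.
lam = 2.0

def M(t):
    """MGF of an exponential random variable with rate lam (valid for t < lam)."""
    return lam / (lam - t)

h = 1e-5
# First moment: mu'_1 = M'(0), via a central difference.
mu1 = (M(h) - M(-h)) / (2 * h)
# Second moment: mu'_2 = M''(0), via a second-order central difference.
mu2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2

print(mu1)  # ≈ 0.5
print(mu2)  # ≈ 0.5
```

The same two-sided differences work for any MGF that is finite in a neighborhood of t = 0.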
Some Theorems on Moment Generating Functions
Theorem 3-8: If M_X(t) is the moment generating function of the random
variable X and a and b (b ≠ 0) are constants, then the moment generating
function of (X + a)/b is
    M_{(X+a)/b}(t) = e^{at/b} M_X(t/b)
Theorem 3-9: If X and Y are independent random variables having moment
generating functions M_X(t) and M_Y(t) respectively, then
    M_{X+Y}(t) = M_X(t) M_Y(t)
Generalizations of Theorem 3-9 to more than two independent random variables
are easily made. In words, the moment generating function of a sum of
independent random variables is equal to the product of their moment
generating functions.
Theorem 3-10 (Uniqueness theorem): Suppose that X and Y are random variables
having moment generating functions M_X(t) and M_Y(t) respectively. Then X
and Y have the same probability distribution if and only if M_X(t) = M_Y(t)
identically.
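The product rule for independent sums can be checked by simulation. This sketch (an illustration, not from the paper) assumes two independent Uniform(0, 1) variables, whose MGF (e^t − 1)/t is known in closed form, and compares a Monte Carlo estimate of M_{X+Y}(t) with the product of the individual MGFs:

```python
import math
import random

# Sketch: verify M_{X+Y}(t) = M_X(t) * M_Y(t) for independent X, Y ~ Uniform(0, 1).
# The exact uniform MGF is (e^t - 1)/t; t = 0.3 and n are arbitrary choices.
random.seed(0)
t = 0.3
n = 200_000
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

m_sum = sum(math.exp(t * (x + y)) for x, y in zip(xs, ys)) / n  # estimates M_{X+Y}(t)
m_x = sum(math.exp(t * x) for x in xs) / n                      # estimates M_X(t)
m_y = sum(math.exp(t * y) for y in ys) / n                      # estimates M_Y(t)

exact = ((math.exp(t) - 1) / t) ** 2  # M_X(t) * M_Y(t) in closed form
print(m_sum, m_x * m_y, exact)  # the three values agree to a few decimal places
```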
Characteristic Function
If we let t = iω, where i is the imaginary unit, in the moment generating
function, we obtain an important function called the characteristic function.
We denote this by
    Φ_X(ω) = M_X(iω) = E(e^{iωX})
It follows that
    Φ_X(ω) = ∑_{j=1}^{n} e^{iωx_j} f(x_j) = ∑ e^{iωx} f(x)   (discrete variable)
    Φ_X(ω) = ∫_{−∞}^{∞} e^{iωx} f(x) dx   (continuous variable)
The corresponding results become
    Φ_X(ω) = 1 + iμω − μ′_2 ω²/2! + … + i^r μ′_r ω^r/r! + …
where
    μ′_r = (−1)^r i^r (d^r/dω^r) Φ_X(ω) |_{ω=0}
When no confusion can result we often write Φ(ω) instead of Φ_X(ω).
Theorems for the characteristic function corresponding to Theorems 3-8, 3-9,
and 3-10 are as follows.
Theorem 3-11: If Φ_X(ω) is the characteristic function of the random
variable X and a and b (b ≠ 0) are constants, then the characteristic
function of (X + a)/b is
    Φ_{(X+a)/b}(ω) = e^{aiω/b} Φ_X(ω/b)
Theorem 3-12: If X and Y are independent random variables having
characteristic functions Φ_X(ω) and Φ_Y(ω) respectively, then
    Φ_{X+Y}(ω) = Φ_X(ω) Φ_Y(ω)
More generally, the characteristic function of a sum of independent random
variables is equal to the product of their characteristic functions.
Theorem 3-13 (Uniqueness theorem): Suppose that X and Y are random variables
having characteristic functions Φ_X(ω) and Φ_Y(ω) respectively. Then X and Y
have the same probability distribution if and only if Φ_X(ω) = Φ_Y(ω)
identically.
An important reason for introducing the characteristic function is that (37)
represents the Fourier transform of the density function f(x). From the
theory of Fourier transforms we can easily determine the density function
from the characteristic function. In fact,
    f(x) = (1/2π) ∫_{−∞}^{∞} e^{−iωx} Φ_X(ω) dω
which is often called an inversion formula, or inverse Fourier transform. In
a similar manner we can show in the discrete case that the probability
function f(x) can be obtained from (36) by using Fourier series, which is the
analog of the Fourier integral for the discrete case. See Problem 3.39.
Another reason for using the characteristic function is that it always
exists, whereas the moment generating function may not exist.
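To illustrate the inversion formula numerically (a sketch, not part of the source text), the code below recovers the standard normal density from its characteristic function Φ(ω) = e^{−ω²/2} by trapezoidal integration; the truncation limit w_max and step count are arbitrary choices, adequate because the integrand is negligible beyond |ω| = 10:

```python
import cmath
import math

# Sketch: f(x) = (1/2π) ∫ e^{-iωx} Φ(ω) dω, evaluated numerically for the
# standard normal, whose characteristic function is Φ(ω) = exp(-ω²/2).
def phi(w):
    return math.exp(-w * w / 2)

def density(x, w_max=10.0, steps=4000):
    """Trapezoidal approximation of the inversion integral at the point x."""
    h = 2 * w_max / steps
    total = 0.0 + 0.0j
    for k in range(steps + 1):
        w = -w_max + k * h
        weight = 0.5 if k in (0, steps) else 1.0  # trapezoid endpoint weights
        total += weight * cmath.exp(-1j * w * x) * phi(w)
    return (total * h / (2 * math.pi)).real  # imaginary part cancels by symmetry

print(density(0.0))  # ≈ 1/sqrt(2π) ≈ 0.3989
print(density(1.0))  # ≈ e^(-1/2)/sqrt(2π) ≈ 0.2420
```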
B. Chebyshev’s Theorem
An important theorem in probability and statistics that reveals a general
property of discrete or continuous random variables having finite mean and
variance is known as Chebyshev's Theorem or Chebyshev's Inequality. The
Russian mathematician P. L. Chebyshev (1821–1894) discovered that the
fraction of the area between any two values symmetric about the mean is
related to the standard deviation. Since the area under a probability
distribution curve or in a probability histogram adds to 1, the area between
any two numbers is the probability of the random variable assuming a value
between those numbers.
The following theorem, due to Chebyshev, gives a conservative estimate of
the probability that a random variable assumes a value within k standard
deviations of its mean for any real number k. We shall provide the proof
only for the continuous case, leaving the discrete case as an exercise.
Theorem: Let X be any random variable having mean μ and variance σ², with
0 < σ² < ∞. Then, if ε is any positive number,
    P(|X − μ| ≥ ε) ≤ σ²/ε²
or, with ε = kσ,
    P(|X − μ| ≥ kσ) ≤ 1/k²
Equivalently, the probability that X falls within k standard deviations of
the mean is at least 1 − 1/k², that is,
    P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²
Proof:
By our previous definition of the variance of X, we can write
    σ² = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx
       = ∫_{−∞}^{μ−kσ} (x − μ)² f(x) dx + ∫_{μ−kσ}^{μ+kσ} (x − μ)² f(x) dx
         + ∫_{μ+kσ}^{∞} (x − μ)² f(x) dx
       ≥ ∫_{−∞}^{μ−kσ} (x − μ)² f(x) dx + ∫_{μ+kσ}^{∞} (x − μ)² f(x) dx
since the second of the three integrals is nonnegative. Now, since
|x − μ| ≥ kσ wherever x ≥ μ + kσ or x ≤ μ − kσ, we have (x − μ)² ≥ k²σ² in
both remaining integrals. It follows that
    σ² ≥ ∫_{−∞}^{μ−kσ} k²σ² f(x) dx + ∫_{μ+kσ}^{∞} k²σ² f(x) dx
and hence that
    ∫_{−∞}^{μ−kσ} f(x) dx + ∫_{μ+kσ}^{∞} f(x) dx ≤ 1/k²
Hence
    P(μ − kσ < X < μ + kσ) = ∫_{μ−kσ}^{μ+kσ} f(x) dx ≥ 1 − 1/k²
and the theorem is established.
For k = 2 the theorem states that the random variable X has a probability of
at least 1 − 1/2² = 3/4 of falling within two standard deviations of the
mean. That is, three-fourths or more of the observations of any distribution
lie in the interval μ ± 2σ. Similarly, the theorem says that at least
eight-ninths of the observations of any distribution fall in the interval
μ ± 3σ.
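These interval statements can be checked empirically. The sketch below (an illustration, not from the paper) samples an exponential distribution with rate 1, for which μ = σ = 1, and compares the observed tail fraction P(|X − μ| ≥ kσ) with the bound 1/k²:

```python
import random

# Sketch: empirical check of Chebyshev's bound P(|X - mu| >= k*sigma) <= 1/k²
# for an assumed exponential distribution with rate 1 (mean 1, std dev 1).
random.seed(1)
mu = sigma = 1.0
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

results = {}
for k in (2, 3):
    tail = sum(1 for x in samples if abs(x - mu) >= k * sigma) / n
    results[k] = tail
    print(k, tail, 1 / k ** 2)  # observed tail fraction vs. the 1/k² bound
```

For this distribution the observed tail fractions fall well below the bounds, which is typical: the theorem is distribution-free and therefore conservative.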
Example:
A random variable X has mean μ = 8, variance σ² = 9, and an unknown
probability distribution. Find
a. P(−4 < X < 20)
b. P(|X − 8| ≥ 6)
Solution:
a. P(−4 < X < 20) = P[8 − 4(3) < X < 8 + 4(3)] ≥ 1 − 1/4² = 15/16
b. P(|X − 8| ≥ 6) = 1 − P(|X − 8| < 6) = 1 − P(−6 < X − 8 < 6)
   = 1 − P[8 − 2(3) < X < 8 + 2(3)] ≤ 1/2² = 1/4
Chebyshev's theorem holds for any distribution of observations, and for this
reason the results are usually weak. The value given by the theorem is a
lower bound only. That is, we know that the probability of a random variable
falling within two standard deviations of the mean can be no less than 3/4,
but we never know how much more it might actually be. Only when the
probability distribution is known can we determine exact probabilities. For
this reason we call the theorem a distribution-free result. When specific
distributions are assumed, as in later chapters, the results will be less
conservative. The use of Chebyshev's Theorem is therefore relegated to
situations where the form of the distribution is unknown.
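To see how conservative the bound is, one can compare it with an exact computation under an assumed distribution. The sketch below (an illustration, not from the paper) supposes the variable in the example were normal with μ = 8 and σ = 3 and computes the exact two-standard-deviation probability from the normal CDF:

```python
import math

# Sketch: if the distribution were known to be normal with mu = 8, sigma = 3,
# the exact probability would be far sharper than Chebyshev's bound of 3/4.
def normal_cdf(x, mu=8.0, sigma=3.0):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Exact P(mu - 2*sigma < X < mu + 2*sigma) = P(2 < X < 14) for this normal.
p_within_2sd = normal_cdf(14) - normal_cdf(2)
print(p_within_2sd, ">= Chebyshev bound", 1 - 1 / 2 ** 2)  # ≈ 0.9545 vs 0.75
```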
CHAPTER III
CLOSING
A. Conclusion
Moment Generating Function
The moment generating function of X is defined by
    M_X(t) = E(e^{tX})
That is,
    M_X(t) = ∑_{i=1}^{n} e^{tx_i} f(x_i) = ∑ e^{tx} f(x)   (discrete variable)
    M_X(t) = ∫_{−∞}^{∞} e^{tx} f(x) dx   (continuous variable)
Chebyshev's Theorem
Let X be any random variable having mean μ and variance σ², with
0 < σ² < ∞. Then, if ε is any positive number,
    P(|X − μ| ≥ ε) ≤ σ²/ε²
or, with ε = kσ,
    P(|X − μ| ≥ kσ) ≤ 1/k²
Equivalently, the probability that X falls within k standard deviations of
the mean is at least 1 − 1/k², that is,
    P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²
B. Suggestion
Our group hopes that next time we can obtain better reference books than the
ones used for this paper, that more time can be allotted for the task than we
had this time, and that every member of the group will contribute opinions to
the paper project.
REFERENCES
Spiegel, Murray R. Probability and Statistics. Schaum's Outline Series.
Tiro, Muhammad Arif. 2008. Pengantar Teori Peluang. Makassar: Andira
Publisher.
Walpole, Ronald E., et al. 2007. Probability and Statistics for Engineers &
Scientists. Pearson International Edition.