Fisher–Tippett–Gnedenko theorem
From Wikipedia, the free encyclopedia
The role of the extremal types theorem for maxima is similar to that of the central limit theorem for averages, except that the central limit theorem applies to the average of a sample from any distribution with finite variance, while the Fisher–Tippett–Gnedenko theorem only states that if the distribution of a normalized maximum converges, then the limit has to be one of a particular class of distributions. It does not state that the distribution of the normalized maximum does converge.
Statement
Let $X_1, X_2, \ldots, X_n$ be an $n$-sized sample of independent and identically distributed random variables, each of whose cumulative distribution function is $F$.
Suppose that there exist two sequences of real numbers $a_n > 0$ and $b_n \in \mathbb{R}$ such that the following limit converges to a non-degenerate distribution function:
$$\lim_{n\to\infty} P\left\{ \frac{\max\{X_1, \dots, X_n\} - b_n}{a_n} \le x \right\} = G(x)\,,$$
or equivalently:
$$\lim_{n\to\infty} \bigl(\, F(a_n x + b_n) \,\bigr)^n = G(x)\,.$$
In such circumstances, the limiting distribution $G$ belongs to either the Gumbel, the Fréchet, or the Weibull distribution family.[6]
In other words, if the limit above converges, then up to a linear change of coordinates $G(x)$ will assume either the form:[7]
$$G_\gamma(x) = \exp\left( -\bigl( 1 + \gamma x \bigr)^{-1/\gamma} \right) \quad \text{for } \gamma \neq 0\,,$$
with the non-zero parameter $\gamma$ also satisfying $1 + \gamma x > 0$ for every $x$ value supported by $F$ (for all values $x$ for which $F(x) \neq 0$). Otherwise it has the form:
$$G_0(x) = \exp\bigl( -\exp(-x) \bigr) \quad \text{for } \gamma = 0\,.$$
This is the cumulative distribution function of the generalized extreme value distribution (GEV) with extreme value index $\gamma$.
The GEV distribution groups the Gumbel, Fréchet, and Weibull distributions into a single composite form.
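The composite form can be sketched numerically. The function below is our illustrative sketch (the name `gev_cdf` is ours, not a standard API); it evaluates $G_\gamma$ with location 0 and scale 1, and shows that the Fréchet/Weibull form approaches the Gumbel form $G_0$ as $\gamma \to 0$:

```python
import math

def gev_cdf(x, gamma):
    """Sketch of the GEV cumulative distribution function G_gamma(x)
    with location 0 and scale 1, as in the statement above."""
    if gamma == 0.0:
        return math.exp(-math.exp(-x))        # Gumbel case G_0
    t = 1.0 + gamma * x
    if t <= 0.0:
        # outside the support 1 + gamma*x > 0:
        # the CDF is 0 below it (gamma > 0) or 1 above it (gamma < 0)
        return 0.0 if gamma > 0 else 1.0
    return math.exp(-t ** (-1.0 / gamma))     # Fréchet / Weibull case G_gamma

# As gamma -> 0, G_gamma(x) approaches the Gumbel value G_0(x):
for g in (0.1, 0.01, 0.0001):
    print(g, gev_cdf(1.3, g))
print(0.0, gev_cdf(1.3, 0.0))
```
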
Conditions of convergence
The Fisher–Tippett–Gnedenko theorem is a statement about the convergence of the limiting distribution $G(x)$ above. The study of conditions for the convergence of $G$ to particular cases of the generalized extreme value distribution began with von Mises (1936)[3][5][4] and was further developed by Gnedenko (1943).[5]
Let $F$ be the distribution function of $X$, and $X_1, \dots, X_n$ be some i.i.d. sample thereof. Also let $x_{\max}$ be the population maximum:
$$x_{\max} \equiv \sup\,\{\, x \mid F(x) < 1 \,\}\,.$$
The limiting distribution of the normalized sample maximum, given by $G$ above, will then be:[7]
Fréchet distribution ($\gamma > 0$)
For strictly positive $\gamma > 0$, the limiting distribution converges if and only if $x_{\max} = \infty$ and
$$\lim_{t\to\infty} \frac{1 - F(u\,t)}{1 - F(t)} = u^{-1/\gamma} \quad \text{for all } u > 0\,.$$
In this case, possible sequences that will satisfy the theorem conditions are $b_n = 0$ and $a_n = F^{-1}\!\left( 1 - \tfrac{1}{n} \right)$.
Strictly positive $\gamma$ corresponds to what is called a heavy-tailed distribution.
Gumbel distribution ($\gamma = 0$)
For $\gamma = 0$, and with $x_{\max}$ either finite or infinite, the limiting distribution converges if and only if
$$\lim_{t\to x_{\max}} \frac{1 - F\bigl(\, t + u\,\tilde{g}(t) \,\bigr)}{1 - F(t)} = e^{-u} \quad \text{for all } u > 0\,,$$
with
$$\tilde{g}(t) \equiv \frac{\int_t^{x_{\max}} \bigl( 1 - F(s) \bigr)\, \mathrm{d}s}{1 - F(t)}\,.$$
Possible sequences here are $b_n = F^{-1}\!\left( 1 - \tfrac{1}{n} \right)$ and $a_n = \tilde{g}\Bigl( F^{-1}\!\left( 1 - \tfrac{1}{n} \right) \Bigr)$.
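As an illustrative sketch (the exponential distribution is our choice of example here, not the article's), the condition above can be checked for $F(t) = 1 - e^{-t}$, whose auxiliary function is $\tilde g(t) = 1$ for all $t$:

```python
import math

# Sketch: the Gumbel condition for the exponential distribution F(t) = 1 - exp(-t),
# using the survival function 1 - F(t) = exp(-t) directly for numerical stability.
# The auxiliary function g(t) = Int_t^inf (1 - F(s)) ds / (1 - F(t)) equals 1 here.
def survival(t):                     # 1 - F(t)
    return math.exp(-t)

g = 1.0                              # exact for the exponential distribution
u, t = 2.0, 50.0                     # large t stands in for t -> x_max = infinity
ratio = survival(t + u * g) / survival(t)
print(ratio, math.exp(-u))           # equal: the exponential is in the Gumbel domain
```
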
Weibull distribution ($\gamma < 0$)
For strictly negative $\gamma < 0$, the limiting distribution converges if and only if $x_{\max} < \infty$ (the population maximum is finite) and
$$\lim_{t\to 0^{+}} \frac{1 - F\!\left( x_{\max} - u\,t \right)}{1 - F\!\left( x_{\max} - t \right)} = u^{-1/\gamma} \quad \text{for all } u > 0\,.$$
Note that for this case the exponent $-\tfrac{1}{\gamma}$ is strictly positive, since $\gamma$ is strictly negative.
Possible sequences here are $b_n = x_{\max}$ and $a_n = x_{\max} - F^{-1}\!\left( 1 - \tfrac{1}{n} \right)$.
Note that the second formula (the Gumbel distribution) is the limit of the first (the Fréchet distribution) as $\gamma$ goes to zero.
Examples
Fréchet distribution
The Cauchy distribution's density function is:
$$f(x) = \frac{1}{\pi^2 + x^2}\,,$$
and its cumulative distribution function is:
$$F(x) = \frac{1}{2} + \frac{1}{\pi} \arctan\left( \frac{x}{\pi} \right).$$
A little calculus shows that the right tail's cumulative distribution $1 - F(x)$ is asymptotic to $\tfrac{1}{x}$, or
$$\ln F(x) \to \frac{-1}{x} \quad \text{as} \quad x \to \infty\,,$$
so we have
$$\ln\left( F(x)^n \right) = n \ln F(x) \sim \frac{-n}{x}\,.$$
Thus we have
$$F(x)^n \approx \exp\left( \frac{-n}{x} \right)$$
and letting $u \equiv \tfrac{x}{n} - 1$ (and skipping some explanation)
$$\lim_{n\to\infty} \Bigl( F(n\,u + n)^n \Bigr) = \exp\left( \frac{-1}{1+u} \right) = G_1(u) \quad \text{for any } u\,.$$
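The limit just derived can be checked numerically. This is our sketch, not part of the article; it evaluates $F(nu+n)^n$ for the Cauchy-type CDF above and compares it with $G_1(u)$:

```python
import math

# Sketch: for the Cauchy-type CDF above, F(n*u + n)**n should approach
# G_1(u) = exp(-1/(1 + u)) as n grows.
def F(x):
    return 0.5 + math.atan(x / math.pi) / math.pi

def G1(u):
    return math.exp(-1.0 / (1.0 + u))

u = 0.7
for n in (10, 1_000, 100_000):
    print(n, F(n * u + n) ** n)
print("limit", G1(u))
```
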
Gumbel distribution
Let us take the normal distribution with cumulative distribution function
$$F(x) = \frac{1}{2} \operatorname{erfc}\left( \frac{-x}{\sqrt{2}} \right).$$
We have
$$\ln F(x) \to -\frac{\exp\left( -\tfrac{1}{2} x^2 \right)}{\sqrt{2\pi}\, x} \quad \text{as} \quad x \to \infty$$
and thus
$$\ln\left( F(x)^n \right) = n \ln F(x) \to -\frac{n \exp\left( -\tfrac{1}{2} x^2 \right)}{\sqrt{2\pi}\, x} \quad \text{as} \quad x \to \infty\,.$$
Hence we have
$$F(x)^n \approx \exp\left( -\frac{n \exp\left( -\tfrac{1}{2} x^2 \right)}{\sqrt{2\pi}\, x} \right).$$
If we define $c_n$ as the value that exactly satisfies
$$\frac{n \exp\left( -\tfrac{1}{2} c_n^2 \right)}{\sqrt{2\pi}\, c_n} = 1\,,$$
then around $x = c_n$
$$\frac{n \exp\left( -\tfrac{1}{2} x^2 \right)}{\sqrt{2\pi}\, x} \approx \exp\bigl( c_n (c_n - x) \bigr)\,.$$
As $n$ increases, this becomes a good approximation for a wider and wider range of $c_n (c_n - x)$, so letting $u \equiv c_n (c_n - x)$
we find that
$$\lim_{n\to\infty} \biggl( F\left( \tfrac{u}{c_n} + c_n \right)^n \biggr) = \exp\bigl( -\exp(-u) \bigr) = G_0(u)\,.$$
Equivalently,
$$\lim_{n\to\infty} P\left( \frac{\max\{X_1, \ldots, X_n\} - c_n}{1/c_n} \leq u \right) = \exp\bigl( -\exp(-u) \bigr) = G_0(u)\,.$$
With this result, we see retrospectively that we need
$$\ln c_n \approx \frac{\ln \ln n}{2}$$
and then
$$c_n \approx \sqrt{2 \ln n}\,,$$
so the maximum is expected to climb toward infinity ever more slowly.
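This can be illustrated numerically. In the sketch below (our illustration; the helper name `c_of` is ours), $c_n$ is obtained from its defining equation by fixed-point iteration, and $F(u/c_n + c_n)^n$ is compared with the Gumbel limit; convergence for the normal distribution is known to be logarithmically slow, which the printout reflects:

```python
import math

def Phi(x):
    """Standard normal CDF, via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def c_of(n):
    """Solve n*exp(-c**2/2)/(sqrt(2*pi)*c) = 1 by fixed-point iteration."""
    c = math.sqrt(2.0 * math.log(n))              # first-order guess
    for _ in range(100):
        c = math.sqrt(2.0 * math.log(n / (math.sqrt(2.0 * math.pi) * c)))
    return c

u = 0.5
gumbel = math.exp(-math.exp(-u))                  # G_0(u)
for n in (10**3, 10**5, 10**7):
    cn = c_of(n)
    print(n, Phi(u / cn + cn) ** n, gumbel)       # convergence is very slow
```
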
Weibull distribution
We may take the simplest example, a uniform distribution between 0 and 1, with cumulative distribution function $F(x) = x$ for any $x$ value from 0 to 1.
For values of $x \to 1$ we have
$$\ln\Bigl( F(x)^n \Bigr) = n \ln F(x) \to -n\,(1 - x)\,.$$
So for $x \approx 1$ we have
$$F(x)^n \approx \exp(\, n\,x - n \,)\,.$$
Let $u \equiv 1 - n\,(1 - x)$ and get
$$\lim_{n\to\infty} \Bigl( F\!\left( \tfrac{u}{n} + 1 - \tfrac{1}{n} \right) \Bigr)^n = \exp\bigl( -(1 - u) \bigr) = G_{-1}(u)\,.$$
Close examination of that limit shows that the expected maximum approaches 1 in inverse proportion to n .
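A numerical sketch of the limit above (our illustration, not from the article):

```python
import math

# Sketch: for the uniform distribution F(x) = x on [0, 1],
# F(u/n + 1 - 1/n)**n should approach G_{-1}(u) = exp(-(1 - u)) as n grows.
u = 0.2                        # any u <= 1 keeps the argument inside [0, 1]
for n in (10, 1_000, 100_000):
    x = u / n + 1.0 - 1.0 / n
    print(n, x ** n)
print("limit", math.exp(-(1.0 - u)))
```
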
See also
Gumbel distribution
Generalized extreme value distribution
Generalized Pareto distribution
Pickands–Balkema–de Haan theorem
References
^ Fréchet, M. (1927). "Sur la loi de probabilité de l'écart maximum". Annales de la Société Polonaise de Mathématique. 6 (1): 93–116.
^ Fisher, R.A.; Tippett, L.H.C. (1928). "Limiting forms of the frequency distribution of the largest and smallest member of a sample". Proc. Camb. Phil. Soc. 24 (2): 180–190. Bibcode:1928PCPS...24..180F. doi:10.1017/s0305004100015681. S2CID 123125823.
^ a b von Mises, R. (1936). "La distribution de la plus grande de n valeurs" [The distribution of the largest of n values]. Rev. Math. Union Interbalcanique. 1 (in French): 141–160.
^ a b Falk, Michael; Marohn, Frank (1993). "von Mises conditions revisited". The Annals of Probability: 1310–1328.
^ a b c Gnedenko, B.V. (1943). "Sur la distribution limite du terme maximum d'une serie aleatoire". Annals of Mathematics. 44 (3): 423–453. doi:10.2307/1968974. JSTOR 1968974.
^ Mood, A.M. (1950). "5. Order Statistics". Introduction to the Theory of Statistics. New York, NY: McGraw-Hill. pp. 251–270.
^ a b Haan, Laurens; Ferreira, Ana (2007). Extreme Value Theory: An Introduction. Springer.
Retrieved from "https://en.wikipedia.org/w/index.php?title=Fisher–Tippett–Gnedenko_theorem&oldid=1226112467"