Recently I read a proof in [1] of the main theorem of extreme value statistics: the Fisher–Tippett–Gnedenko theorem. In this post I give an outline.

Here we are interested in the maximum $M_n := \max(X_1, \dots, X_n)$ of many independent and identically distributed random variables $X_1, X_2, \dots$ with distribution function $F$. Let $x^* := \sup\{x : F(x) < 1\}$, which may be infinite. Then as $n \to \infty$,

$$P(M_n \le x) = P(X_1 \le x, \dots, X_n \le x) = F^n(x)$$

tends to $0$ if $F(x) < 1$ and to $1$ if $F(x) = 1$.

Therefore $M_n$ converges in probability to $x^*$ as $n \to \infty$.
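
This degeneracy is easy to see numerically. A minimal illustration (my example, not from [1]), assuming standard uniform samples so that $x^* = 1$:

```python
import numpy as np

# Maxima of n iid Uniform(0,1) variables concentrate at x* = 1,
# since P(M_n <= x) = x^n -> 0 for any fixed x < 1.
rng = np.random.default_rng(0)
for n in [10, 100, 10000]:
    maxima = rng.uniform(size=(2000, n)).max(axis=1)
    print(n, np.mean(maxima < 0.99))  # fraction of maxima below 0.99 shrinks with n
```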

To avoid this degenerate limiting distribution, it is necessary to normalise $M_n$.

To this end, suppose there exist sequences of real numbers $a_n > 0$ and $b_n$ such that

$$\frac{M_n - b_n}{a_n}$$

approaches a non-degenerate limiting distribution.

In other words, there exists a non-degenerate distribution function $G$ such that

$$\lim_{n \to \infty} F^n(a_n x + b_n) = G(x)$$

at every continuity point $x$ of $G$. Taking logarithms of both sides, this is equivalent to

$$\lim_{n \to \infty} n \log F(a_n x + b_n) = \log G(x)$$

for each $x$ with $0 < G(x) < 1$. This requires $F(a_n x + b_n) \to 1$ as $n \to \infty$. Using $\log t \approx -(1 - t)$ for $t$ close to 1, the above is also equivalent to

$$\lim_{n \to \infty} n\left(1 - F(a_n x + b_n)\right) = -\log G(x). \qquad (1)$$

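As a concrete illustration of such a normalisation (my example, not part of [1]'s argument): for the standard exponential distribution $F(x) = 1 - e^{-x}$, the choice $a_n = 1$, $b_n = \log n$ gives $F^n(x + \log n) = (1 - e^{-x}/n)^n \to \exp(-e^{-x})$:

```python
import math

def F(x):
    """Standard exponential distribution function."""
    return 1 - math.exp(-x) if x > 0 else 0.0

x = 1.0
gumbel = math.exp(-math.exp(-x))               # the limit exp(-e^{-x})
for n in [10, 1000, 100000]:
    # F^n(a_n x + b_n) with a_n = 1, b_n = log n
    print(n, F(x + math.log(n)) ** n, gumbel)
```
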
Next we use the following definition.

A non-decreasing function $f$ has left-continuous inverse $f^{\leftarrow}$ defined by

$$f^{\leftarrow}(y) := \inf\{x : f(x) \ge y\}.$$

One can use this definition to prove

**Lemma 1**: If $f_n \to f$ pointwise for non-decreasing functions $f_n$ and $f$, then for each continuity point $y$ of $f^{\leftarrow}$ we have $f_n^{\leftarrow}(y) \to f^{\leftarrow}(y)$.
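
The definition translates directly into code. A sketch (with `finv`, a hypothetical grid-search helper of my own naming) of $f^{\leftarrow}(y) = \inf\{x : f(x) \ge y\}$:

```python
import numpy as np

def finv(f, y, grid):
    """Left-continuous inverse f<-(y) = inf{x : f(x) >= y}, approximated on a grid."""
    xs = grid[f(grid) >= y]
    return xs[0] if len(xs) else np.inf

grid = np.linspace(0, 10, 100001)
f = lambda x: np.floor(x)        # a non-decreasing step function
# inf{x : floor(x) >= 2.5} = 3, illustrating behaviour at a jump of f
print(finv(f, 2.5, grid))
```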

Next we claim that (1) is equivalent to

$$\lim_{n \to \infty} \frac{U(nx) - b_n}{a_n} = D(x) := G^{\leftarrow}\left(e^{-1/x}\right) \qquad (2)$$

for each continuity point $x > 0$ of $D$, where $U$ is the left-continuous inverse of $\frac{1}{1 - F}$. To see this, note that (1) says precisely that the non-decreasing functions $f_n(x) := \frac{1}{n\left(1 - F(a_n x + b_n)\right)}$ converge to $\frac{-1}{\log G(x)}$ at continuity points. By the definition of $U$, for any $y > 0$,

$$f_n(x) \ge y \iff \frac{1}{1 - F}(a_n x + b_n) \ge ny \iff a_n x + b_n \ge U(ny),$$

and so

$$f_n^{\leftarrow}(y) = \frac{U(ny) - b_n}{a_n}.$$

By (1) and the lemma, as $n \to \infty$ this tends to

$$\left(\frac{-1}{\log G}\right)^{\leftarrow}(y) = G^{\leftarrow}\left(e^{-1/y}\right) = D(y),$$

proving the claim. We can also write

$$\lim_{t \to \infty} \frac{U(tx) - b_{[t]}}{a_{[t]}} = D(x),$$

where $t$ ranges over the positive reals and $[t]$ denotes the integer part of $t$.

We are now ready to prove the main theorem of extreme value theory.

**Theorem** (Fisher, Tippett, Gnedenko): Any non-degenerate limit $G$ above satisfies, for some $\gamma \in \mathbb{R}$ and constants $a > 0$, $b \in \mathbb{R}$,

$$G(ax + b) = G_\gamma(x) := \exp\left(-\left(1 + \gamma x\right)^{-1/\gamma}\right), \qquad 1 + \gamma x > 0,$$

where the right side is equal to its limiting value $\exp(-e^{-x})$ if $\gamma = 0$.
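
The limiting family can be written out in code; `gev_cdf` below is an illustrative helper (my naming), with the $\gamma = 0$ case taken as the limiting value $\exp(-e^{-x})$:

```python
import math

def gev_cdf(x, gamma):
    """G_gamma(x) = exp(-(1 + gamma*x)^(-1/gamma)) where 1 + gamma*x > 0;
    gamma = 0 is interpreted as the Gumbel limit exp(-e^{-x})."""
    if gamma == 0:
        return math.exp(-math.exp(-x))
    t = 1 + gamma * x
    if t <= 0:
        return 0.0 if gamma > 0 else 1.0   # outside the support endpoint
    return math.exp(-t ** (-1 / gamma))

# As gamma -> 0 the cdf approaches the Gumbel case:
print(gev_cdf(1.0, 1e-8), gev_cdf(1.0, 0.0))
```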

**Proof**:

This will involve numerous substitutions but the main idea is to arrive at a differential equation that can be solved to obtain the above. Suppose $1$ is a continuity point of $D$. Then for any continuity point $x > 0$ of $D$,

$$\lim_{t \to \infty} \frac{U(tx) - U(t)}{a_{[t]}} = D(x) - D(1) =: E(x). \qquad (3)$$

We can write

$$\frac{U(txy) - U(t)}{a_{[t]}} = \frac{U(txy) - U(ty)}{a_{[ty]}} \cdot \frac{a_{[ty]}}{a_{[t]}} + \frac{U(ty) - U(t)}{a_{[t]}}. \qquad (4)$$

The claim is that both $\frac{a_{[ty]}}{a_{[t]}}$ and $\frac{U(ty) - U(t)}{a_{[t]}}$ have limits $A(y)$ and $B(y)$ as $t \to \infty$. If they had more than one limit point, say $(A_1, B_1)$ and $(A_2, B_2)$, then for $i = 1, 2$, (4) in the limit gives us

$$E(xy) = E(x) A_i(y) + B_i(y).$$

Subtracting gives $E(x)\left(A_1(y) - A_2(y)\right) = B_2(y) - B_1(y)$, which implies $A_1 = A_2$ and $B_1 = B_2$, as we know $E$ is non-constant (since we seek a non-degenerate solution).

We conclude that, since $B(y) = E(y)$ by (3),

$$E(xy) = E(x) A(y) + E(y). \qquad (5)$$

This is a functional equation that we wish to solve. We let $x = e^s$, $y = e^t$, $H(t) := E(e^t)$ and $\bar{A}(t) := A(e^t)$ to obtain

$$H(s + t) = H(s)\bar{A}(t) + H(t),$$

which using $H(0) = E(1) = 0$ implies

$$\frac{H(t + s) - H(t)}{s} = \bar{A}(t)\,\frac{H(s) - H(0)}{s}. \qquad (6)$$

Since $H$ is monotone (following from the monotonicity of $D$), it is differentiable at some $t_0$. By (6) it is differentiable at all $t$. Indeed from (6) we obtain

$$H'(t) = \bar{A}(t)\, H'(0). \qquad (7)$$

Let $Q(t) := H(t)/H'(0)$ (note $H'(0) \ne 0$ since $H$ is non-constant). Then $Q(0) = 0$ and $Q'(0) = 1$.

From (6) and (7),

$$Q(t + s) = Q(s)\, Q'(t) + Q(t). \qquad (8)$$

Similarly, $Q(t + s) = Q(t)\, Q'(s) + Q(s)$, and upon subtraction from (8) we obtain $Q(s)\left(Q'(t) - 1\right) = Q(t)\left(Q'(s) - 1\right)$, from which

$$\frac{Q'(t) - 1}{Q(t)} = \frac{Q'(s) - 1}{Q(s)}.$$

Taking the limit as $s \to 0$ (the limit $\gamma := \lim_{s \to 0} \frac{Q'(s) - 1}{Q(s)}$ exists because the left side does not depend on $s$) gives the following differential equation for $Q$:

$$Q'(t) = 1 + \gamma\, Q(t). \qquad (9)$$

To solve (9), differentiate both sides with respect to $t$: from $Q''(t) = \gamma\, Q'(t)$ we see that

$$Q'(t) = c\, e^{\gamma t}.$$

Hence $Q'(t) = e^{\gamma t}$ (using $Q'(0) = 1$) and since $Q(0) = 0$,

$$Q(t) = \frac{e^{\gamma t} - 1}{\gamma},$$

interpreted as its limiting value $t$ when $\gamma = 0$.

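As a quick numerical sanity check (illustrative only, not part of the proof) that $Q(t) = (e^{\gamma t} - 1)/\gamma$ indeed satisfies $Q'(t) = 1 + \gamma Q(t)$:

```python
import math

gamma = 0.7
Q = lambda t: (math.exp(gamma * t) - 1) / gamma

t, h = 1.3, 1e-6
lhs = (Q(t + h) - Q(t - h)) / (2 * h)  # central-difference estimate of Q'(t)
rhs = 1 + gamma * Q(t)                 # right side of the ODE (9)
print(lhs, rhs)
```
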
Recalling $H(t) = H'(0)\, Q(t)$, this leads to $E(x) = H(\log x) = H'(0)\,\frac{x^{\gamma} - 1}{\gamma}$.

Recalling $E(x) = D(x) - D(1)$, this means

$$D(x) = D(1) + H'(0)\,\frac{x^{\gamma} - 1}{\gamma}.$$

Taking the left-continuous inverse of both sides,

$$D^{\leftarrow}(y) = \left(1 + \gamma\,\frac{y - D(1)}{H'(0)}\right)^{1/\gamma}.$$

Since $D(x) = G^{\leftarrow}\left(e^{-1/x}\right)$, $D^{\leftarrow}(y) = \frac{-1}{\log G(y)}$.

Hence

$$\frac{-1}{\log G(y)} = \left(1 + \gamma\,\frac{y - D(1)}{H'(0)}\right)^{1/\gamma}.$$

In other words,

$$G(y) = \exp\left(-\left(1 + \gamma x\right)^{-1/\gamma}\right), \qquad 1 + \gamma x > 0,$$

where $x = \frac{y - D(1)}{H'(0)}$, i.e. $G(y) = G_\gamma\!\left(\frac{y - D(1)}{H'(0)}\right)$.

If $1$ is not a continuity point of $D$, we repeat the above proof with $U(t)$ replaced by $U(t x_0)$, where $x_0$ is a continuity point of $D$. This completes the proof. $\blacksquare$

This generalised extreme value distribution captures three distributions, depending on the nature of the tail of the original distribution $F$:

- Type I – $\gamma = 0$: **Gumbel** (double exponential) distribution (exponential tail – e.g. normal or exponential distribution)
- Type II – $\gamma > 0$: **Fréchet** distribution (polynomial tail – e.g. power law distribution to model extreme flood levels or high incomes)
- Type III – $\gamma < 0$: **reverse-Weibull** distribution (finite maximum – e.g. uniform distribution)
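
For the Type III case the limit is easy to check by simulation (my illustration, assuming Uniform(0,1) samples): here $P\left(n(M_n - 1) \le x\right) = (1 + x/n)^n \to e^{x}$ for $x \le 0$, a reverse-Weibull law with $\gamma = -1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 1000, 4000
maxima = rng.uniform(size=(trials, n)).max(axis=1)
z = n * (maxima - 1)               # normalised maxima: a_n = 1/n, b_n = 1

x = -1.0
print(np.mean(z <= x), np.exp(x))  # empirical cdf vs reverse-Weibull cdf e^x
```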

(Figure: sample GEV density functions for specific values of $\gamma$.)

#### Reference

[1] L. de Haan and A. Ferreira, *Extreme Value Theory: An Introduction*, Springer, 2006.
