Continuous Mapping Theorem

The Continuous Mapping theorem states that stochastic convergence is preserved by continuous functions.

Table of Contents

  1. The problem

  2. The theorem

  3. Consequences

    1. Sums and products of sequences converging in probability

    2. Sums and products of sequences converging almost surely

    3. Sums and products of sequences converging in distribution

  4. More details

    1. Convergence of ratios

    2. Random matrices

  5. Applications

  6. Solved exercises

    1. Exercise 1

  7. References

The problem

Suppose that a sequence of random vectors $\{X_n\}$ converges to a random vector $X$ (in probability, in distribution or almost surely).

Now, take a transformed sequence $\{g(X_n)\}$, where $g$ is a function.

Under what conditions is $\{g(X_n)\}$ also a convergent sequence?

The Continuous Mapping theorem states that stochastic convergence is preserved if $g$ is a continuous function.
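As a quick numerical illustration (not part of the formal development), the sketch below takes $X_n$ to be the sample mean of $n$ Uniform(0,1) draws, which converges in probability to $0.5$, and applies the continuous function $g(x) = e^x$; the estimated probability that $g(X_n)$ lands far from $g(0.5)$ shrinks as $n$ grows. The choice of distribution and of $g$ is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_far(n, eps=0.05, reps=2000):
    """Estimate P(|g(X_n) - g(c)| > eps), where X_n is the sample mean of
    n Uniform(0,1) draws (so X_n -> c = 0.5 in probability) and g = exp."""
    means = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
    return float(np.mean(np.abs(np.exp(means) - np.exp(0.5)) > eps))

for n in (10, 100, 1000):
    print(n, prob_far(n))
```

The printed probabilities decrease toward zero, which is exactly what preservation of convergence in probability predicts.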

The theorem

Here is a statement of the multivariate version of the Continuous Mapping theorem.

Proposition Let $\{X_n\}$ be a sequence of random vectors and let $g$ be a function that is continuous at every point of a set $C$ such that $P(X \in C) = 1$. Then,

  • if $X_n \overset{P}{\rightarrow} X$, then $g(X_n) \overset{P}{\rightarrow} g(X)$;

  • if $X_n \overset{a.s.}{\rightarrow} X$, then $g(X_n) \overset{a.s.}{\rightarrow} g(X)$;

  • if $X_n \overset{d}{\rightarrow} X$, then $g(X_n) \overset{d}{\rightarrow} g(X)$.

Consequences

The next sections present some important consequences of the Continuous Mapping theorem.

Sums and products of sequences converging in probability

An important implication of the Continuous Mapping theorem is that arithmetic operations preserve convergence in probability.

Proposition If $X_n \overset{P}{\rightarrow} X$ and $Y_n \overset{P}{\rightarrow} Y$, then $X_n + Y_n \overset{P}{\rightarrow} X + Y$ and $X_n Y_n \overset{P}{\rightarrow} X Y$.

Proof

Since $X_n \overset{P}{\rightarrow} X$ and $Y_n \overset{P}{\rightarrow} Y$, the joint vector $(X_n, Y_n)$ converges in probability to $(X, Y)$. The functions $g(x,y) = x + y$ and $g(x,y) = xy$ are continuous, so the result follows from the Continuous Mapping theorem.

Sums and products of sequences converging almost surely

Everything that was said in the previous subsection applies, with obvious modifications, also to almost surely convergent sequences.

Proposition If $X_n \overset{a.s.}{\rightarrow} X$ and $Y_n \overset{a.s.}{\rightarrow} Y$, then $X_n + Y_n \overset{a.s.}{\rightarrow} X + Y$ and $X_n Y_n \overset{a.s.}{\rightarrow} X Y$.

Proof

Similar to the previous proof: just replace convergence in probability with almost sure convergence.
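The two propositions above can be checked by simulation. In the sketch below (the distributions are arbitrary illustrative choices), $X_n$ is the mean of $n$ Exp(1) draws and $Y_n$ the mean of $n$ Uniform(0,1) draws, so $X_n \overset{P}{\rightarrow} 1$ and $Y_n \overset{P}{\rightarrow} 1/2$; the estimated probabilities that the sum and the product miss their limits shrink with $n$.

```python
import numpy as np

rng = np.random.default_rng(1)

def tail_probs(n, eps=0.1, reps=2000):
    """X_n: mean of n Exp(1) draws -> 1; Y_n: mean of n Uniform(0,1) draws -> 0.5
    (both in probability). Estimate how often the sum and product miss their limits."""
    X = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
    Y = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
    p_sum = float(np.mean(np.abs((X + Y) - (1.0 + 0.5)) > eps))
    p_prod = float(np.mean(np.abs(X * Y - 1.0 * 0.5) > eps))
    return p_sum, p_prod

for n in (10, 100, 1000):
    print(n, tail_probs(n))
```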

Sums and products of sequences converging in distribution

For convergence almost surely and convergence in probability, the convergence of $\{X_n\}$ and $\{Y_n\}$ individually implies their joint convergence as a vector (see the previous two proofs), but this is not the case for convergence in distribution. Therefore, to obtain preservation of convergence in distribution under arithmetic operations, we need the stronger assumption of joint convergence in distribution.

Proposition If $(X_n, Y_n) \overset{d}{\rightarrow} (X, Y)$, then $X_n + Y_n \overset{d}{\rightarrow} X + Y$ and $X_n Y_n \overset{d}{\rightarrow} X Y$.

Proof

Again, similar to the proof for convergence in probability, but this time joint convergence is already in the assumptions.

More details

The following sections contain more details about the Continuous Mapping theorem.

Convergence of ratios

As a byproduct of the propositions stated above, we also have the following proposition.

Proposition If a sequence of random variables $\{X_n\}$ converges to $X$, then $1/X_n \rightarrow 1/X$, provided $X$ is almost surely different from $0$ (we did not specify the kind of convergence, which can be in probability, almost surely or in distribution).

Proof

This is a consequence of the Continuous Mapping theorem and of the fact that $g(x) = 1/x$ is a continuous function for $x \neq 0$.

An immediate consequence of the previous proposition follows.

Proposition If two sequences of random variables $\{X_n\}$ and $\{Y_n\}$ converge to $X$ and $Y$ respectively, then $X_n / Y_n \rightarrow X / Y$, provided $Y$ is almost surely different from $0$. Convergence can be in probability, almost surely or in distribution (but the latter requires joint convergence in distribution of $\{X_n\}$ and $\{Y_n\}$).

Proof

This is a consequence of the fact that the ratio can be written as the product $\frac{X_n}{Y_n} = X_n \cdot \frac{1}{Y_n}$. The first operand of the product converges by assumption. The second converges because of the previous proposition. Therefore, their product converges because convergence is preserved under products.
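A minimal simulation of the ratio result, written exactly as the product in the proof above (the distributions are illustrative choices; the limit of $Y_n$ is $1.5$, safely away from zero):

```python
import numpy as np

rng = np.random.default_rng(2)

n, reps = 5000, 2000
X = rng.exponential(1.0, size=(reps, n)).mean(axis=1)   # X_n -> 1 in probability
Y = rng.uniform(1.0, 2.0, size=(reps, n)).mean(axis=1)  # Y_n -> 1.5, a limit that is never 0
ratio = X * (1.0 / Y)  # the ratio written as a product, as in the proof
miss = float(np.mean(np.abs(ratio - 1.0 / 1.5) > 0.05))
print(miss)
```

The printed frequency with which the ratio misses its limit $1/1.5$ is close to zero at this sample size.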

Random matrices

The Continuous Mapping theorem applies also to random matrices because random matrices are just random vectors whose entries have been arranged into the columns of a matrix.

In particular:

  • if two sequences of random matrices are convergent, then the sums and products of their terms are also convergent (provided their dimensions allow them to be summed or multiplied);

  • if a sequence of square random matrices $\{X_n\}$ converges to a random matrix $X$, then the sequence of inverse matrices $\{X_n^{-1}\}$ converges to the random matrix $X^{-1}$ (provided the matrices are invertible). This is a consequence of the fact that matrix inversion is a continuous transformation.
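A small sketch of the second point, using a hypothetical invertible $2 \times 2$ limit matrix $A$ and a noise term that averages away as $n$ grows; the entrywise error of the inverse shrinks accordingly:

```python
import numpy as np

rng = np.random.default_rng(3)

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])  # invertible limit matrix (a hypothetical example)
A_inv = np.linalg.inv(A)

for n in (10, 100, 1000):
    # A_n = A + averaged noise, so each entry converges to the matching entry of A
    A_n = A + rng.normal(0.0, 1.0, size=(n, 2, 2)).mean(axis=0)
    err = float(np.abs(np.linalg.inv(A_n) - A_inv).max())
    print(n, err)
```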

Applications

The Continuous Mapping theorem has several important applications. For example, it is used to prove:

  • Slutsky's theorem;

  • the Delta method.

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Consider a sequence $\{X_n\}$ of random variables converging in distribution to a random variable $X$ having a standard normal distribution.

Consider the function $g(x) = x^2$, which is a continuous function.

Find the limit in distribution of the sequence $\{g(X_n)\}$.

Solution

The sequence $\{X_n^2\}$ converges in distribution to $X^2$ by the Continuous Mapping theorem. But the square of a standard normal random variable has a Chi-square distribution with one degree of freedom. Therefore, the sequence $\{X_n^2\}$ converges in distribution to a Chi-square distribution with one degree of freedom.
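The solution can be verified by simulation. Below, $X_n$ is the standardized mean of $n$ Uniform(0,1) draws, which by the Central Limit theorem converges in distribution to a standard normal (an illustrative stand-in for the sequence in the exercise); the empirical mean of $X_n^2$ and the empirical frequency of $\{X_n^2 \le 1\}$ should be close to the Chi-square(1) values, namely $1$ and about $0.6827$.

```python
import numpy as np

rng = np.random.default_rng(4)

n, reps = 500, 10000
# By the CLT, the standardized mean of n Uniform(0,1) draws converges
# in distribution to a standard normal, so it plays the role of X_n.
Xn = (rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1) - 0.5) * np.sqrt(12.0 * n)
sq = Xn ** 2  # by the Continuous Mapping theorem: -> Chi-square(1) in distribution

# Chi-square(1) facts: mean 1, and P(Q <= 1) = P(|Z| <= 1), roughly 0.6827
print(float(sq.mean()), float(np.mean(sq <= 1.0)))
```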

References

Shao, J. (2007) Mathematical statistics, Springer.

Please cite as:

Taboga, Marco (2021). "Continuous Mapping theorem", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/asymptotic-theory/continuous-mapping-theorem.


