convergence in probability
Let $(X_n)_{n\geq 1}$ be a sequence of random variables defined on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ taking values in a separable metric space $(Y,d)$, where $d$ is the metric. Then we say the sequence converges in probability or converges in measure to a random variable $X$ if, for every $\varepsilon>0$,

$$\lim_{n\to\infty}\mathbb{P}\bigl(d(X_n,X)>\varepsilon\bigr)=0.$$

We denote convergence in probability of $X_n$ to $X$ by

$$X_n\xrightarrow{\;\mathbb{P}\;}X.$$
Equivalently, $X_n\xrightarrow{\;\mathbb{P}\;}X$ iff every subsequence of $(X_n)$ contains a further subsequence which converges to $X$ almost surely.
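The definition can be checked empirically. The sketch below (a minimal Monte Carlo illustration, not part of the original entry; the function name `tail_prob` is my own) takes $X_n$ to be the sample mean of $n$ fair coin flips, with $Y=\mathbb{R}$ and $d(x,y)=|x-y|$. By the weak law of large numbers $X_n\to 1/2$ in probability, so the estimated tail probability $\mathbb{P}(|X_n-1/2|>\varepsilon)$ should shrink toward $0$ as $n$ grows.

```python
import random

def tail_prob(n, eps=0.1, trials=5000, seed=0):
    """Monte Carlo estimate of P(|X_n - 1/2| > eps), where X_n is the
    sample mean of n fair coin flips.  Convergence in probability of
    X_n to 1/2 means this quantity tends to 0 as n -> infinity."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) > eps:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    # The printed estimates decrease toward 0 as n grows.
    print(n, tail_prob(n))
```

Note that the definition only constrains the probability of a large deviation at each fixed $n$; it says nothing about the joint behavior of the sample paths, which is why the almost-sure statement in the equivalence above requires passing to subsequences.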
Remarks.
- Unlike ordinary convergence, $X_n\xrightarrow{\;\mathbb{P}\;}X$ and $X_n\xrightarrow{\;\mathbb{P}\;}Y$ only imply that $X=Y$ almost surely; limits in probability are unique only up to almost-sure equality.
- The need for separability of $Y$ is to ensure that the distance $d(X_1,X_2)$ is measurable, hence a random variable, for all $Y$-valued random variables $X_1$ and $X_2$.
- Convergence almost surely implies convergence in probability but not conversely.
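A standard counterexample for the last remark is the "typewriter" sequence of indicator functions on $\Omega=[0,1]$ with Lebesgue measure. The sketch below (an illustration I am adding; the helper name `bump` is mine) writes $n=2^k+j$ with $0\le j<2^k$ and sets $X_n=\mathbf{1}_{[j/2^k,(j+1)/2^k)}$. Then $\mathbb{P}(X_n=1)=2^{-k}\to 0$, so $X_n\to 0$ in probability, yet for every fixed $\omega$ the sequence $X_n(\omega)$ takes the value $1$ once in every dyadic block, so it converges almost surely nowhere.

```python
def bump(n, omega):
    """Typewriter sequence on the probability space [0, 1]: write
    n = 2**k + j with 0 <= j < 2**k; X_n is the indicator of the
    interval [j/2**k, (j+1)/2**k).  P(X_n = 1) = 2**-k -> 0, so
    X_n -> 0 in probability, but each fixed omega sees X_n = 1
    infinitely often, so there is no almost-sure convergence."""
    k = n.bit_length() - 1          # largest k with 2**k <= n
    j = n - 2**k
    lo, hi = j / 2**k, (j + 1) / 2**k
    return 1 if lo <= omega < hi else 0

omega = 0.3
# Indices n < 200 at which X_n(omega) = 1: one per dyadic block,
# so the value 1 recurs forever and X_n(omega) has no limit.
print([n for n in range(1, 200) if bump(n, omega) == 1])
# P(X_n = 1) = 2**-k shrinks to 0 along the sequence.
print([2.0 ** -(n.bit_length() - 1) for n in (1, 10, 100, 1000)])
```

This also illustrates the subsequence characterization above: along the subsequence $n_k=2^k$ of left-endpoint intervals one can extract further subsequences converging to $0$ almost surely, even though the full sequence does not converge almost surely.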