Hi,
I have a small question related to the definition of convergence in probability. I don't know if this is the most suitable place to post it because, in fact, it is not about statistics, but I would be very grateful if someone could answer the question I propose:
Here it is: convergence in probability is defined as follows:
Let {X_n} be a sequence of random variables. We say the sequence converges in probability to X if, for every epsilon > 0,
P(|X_n - X| > epsilon) -> 0 as n -> infinity.
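To make the definition concrete, here is a small Monte Carlo sketch (my own illustration, not part of your question): take X_n to be the mean of n fair coin flips and X the constant 1/2, the standard weak-law-of-large-numbers example. The estimated probability P(|X_n - 1/2| > epsilon) should shrink as n grows. The function name and parameters are just for illustration.

```python
import random

def prob_exceeds(n, eps, trials=20000, seed=0):
    """Monte Carlo estimate of P(|X_n - 1/2| > eps), where X_n is the
    mean of n fair coin flips.  By the weak law of large numbers this
    probability tends to 0 as n grows, for any fixed eps > 0."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        # X_n: proportion of heads in n flips of a fair coin
        xn = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(xn - 0.5) > eps:
            count += 1
    return count / trials

# The estimated probabilities decrease toward 0 as n increases:
for n in (10, 100, 1000):
    print(n, prob_exceeds(n, eps=0.1))
```

Running this, the estimate is sizeable for n = 10 and essentially zero by n = 1000, which is exactly the statement P(|X_n - X| > epsilon) -> 0 in action (here with a degenerate limit X = 1/2, so it does not yet answer the question below).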
That is right, isn't it?
OK. Can anyone give me an example where X_n converges in probability to X and X is not constant with probability one? Furthermore, are there any reference books where this kind of topic is studied in depth?
Thank you very much.
Ramon.