Exchangeable random variables

In statistics, an exchangeable sequence of random variables (also sometimes interchangeable)[1] is a sequence such that future samples behave like earlier samples, meaning formally that any order (of a finite number of samples) is equally likely. This formalizes the notion of "the future being predictable on the basis of past experience." It is closely related to the use of independent and identically distributed random variables in statistical models. Exchangeable sequences of random variables arise in cases of simple random sampling.

Definition

Formally, an exchangeable sequence of random variables is a finite or infinite sequence X1, X2, X3, ... of random variables such that for any finite permutation σ of the indices 1, 2, 3, ... (a permutation that acts on only finitely many indices, leaving the rest fixed), the joint probability distribution of the permuted sequence

Xσ(1), Xσ(2), Xσ(3), ...

is the same as the joint probability distribution of the original sequence.[1][2]

(A sequence E1, E2, E3, ... of events is said to be exchangeable precisely if the sequence of its indicator functions is exchangeable.) The distribution function FX1,...,Xn(x1, ..., xn) of a finite sequence of exchangeable random variables is symmetric in its arguments x1, ..., xn. Olav Kallenberg provided an appropriate definition of exchangeability for continuous-time stochastic processes.[3][4]
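
Exchangeability of a short finite sequence can be checked directly by comparing the joint distribution with its permuted versions. The following Python sketch (an illustration added here, not taken from the sources above; the helper name joint_pmf_without_replacement is purely illustrative) enumerates two draws made without replacement from a small multiset and confirms that the joint probability mass function is symmetric in its arguments.

```python
from itertools import permutations
from collections import Counter
from fractions import Fraction

def joint_pmf_without_replacement(values, k):
    """Exact joint pmf of k ordered draws without replacement from `values`,
    returned as a dict mapping outcome tuples to probabilities."""
    pmf = Counter()
    total = 0
    for outcome in permutations(values, k):
        pmf[outcome] += 1
        total += 1
    return {outcome: Fraction(count, total) for outcome, count in pmf.items()}

# Two draws (X1, X2) without replacement from a small urn.
pmf = joint_pmf_without_replacement([1, 2, 3, 3], k=2)

# Exchangeability check: P(X1 = a, X2 = b) must equal P(X1 = b, X2 = a)
# for every pair (a, b), i.e. the joint pmf is symmetric in its arguments.
assert all(pmf[(a, b)] == pmf[(b, a)] for (a, b) in pmf)
print("Joint pmf is permutation-invariant: the two draws are exchangeable.")
```

The same brute-force check works for any finite discrete sequence, though it becomes expensive quickly as the length grows.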

History

The concept was introduced by William Ernest Johnson in his 1924 book Logic, Part III: The Logical Foundations of Science.[5] Exchangeability is equivalent to the concept of statistical control introduced by Walter Shewhart also in 1924.[6][7]

Exchangeability and the i.i.d. statistical model

The property of exchangeability is closely related to the use of independent and identically distributed (i.i.d.) random variables in statistical models. A sequence of random variables that are i.i.d. conditional on some underlying distributional form is exchangeable. This follows directly from the structure of the joint probability distribution generated by the i.i.d. form.

Moreover, the converse can be established for infinite sequences, through a celebrated representation theorem by Bruno de Finetti (later extended by other probability theorists such as Halmos and Savage). The extended versions of the theorem show that in any infinite sequence of exchangeable random variables, the random variables are conditionally independent and identically distributed, given the underlying distributional form. This theorem is stated briefly below. (De Finetti's original theorem only showed this to be true for random indicator variables, but this was later extended to encompass all sequences of random variables.) Another way of putting this is that de Finetti's theorem characterizes exchangeable sequences as mixtures of i.i.d. sequences: while an exchangeable sequence need not itself be unconditionally i.i.d., it can be expressed as a mixture of underlying i.i.d. sequences.[1]

This means that infinite sequences of exchangeable random variables can be regarded equivalently as sequences of conditionally i.i.d. random variables, based on some underlying distributional form. (Note that this equivalence does not quite hold for finite exchangeability. However, for finite vectors of random variables there is a close approximation to the i.i.d. model.) An infinite exchangeable sequence is strictly stationary, and so a law of large numbers in the form of the Birkhoff–Khinchin theorem applies.[4] This means that the underlying distribution can be given an operational interpretation as the limiting empirical distribution of the sequence of values. The close relationship between exchangeable sequences of random variables and the i.i.d. form means that the latter can be justified on the basis of infinite exchangeability. This notion is central to Bruno de Finetti's development of predictive inference and to Bayesian statistics. It can also be shown to be a useful foundational assumption in frequentist statistics and to link the two paradigms.[8]
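
As a minimal sketch of this conditionally i.i.d. reading (the beta-Bernoulli scheme and the function name exchangeable_bernoulli are illustrative assumptions, not anything prescribed by the sources), the following Python code first draws a latent success probability and then generates trials that are i.i.d. given that draw. Unconditionally the trials are exchangeable but positively correlated, and their empirical frequency recovers the latent parameter, matching the operational interpretation described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def exchangeable_bernoulli(n, a=2.0, b=2.0):
    """Generate one realisation of an exchangeable 0/1 sequence as a
    mixture of i.i.d. sequences: first draw a latent success probability
    theta from a Beta(a, b) 'underlying distributional form', then draw
    n Bernoulli(theta) trials that are i.i.d. conditional on theta."""
    theta = rng.beta(a, b)
    return theta, rng.binomial(1, theta, size=n)

theta, x = exchangeable_bernoulli(100_000)

# Operational interpretation: the empirical frequency of the sequence
# recovers the latent parameter theta (law of large numbers, applied
# conditionally on theta).
print(f"latent theta = {theta:.4f}, empirical frequency = {x.mean():.4f}")

# Unconditionally the X_i are NOT independent: mixing over theta induces
# a positive correlation between any two trials.
samples = np.array([exchangeable_bernoulli(2)[1] for _ in range(50_000)])
print("estimated corr(X1, X2) ≈", np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])
```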

The Representation Theorem: This statement is based on the presentation in O'Neill (2009) in the references below. Given an infinite sequence of random variables X1, X2, X3, ... we define the limiting empirical distribution function FX by:

FX(x) = lim(n → ∞) (1/n) Σ(i = 1 to n) I(Xi ≤ x).

(This is the Cesàro limit of the indicator functions. In cases where the Cesàro limit does not exist this function can actually be defined as the Banach limit of the indicator functions, which is an extension of this limit. This latter limit always exists for sums of indicator functions, so the empirical distribution is always well-defined.) If the sequence X1, X2, X3, ... is exchangeable, then its elements are conditionally independent given FX, each with distribution function FX. This means that for any vector of random variables X1, ..., Xn in the sequence we have the joint distribution function given by:

Pr(X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn) = ∫ F(x1) F(x2) ··· F(xn) dP(F),

where P denotes the probability distribution of the random limiting empirical distribution FX. If the distribution function FX is indexed by another parameter θ then (with densities appropriately defined) we have:

p(x1, x2, ..., xn) = ∫ pθ(x1) pθ(x2) ··· pθ(xn) dP(θ).

These equations show the joint distribution or density characterised as a mixture distribution based on the underlying limiting empirical distribution (or a parameter indexing this distribution).
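
The mixture formula can be checked numerically in a simple special case. In the sketch below (our construction; the uniform Beta(1, 1) mixing distribution is an arbitrary choice), the limiting empirical distribution is Bernoulli(θ) with θ drawn from a beta distribution, so the integral above has a closed form in terms of the beta function, which a Monte Carlo average over θ reproduces.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def log_beta(a, b):
    """Logarithm of the beta function B(a, b) = Gamma(a)Gamma(b)/Gamma(a+b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def mixture_probability(x, a=1.0, b=1.0):
    """P(X1 = x1, ..., Xn = xn) for an exchangeable 0/1 sequence whose limiting
    empirical distribution is Bernoulli(theta) with theta ~ Beta(a, b): the
    integral of prod_i theta^(x_i) (1 - theta)^(1 - x_i) over dP(theta) equals
    B(a + k, b + n - k) / B(a, b), where k is the number of ones."""
    n, k = len(x), sum(x)
    return math.exp(log_beta(a + k, b + n - k) - log_beta(a, b))

x = (1, 0, 1, 1)  # note: any reordering of x gives the same probability
print("mixture formula:", mixture_probability(x))

# Monte Carlo check of the representation: average the conditional i.i.d.
# probability prod_i theta^(x_i) (1 - theta)^(1 - x_i) over draws of theta.
thetas = rng.beta(1.0, 1.0, size=200_000)
cond = np.prod([thetas**xi * (1 - thetas) ** (1 - xi) for xi in x], axis=0)
print("Monte Carlo estimate:", cond.mean())
```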

Note that not all finite exchangeable sequences are mixtures of i.i.d. sequences. To see this, consider sampling without replacement from a finite set until no elements are left. The resulting sequence is exchangeable, but not a mixture of i.i.d. sequences: indeed, conditioned on all of the other elements in the sequence, the remaining element is known with certainty.
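
A tiny enumeration makes the point concrete (the set {1, 2, 3} is an arbitrary illustrative choice):

```python
from itertools import permutations

# Exhaustive sampling without replacement from {1, 2, 3}: every ordering
# is equally likely, so the sequence (X1, X2, X3) is exchangeable.
outcomes = list(permutations([1, 2, 3]))

# But it is not a mixture of i.i.d. sequences: once X1 and X2 are seen,
# X3 is known with certainty in every outcome.
for x1, x2, x3 in outcomes:
    remaining = ({1, 2, 3} - {x1, x2}).pop()
    assert x3 == remaining
print("X3 is a deterministic function of (X1, X2) in all", len(outcomes), "orderings")
```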

Covariance and Correlation

Exchangeable sequences have some basic covariance and correlation properties, which mean that they are generally positively correlated. For infinite sequences of exchangeable random variables, the covariance between any two of the random variables is equal to the variance of the mean of the underlying distribution function.[8] For finite exchangeable sequences the covariance is also a fixed value which does not depend on the particular pair of random variables chosen, but it is subject only to a weaker lower bound than in the infinite case, so negative correlation is possible.


Covariance for exchangeable sequences (infinite): If the sequence X1, X2, X3, ... is exchangeable, then:

cov(Xi, Xj) = var(E(X1 | FX)) ≥ 0 for all i ≠ j,

where E(X1 | FX) is the mean of the limiting empirical distribution FX.


Covariance for exchangeable sequences (finite): If X1, X2, ..., Xn is exchangeable with σ² = var(Xi), then:

cov(Xi, Xj) ≥ −σ² / (n − 1) for all i ≠ j.

The finite sequence result may be proved as follows. Using the fact that the values X1, X2, ..., Xn are exchangeable, we have:

0 ≤ var(X1 + X2 + ··· + Xn) = n var(X1) + n(n − 1) cov(X1, X2) = nσ² + n(n − 1) cov(X1, X2).

We can then solve the inequality for the covariance yielding the stated lower bound. The non-negativity of the covariance for the infinite sequence can then be obtained as a limiting result from this finite sequence result.


Equality of the lower bound for finite sequences is achieved in a simple urn model: An urn contains 1 red marble and n − 1 green marbles, and these are sampled without replacement until the urn is empty. Let Xi = 1 if the red marble is drawn on the ith trial and 0 otherwise. A finite sequence that achieves the lower covariance bound cannot be extended to a longer exchangeable sequence.[9]
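
The following short calculation (added here for illustration, with n = 5 chosen arbitrarily) enumerates the n equally likely positions of the red marble and confirms that the covariance equals the lower bound −σ²/(n − 1), which for this urn works out to −1/n².

```python
import numpy as np

def urn_covariance(n):
    """Exact covariance cov(X1, X2) and variance var(X1) for the urn with
    1 red and n - 1 green marbles drawn without replacement, where Xi
    indicates 'red marble on draw i'. Computed by enumerating the n
    equally likely positions of the red marble."""
    X = np.eye(n)                        # row k = outcome with red on draw k+1
    mean = X.mean(axis=0)                # P(Xi = 1) = 1/n for every i
    cov = (X - mean).T @ (X - mean) / n  # covariance matrix over the n outcomes
    return cov[0, 1], cov[0, 0]

n = 5
c, var = urn_covariance(n)
print(f"cov(X1, X2) = {c:.4f}, lower bound -var/(n-1) = {-var / (n - 1):.4f}")
# Both equal -1/n**2: the finite lower bound is attained with equality.
```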

Examples

Applications

The von Neumann extractor is a randomness extractor that depends on exchangeability: it gives a method to take an exchangeable sequence of 0s and 1s (Bernoulli trials), with some probability p of 0 and 1 − p of 1, and produce a (shorter) exchangeable sequence of 0s and 1s with probability 1/2.

Partition the sequence into non-overlapping pairs: if the two elements of the pair are equal (00 or 11), discard it; if the two elements of the pair are unequal (01 or 10), keep the first. This yields a sequence of Bernoulli trials with p = 1/2, since, by exchangeability, the odds of a given pair being 01 or 10 are equal.
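
The procedure translates directly into code. The sketch below implements the pairing rule just described; feeding it a biased i.i.d. Bernoulli source (one simple example of an exchangeable sequence, with p = 0.8 chosen arbitrarily) yields output bits with frequency close to 1/2.

```python
import random

def von_neumann_extractor(bits):
    """Von Neumann's procedure: pair up the input bits, drop equal pairs
    (00 or 11), and output the first bit of each unequal pair (01 -> 0,
    10 -> 1). For an exchangeable input the output bits are unbiased."""
    out = []
    for first, second in zip(bits[0::2], bits[1::2]):
        if first != second:
            out.append(first)
    return out

# Example: a biased exchangeable source (i.i.d. Bernoulli with p = 0.8 of zero).
random.seed(42)
raw = [0 if random.random() < 0.8 else 1 for _ in range(100_000)]
extracted = von_neumann_extractor(raw)
print(f"input frequency of 1s:  {sum(raw) / len(raw):.3f}")
print(f"output frequency of 1s: {sum(extracted) / len(extracted):.3f}  "
      f"(about 0.5, from {len(extracted)} output bits)")
```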

Exchangeable random variables arise in the study of U-statistics, particularly in the Hoeffding decomposition.[11]

See also

Notes

  1. In short, the order of the sequence of random variables does not affect its joint probability distribution.
    • Chow, Yuan Shih; Teicher, Henry (1997). Probability Theory: Independence, Interchangeability, Martingales. Springer Texts in Statistics (3rd ed.). New York: Springer. xxii+488 pp. ISBN 0-387-98228-0.
  2. Aldous, David J. (1985). "Exchangeability and related topics". École d'Été de Probabilités de Saint-Flour XIII — 1983. Lecture Notes in Mathematics 1117. Berlin: Springer. pp. 1–198. ISBN 978-3-540-15203-3. doi:10.1007/BFb0099421.
  3. Diaconis, Persi (2009). "Book review: Probabilistic symmetries and invariance principles (Olav Kallenberg, Springer, New York, 2005)". Bulletin of the American Mathematical Society (New Series). 46 (4): 691–696. doi:10.1090/S0273-0979-09-01262-2. MR 2525743.
  4. Kallenberg, O. (2005). Probabilistic Symmetries and Invariance Principles. New York: Springer-Verlag. 510 pp. ISBN 0-387-25115-4.
  5. Zabell (1992)
  6. Barlow & Irony (1992)
  7. Bergman (2009)
  8. O'Neill, B. (2009). "Exchangeability, Correlation and Bayes' Effect". International Statistical Review 77 (2): 241–250.
  9. Taylor, Robert Lee; Daffer, Peter Z.; Patterson, Ronald F. (1985). Limit Theorems for Sums of Exchangeable Random Variables. Rowman and Allanheld. pp. 1–152.
  10. Spizzichino, Fabio (2001). Subjective Probability Models for Lifetimes. Monographs on Statistics and Applied Probability 91. Boca Raton, FL: Chapman & Hall/CRC. xx+248 pp. ISBN 1-58488-060-0.
  11. Borovskikh, Yu. V. (1996). "Chapter 10: Dependent variables". U-Statistics in Banach Spaces. Utrecht: VSP. pp. 365–376. ISBN 90-6764-200-2. MR 1419498.

Bibliography
