Rademacher complexity

In computational learning theory (machine learning and theory of computation), Rademacher complexity, named after Hans Rademacher, measures the richness of a class of real-valued functions with respect to a probability distribution.

Given a training sample $S = (z_1, z_2, \dots, z_m) \in Z^m$ and a class $\mathcal{F}$ of real-valued functions defined on a domain space $Z$, the empirical Rademacher complexity of $\mathcal{F}$ given $S$ is defined as:

$$\widehat{\mathcal{R}}_S(\mathcal{F}) = \mathbb{E}_{\sigma}\left[\,\sup_{f \in \mathcal{F}} \frac{1}{m} \sum_{i=1}^{m} \sigma_i f(z_i)\,\right]$$

where $\sigma_1, \sigma_2, \dots, \sigma_m$ are independent random variables drawn from the Rademacher distribution, i.e. $\Pr(\sigma_i = +1) = \Pr(\sigma_i = -1) = 1/2$ for $i = 1, 2, \dots, m$.
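
For intuition, here is a minimal Monte Carlo sketch (Python with NumPy) that estimates the empirical Rademacher complexity of a finite function class on a fixed sample; the function name `empirical_rademacher` and the convention of passing the class as a precomputed value matrix are assumptions made for this example, not a standard API.

```python
import numpy as np

def empirical_rademacher(fvals, n_draws=10_000, rng=None):
    """Monte Carlo estimate of the empirical Rademacher complexity.

    fvals : array of shape (k, m); row j holds (f_j(z_1), ..., f_j(z_m)),
        the values of the j-th function on the fixed sample S.
        (Hypothetical input convention chosen for this sketch.)
    """
    rng = np.random.default_rng(rng)
    k, m = fvals.shape
    # Draw Rademacher sign vectors sigma in {-1, +1}^m.
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, m))
    # corr[d, j] = (1/m) * sum_i sigma[d, i] * f_j(z_i)
    corr = sigma @ fvals.T / m
    # Take the sup over the class, then average over the sigma draws.
    return corr.max(axis=1).mean()
```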

Let $P$ be a probability distribution over $Z$. The Rademacher complexity of the function class $\mathcal{F}$ with respect to $P$ for sample size $m$ is:

$$\mathcal{R}_m(\mathcal{F}) = \mathbb{E}_{S \sim P^m}\left[\,\widehat{\mathcal{R}}_S(\mathcal{F})\,\right]$$

where the above expectation is taken over an independent and identically distributed (i.i.d.) sample $S = (z_1, z_2, \dots, z_m)$ generated according to $P$.
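
Continuing the sketch above, the distribution-dependent complexity can be estimated by averaging the empirical complexity over fresh i.i.d. samples; `sample_fvals` is a hypothetical helper, assumed to draw $S \sim P^m$ and return the value matrix of the class on that sample.

```python
def rademacher_complexity(sample_fvals, n_samples=200, **kw):
    """R_m(F): average of the empirical complexity over i.i.d. samples S ~ P^m."""
    return np.mean([empirical_rademacher(sample_fvals(), **kw)
                    for _ in range(n_samples)])
```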

One can show, for example, that there exists a constant $C$ such that any class of $\{0,1\}$-indicator functions with Vapnik–Chervonenkis dimension $d$ has Rademacher complexity upper-bounded by $C\sqrt{d/m}$.
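
Under the assumptions of the sketch above, this $O(\sqrt{d/m})$ decay can be sanity-checked on threshold indicators over $[0, 1]$, a class of VC dimension 1 (the grid of 101 thresholds is an arbitrary finite discretization chosen for the example):

```python
rng = np.random.default_rng(0)
thresholds = np.linspace(0.0, 1.0, 101)
for m in (10, 100, 1000):
    z = rng.uniform(size=m)                       # sample from P = U[0, 1]
    fvals = (z[None, :] >= thresholds[:, None]).astype(float)  # indicator values
    print(m, empirical_rademacher(fvals, rng=1))  # shrinks roughly like sqrt(1/m)
```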

Gaussian complexity

Gaussian complexity is a similar measure with a similar physical meaning. It can be obtained from the previous definitions by replacing the Rademacher variables $\sigma_i$ with Gaussian i.i.d. random variables $g_i$ of zero mean and variance 1, i.e. $g_i \sim \mathcal{N}(0, 1)$.
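
Under the same assumptions as the earlier sketch, the only change is the distribution of the random weights:

```python
def empirical_gaussian(fvals, n_draws=10_000, rng=None):
    """Same Monte Carlo sketch with g_i ~ N(0, 1) replacing the Rademacher signs."""
    rng = np.random.default_rng(rng)
    k, m = fvals.shape
    g = rng.standard_normal(size=(n_draws, m))  # Gaussian weights
    return (g @ fvals.T / m).max(axis=1).mean()
```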
