On Generalized Computable Universal Priors and their Convergence
Keywords: Sequence prediction; Algorithmic Information Theory;
Solomonoff's prior; universal probability;
mixture distributions; posterior convergence;
computability concepts; Martin-Löf randomness.
Abstract: Solomonoff unified Occam's razor and Epicurus' principle of
multiple explanations into one elegant, formal, universal theory of
inductive inference, which initiated the field of algorithmic
information theory. His central result is that the posterior of
the universal semimeasure M converges rapidly to the true
sequence-generating posterior mu, if the latter is computable. Hence,
M is eligible as a universal predictor when mu is unknown. The first
part of the paper investigates the existence and convergence of
computable universal (semi)measures for a hierarchy of
computability classes: recursive, estimable, enumerable, and
approximable. For instance, M is known to be enumerable, but
not estimable, and to dominate all enumerable semimeasures. We
present proofs for discrete and continuous semimeasures. The
second part investigates more closely the types of convergence
possibly implied by universality: in difference and in ratio, with
probability 1, in mean sum, and for Martin-Löf random sequences.
We introduce a generalized concept of randomness for individual
sequences and use it to exhibit difficulties regarding these
issues. In particular, we show that convergence fails (holds) on
generalized-random sequences in gappy (dense) Bernoulli classes.
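For orientation, the central convergence result the abstract refers to can be sketched as follows; the binary alphabet, the domination constant 2^{-K(nu)}, and the bound K(mu) ln 2 are the standard textbook formulation, not details quoted from this page.

% M dominates every enumerable semimeasure nu (in particular a computable mu),
% up to a multiplicative constant:
\[ M(x) \;\geq\; 2^{-K(\nu)}\,\nu(x) \qquad \text{for all finite strings } x. \]
% Domination implies rapid posterior convergence in mean sum whenever the
% sequence is sampled from a computable measure mu:
\[ \sum_{t=1}^{\infty} \mathbf{E}_{\mu}\!\left[ \sum_{a\in\{0,1\}}
   \bigl( M(a\,|\,x_{<t}) - \mu(a\,|\,x_{<t}) \bigr)^{2} \right]
   \;\leq\; K(\mu)\,\ln 2 \;<\; \infty. \]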
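The four computability classes named for the hierarchy carry their usual meanings in the field; the informal definitions below are the standard ones, not quotations from the paper.

% recursive (finitely computable): f(x) is output exactly by some algorithm;
% estimable: some recursive g satisfies |g(x,k) - f(x)| < 1/k for all k;
% enumerable (lower semicomputable): f is the limit of an increasing
%   recursive sequence of rationals;
% approximable (limit computable): f is the limit of some recursive sequence
%   of rationals, not necessarily monotone.  These classes form a chain:
\[ \text{recursive} \;\subset\; \text{estimable} \;\subset\;
   \text{enumerable} \;\subset\; \text{approximable}. \]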
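The convergence modes listed for the second part, spelled out in the usual notation (again a sketch rather than a quotation):

% convergence in difference and in ratio of the one-step predictive probabilities:
\[ M(x_t\,|\,x_{<t}) - \mu(x_t\,|\,x_{<t}) \;\longrightarrow\; 0
   \qquad\text{and}\qquad
   \frac{M(x_t\,|\,x_{<t})}{\mu(x_t\,|\,x_{<t})} \;\longrightarrow\; 1
   \qquad (t\to\infty), \]
% which may be required to hold with mu-probability 1, in mean sum (finite
% expected sum of squared differences, as above), or on every individual
% Martin-Löf random sequence.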
BibTeX Entry
@Article{Hutter:05unipriorx,
author = "M. Hutter",
number = "IDSIA-05-05",
title = "On Generalized Computable Universal Priors and their Convergence",
year = "2005",
url = "http://www.hutter1.net/ai/unipriorx.htm",
http = "http://arxiv.org/abs/cs.LG/0503026",
ftp = "ftp://ftp.idsia.ch/pub/techrep/IDSIA-05-05.pdf",
keywords = "Sequence prediction; Algorithmic Information Theory;
Solomonoff's prior; universal probability;
mixture distributions; posterior convergence;
computability concepts; Martin-Loef randomness.",
abstract = "Solomonoff unified Occam's razor and Epicurus' principle of
multiple explanations into one elegant, formal, universal theory of
inductive inference, which initiated the field of algorithmic
information theory. His central result is that the posterior of
the universal semimeasure M converges rapidly to the true
sequence-generating posterior mu, if the latter is computable. Hence,
M is eligible as a universal predictor when mu is unknown. The first
part of the paper investigates the existence and convergence of
computable universal (semi)measures for a hierarchy of
computability classes: recursive, estimable, enumerable, and
approximable. For instance, M is known to be enumerable, but
not estimable, and to dominate all enumerable semimeasures. We
present proofs for discrete and continuous semimeasures. The
second part investigates more closely the types of convergence
possibly implied by universality: in difference and in ratio, with
probability 1, in mean sum, and for Martin-Loef random sequences.
We introduce a generalized concept of randomness for individual
sequences and use it to exhibit difficulties regarding these
issues. In particular, we show that convergence fails (holds) on
generalized-random sequences in gappy (dense) Bernoulli classes.",
}