
## Universal Convergence of Semimeasures on Individual Random Sequences

- Authors: Marcus Hutter and An. A. Muchnik (2004)
- Comments: 16 pages
- Subj-class: Learning; Artificial Intelligence; Computational Complexity; Information Theory
- Reference: Proceedings of the 15th International Conference on Algorithmic Learning Theory (ALT 2004), pages 234-248
- Report-no: IDSIA-14-04 and cs.LG/0407057
- Paper: LaTeX - PostScript - PDF - Html/Gif
- Slides: PostScript - PDF

Keywords: Sequence prediction; Algorithmic Information Theory; universal enumerable semimeasure; mixture distributions; posterior convergence; Martin-Löf randomness; quasimeasures.

Abstract: Solomonoff's central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating posterior mu, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown mu. Despite some nearby results and proofs in the literature, the stronger result of convergence for all (Martin-Löf) random sequences remained open. Such a convergence result would be particularly interesting and natural, since randomness can be defined in terms of M itself. We show that there are universal semimeasures M which do not converge for all random sequences, i.e. we give a partial negative answer to the open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D and D to mu on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.
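For readers unfamiliar with the distance mentioned at the end of the abstract: the Hellinger distance between two discrete distributions P and Q is H(P, Q) = (1/sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))^2), ranging from 0 (identical) to 1 (disjoint support). A minimal illustrative sketch (not code from the paper) is:

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete distributions,
    given as equal-length sequences of probabilities.
    H(P, Q) = (1/sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))**2).
    """
    return math.sqrt(sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                         for pi, qi in zip(p, q))) / math.sqrt(2)

# Identical distributions are at distance 0.
print(hellinger([0.5, 0.5], [0.5, 0.5]))  # 0.0
# Distributions with disjoint support are at the maximal distance 1.
print(hellinger([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

The paper works with such distances between posteriors of (semi)measures over sequences; the sketch above only shows the finite, discrete case of the definition.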


@InProceedings{Hutter:04mlconvx,
  author    = "M. Hutter and An. A. Muchnik",
  title     = "Universal Convergence of Semimeasures on Individual Random Sequences",
  booktitle = "Proc. 15th International Conf. on Algorithmic Learning Theory ({ALT-2004})",
  address   = "Padova",
  series    = "LNAI",
  volume    = "3244",
  editor    = "S. Ben-David and J. Case and A. Maruoka",
  publisher = "Springer, Berlin",
  pages     = "234--248",
  year      = "2004",
  http      = "http://www.hutter1.net/ai/mlconvx.htm",
  url       = "http://arxiv.org/abs/cs.LG/0407057",
  ftp       = "ftp://ftp.idsia.ch/pub/techrep/IDSIA-14-04.pdf",
  keywords  = "Sequence prediction; Algorithmic Information Theory; universal enumerable semimeasure; mixture distributions; posterior convergence; Martin-L{\"o}f randomness; quasimeasures.",
  abstract  = "Solomonoff's central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating posterior mu, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown mu. Despite some nearby results and proofs in the literature, the stronger result of convergence for all (Martin-Loef) random sequences remained open. Such a convergence result would be particularly interesting and natural, since randomness can be defined in terms of M itself. We show that there are universal semimeasures M which do not converge for all random sequences, i.e. we give a partial negative answer to the open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D and D to mu on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.",
}
