
Prediction with Expert Advice by Following the Perturbed Leader for General Weights


Authors: Marcus Hutter and Jan Poland (2004)
Comments: 16 pages
Subj-class: Learning; Artificial Intelligence
Reference: Proceedings of the 15th International Conference on Algorithmic Learning Theory (ALT 2004), pages 279-293
Report-no: IDSIA-08-04 and cs.LG/0405043
Paper: LaTeX  -  PostScript  -  PDF  -  Html/Gif 
Slides: PostScript - PDF

Keywords: Prediction with Expert Advice, Follow the Perturbed Leader, general weights, adaptive learning rate, hierarchy of experts, expected and high probability bounds, general alphabet and loss, online sequential prediction.

Abstract: When applying aggregating strategies to Prediction with Expert Advice, the learning rate must be adaptively tuned. The natural choice of sqrt(complexity/current loss) renders the analysis of Weighted Majority derivatives quite complicated. In particular, for arbitrary weights there have been no results proven so far. The analysis of the alternative "Follow the Perturbed Leader" (FPL) algorithm from Kalai & Vempala (2003) (based on Hannan's algorithm) is easier. We derive loss bounds for adaptive learning rate and both finite expert classes with uniform weights and countable expert classes with arbitrary weights. For the former setup, our loss bounds match the best known results so far, while for the latter our results are new.
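As a rough illustration of the setup described in the abstract, the Python sketch below picks, in each round, the expert minimizing an exponentially perturbed and complexity-penalized cumulative loss, scaled by a learning rate of order sqrt(complexity/current loss). This is a minimal sketch of the FPL idea only, not the paper's exact algorithm or tuning: the perturbation distribution, the particular learning-rate formula, and all function names are assumptions made here for illustration.

import math
import random

def fpl_choose(cum_losses, complexities, eta):
    """One FPL decision: pick the expert minimizing the exponentially
    perturbed, complexity-penalized, eta-scaled cumulative loss."""
    scores = [eta * L + k - random.expovariate(1.0)
              for L, k in zip(cum_losses, complexities)]
    return min(range(len(scores)), key=scores.__getitem__)

def run_fpl(loss_matrix, complexities):
    """Online FPL loop with an adaptive learning rate of order
    sqrt(complexity / current loss). loss_matrix[t][i] is expert i's
    loss in round t, assumed to lie in [0, 1]."""
    n = len(complexities)
    cum = [0.0] * n          # cumulative loss of each expert
    algo_loss = 0.0          # cumulative loss of the FPL learner
    for round_losses in loss_matrix:
        # Illustrative adaptive rate; the paper's exact tuning may differ.
        eta = math.sqrt(max(complexities) / max(algo_loss, 1.0))
        i = fpl_choose(cum, complexities, eta)
        algo_loss += round_losses[i]
        for j in range(n):
            cum[j] += round_losses[j]
    return algo_loss

In this sketch, a finite class of n experts with uniform weights would use complexities all equal to ln n, while a countable class with prior weights w_i would use k_i = -ln w_i, matching the two setups treated in the paper.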


BibTeX Entry

@InProceedings{Hutter:04expert,
  author =       "M. Hutter and J. Poland",
  title =        "Prediction with Expert Advice by Following the Perturbed Leader for General Weights",
  booktitle =    "Proc. 15th International Conf. on Algorithmic Learning Theory ({ALT-2004})",
  address =      "Padova",
  series =       "LNAI",
  volume =       "3244",
  editor =       "S. Ben-David and J. Case and A. Maruoka",
  publisher =    "Springer, Berlin",
  pages =        "279--293",
  year =         "2004",
  http =         "http://www.hutter1.net/ai/expert.htm",
  url =          "http://arxiv.org/abs/cs.LG/0405043",
  ftp =          "ftp://ftp.idsia.ch/pub/techrep/IDSIA-08-04.pdf",
  keywords =     "Prediction with Expert Advice, Follow the Perturbed Leader,
                  general weights, adaptive learning rate,
                  hierarchy of experts, expected and high probability bounds,
                  general alphabet and loss, online sequential prediction.",
  abstract =     "When applying aggregating strategies to Prediction with Expert
                  Advice, the learning rate must be adaptively tuned. The natural
                  choice of sqrt(complexity/current loss) renders the
                  analysis of Weighted Majority derivatives quite complicated. In
                  particular, for arbitrary weights there have been no results
                  proven so far. The analysis of the alternative ``Follow the
                  Perturbed Leader'' (FPL) algorithm from Kalai \& Vempala (2003) (based on
                  Hannan's algorithm) is easier. We derive loss bounds for adaptive
                  learning rate and both finite expert classes with uniform weights
                  and countable expert classes with arbitrary weights. For the
                  former setup, our loss bounds match the best known results so far,
                  while for the latter our results are new.",
}