Imprecise probabilities for representing ignorance about a parameter
Abstract
This paper distinguishes between objective probability, or chance, and subjective probability. Most statistical methods in machine learning assume a random experiment from which we obtain a set of observations. This random experiment can be identified with a chance, or objective probability, but these probabilities depend on some unknown parameters. Our knowledge of these parameters is not objective, and in order to learn about them we must assess epistemic probabilities about their values. In some cases, our objective knowledge about these parameters is vacuous, so the question arises: what epistemic probabilities should be assumed? In this paper we argue for assuming non-vacuous interval probabilities (a proper subset of [0, 1]). There are several reasons for this; some are based on the betting interpretation of epistemic probabilities, while others are based on the learning capabilities under the vacuous representation. The implications of the choice of epistemic probabilities for concepts such as conditioning and learning are studied. It is shown that, in order to maintain reasonable learning capabilities, we must assume more informative prior models than those frequently used in the literature, such as the imprecise Dirichlet model.
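To make the imprecise Dirichlet model (IDM) mentioned above concrete, the following is a minimal sketch (not taken from the paper) of Walley's IDM posterior intervals: with hyperparameter s, after observing count n_i for a category out of N total trials, the probability of that category is bracketed by [n_i / (N + s), (n_i + s) / (N + s)]. The function name and the example counts are illustrative choices, not anything specified in the paper.

```python
def idm_interval(counts, category, s=2.0):
    """Posterior (lower, upper) probability bounds for one category
    under the imprecise Dirichlet model with hyperparameter s."""
    n_total = sum(counts.values())  # N, the total number of observations
    n_i = counts[category]          # observed count for this category
    lower = n_i / (n_total + s)
    upper = (n_i + s) / (n_total + s)
    return lower, upper

# Illustrative data: 7 heads and 3 tails observed in 10 coin flips.
counts = {"heads": 7, "tails": 3}
lo, hi = idm_interval(counts, "heads", s=2.0)
print(lo, hi)  # 7/12 and 9/12
```

Note that with no observations the interval is the vacuous [0, 1], which is exactly the prior representation whose learning behaviour the paper scrutinises; as N grows, the interval width s / (N + s) shrinks toward a point estimate.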
