The Formal Definition of Reference Priors Under a General Class of Divergence
Author: Tri Minh Le
Release: 2014
ISBN-10: OCLC:905659409
Rating: 4/5 (09 Downloads)
Book excerpt: Bayesian analysis has recently seen wide use in both the theory and application of statistics. The choice of prior plays a key role in any Bayesian analysis. There are two types of priors: subjective priors and objective priors. In practice, however, the difficulty of subjective elicitation and time constraints frequently restrict us to objective priors constructed by formal rules. In this dissertation, our methodology uses reference analysis to derive objective priors. Objective Bayesian inference depends only on the assumed model and the available data; the prior distribution used to make an inference is least informative in a certain information-theoretic sense. Berger, Bernardo and Sun (2009) recently gave a rigorous derivation of reference priors under Kullback-Leibler divergence. In the special case of common support, under additional regularity conditions, Ghosh, Mergel and Liu (2011) derived a general f-divergence criterion for prior selection. We generalize Ghosh, Mergel and Liu's (2011) results to the case without common support and show how an explicit expression for the reference prior can be obtained under posterior consistency. This explicit expression can be used to derive new reference priors both analytically and numerically.
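The abstract notes that explicit expressions for reference priors can be evaluated numerically. As a small illustrative sketch only (not code from the dissertation): in a regular one-parameter model, the reference prior under Kullback-Leibler divergence coincides with the Jeffreys prior, π(θ) ∝ √I(θ), where I(θ) is the Fisher information. For a Bernoulli(θ) model this gives the Beta(1/2, 1/2) distribution, which the sketch below recovers by normalizing √I(θ) on a grid. All function names here are illustrative, not taken from the source.

```python
import math

def fisher_information_bernoulli(theta):
    """Fisher information of a single Bernoulli(theta) observation: 1/(theta(1-theta))."""
    return 1.0 / (theta * (1.0 - theta))

def reference_prior_kernel(theta):
    """Unnormalized Jeffreys/reference prior kernel sqrt(I(theta))."""
    return math.sqrt(fisher_information_bernoulli(theta))

def beta_half_half_density(theta):
    """Beta(1/2, 1/2) density, the normalized Jeffreys prior for Bernoulli."""
    return 1.0 / (math.pi * math.sqrt(theta * (1.0 - theta)))

# Normalize sqrt(I(theta)) numerically on a midpoint grid over (0, 1).
# The normalizing constant should approach pi, since
# integral of 1/sqrt(theta(1-theta)) over (0, 1) equals pi.
n = 100000
grid = [(i + 0.5) / n for i in range(n)]
kernel = [reference_prior_kernel(t) for t in grid]
norm = sum(kernel) / n          # midpoint-rule estimate of the integral
prior = [k / norm for k in kernel]

# The numerically normalized prior should match Beta(1/2, 1/2) closely
# at interior points (the singular endpoints converge more slowly).
for t in (0.1, 0.5, 0.9):
    i = int(t * n)
    assert abs(prior[i] / beta_half_half_density(grid[i]) - 1.0) < 0.01
```

In this regular single-parameter case the grid computation is only a sanity check against the closed form; the dissertation's setting of interest is precisely where such closed forms are unavailable (non-common support, general f-divergences) and a numerical route to the reference prior becomes useful.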