Bayesian revision of a prior given prior-data conflict, expert opinion, or a similar insight: A large-deviation approach

Description
Title: Bayesian revision of a prior given prior-data conflict, expert opinion, or a similar insight: A large-deviation approach
Authors: Bickel, David R.
Date: 2015-12-31
Abstract: Learning from model diagnostics that a prior distribution must be replaced by one that conflicts less with the data raises the question of which prior should be used for inference and decision. The same problem arises when a decision maker learns that one or more reliable experts express unexpected beliefs. In both cases, coherence of the solution would be guaranteed by formally stating the problem in terms suitable for using Bayes's theorem to condition on the insight that the prior distribution lies in a closed convex set potentially differing from that of the initial prior. A readily available distribution of priors needed for such conditioning in the finite-sample setting is the law of the joint empirical distribution of infinitely many independent parameter values drawn from the initial prior. The solution is the prior distribution that minimizes the entropy relative to the initial prior according to the Gibbs conditioning principle from the theory of large deviations. While minimizing relative entropy accommodates the necessity of going beyond the initial prior without departing from it any more than the insight demands, its derivation from the Gibbs conditioning principle ensures the advantages of Bayesian coherence. This approach is generalized to uncertain constraints by allowing the closed convex set of priors to be random. The distribution of that constraint set may in some cases arise from a confidence distribution such as the one that controls the probability of observing misleading statistical evidence.
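The abstract's central prescription, revising the prior by minimizing entropy relative to the initial prior over the constraint set, can be summarized in one formula. The LaTeX sketch below is illustrative only: the symbols pi_0 (initial prior), Gamma (closed convex constraint set), and pi* (revised prior) are notation assumed here, not taken from the paper.

% Illustrative sketch only; \pi_0, \Gamma, and \pi^* are assumed notation, not the paper's.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Given the initial prior $\pi_{0}$ and a closed convex set of priors $\Gamma$
encoding the insight, the revised prior is the minimum-relative-entropy
(I-projection) solution
\[
  \pi^{*} = \operatorname*{arg\,min}_{\pi \in \Gamma} D(\pi \,\|\, \pi_{0}),
  \qquad
  D(\pi \,\|\, \pi_{0}) = \int \log\frac{d\pi}{d\pi_{0}} \, d\pi ,
\]
the prior in $\Gamma$ closest to $\pi_{0}$ in relative entropy, which the Gibbs
conditioning principle identifies as the limiting conditional law.
\end{document}

As a standard special case (not stated in the abstract), when Gamma is defined by an expectation constraint, the minimizer is an exponentially tilted version of the initial prior.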
URL: http://davidbickel.com
http://hdl.handle.net/10393/34089
Collection: Mathématiques et statistiques // Mathematics and Statistics
Files
entropy-draft.pdf (472.72 kB, Adobe PDF)