Abstract: Two major approaches have developed within Bayesian statistics to address uncertainty in the prior distribution and in the overall model more generally. First, methods of model checking, including those assessing prior-data conflict, determine whether the prior and the rest of the model are adequate for purposes of inference and estimation or other decision-making. The main drawback of this approach is that it provides little guidance for inference in the event that the model is found to be inadequate, that is, in conflict with the data. Second, the robust Bayes approach determines the sensitivity of inferences and decisions to the prior distribution and other model assumptions. This approach includes rules for making decisions on the basis of a set of posterior distributions corresponding to the set of reasonable model assumptions. Drawbacks of the second approach include the inability to criticize the set of models and the lack of guidance for specifying such a set.
These two approaches to model uncertainty are combined in order to overcome the limitations of each. The first approach checks every model within a large class to assess which are in conflict with the data and which are adequate for purposes of data analysis. The resulting set of adequate models is then used for inference according to decision rules developed for the robust Bayes approach and for imprecise probability more generally. The proposed framework is illustrated by the application of a class of hierarchical models to a simple data set.
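As a concrete illustration of the two-step procedure described above, the following Python sketch (not from the paper; the data, the class of priors, and the threshold are all hypothetical) screens a class of conjugate normal priors with a prior-predictive p-value and then summarizes the surviving posteriors by the range of their means, in the spirit of robust Bayes/imprecise probability:

```python
from math import erfc, sqrt

import numpy as np

# Hypothetical data: assumed x_i ~ Normal(theta, sigma^2) with sigma = 1 known.
data = np.array([1.2, 0.8, 1.5, 1.1, 0.9, 1.3])
n, xbar, sigma = len(data), data.mean(), 1.0

# Candidate class of priors: theta ~ Normal(m, t^2) over a grid of prior means.
prior_means = np.linspace(-3.0, 3.0, 13)
prior_sd = 1.0

def prior_predictive_pvalue(m, t):
    """Two-sided prior-predictive p-value for the sample mean.

    Under theta ~ N(m, t^2) and x_i ~ N(theta, sigma^2), the prior
    predictive distribution of xbar is N(m, t^2 + sigma^2 / n).
    """
    sd = sqrt(t**2 + sigma**2 / n)
    z = abs(xbar - m) / sd
    return erfc(z / sqrt(2))  # two-sided normal tail probability

# Step 1 (model checking): discard priors in conflict with the data.
alpha = 0.05
adequate = [m for m in prior_means
            if prior_predictive_pvalue(m, prior_sd) >= alpha]

# Step 2 (robust Bayes): posterior means under each adequate prior.
def posterior_mean(m, t):
    w = (n / sigma**2) / (n / sigma**2 + 1 / t**2)  # shrinkage weight on xbar
    return w * xbar + (1 - w) * m

post_means = [posterior_mean(m, prior_sd) for m in adequate]
print("adequate prior means:", adequate)
print("posterior mean range: [%.3f, %.3f]" % (min(post_means), max(post_means)))
```

With these hypothetical inputs, priors centered too far from the data are rejected by the conflict check, and inference is reported as an interval of posterior means rather than a single number, reflecting the remaining set of adequate models.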