Informed Choices: How to Calibrate ABC with Hypothesis Testing

Authored by: Oliver Ratmann, Anton Camacho, Sen Hu, Caroline Colijn

Handbook of Approximate Bayesian Computation

Print publication date:  August  2018
Online publication date:  August  2018

Print ISBN: 9781439881507
eBook ISBN: 9781315117195

10.1201/9781315117195-11


Abstract

Approximate Bayesian computation (ABC) proceeds by summarising the data, simulating from the model, comparing simulated summaries to observed summaries with an ABC distance function, and accepting the simulated summaries if they do not differ from the observed summaries by more than a user-defined ABC tolerance parameter. These steps are repeated over many Monte Carlo iterations to obtain an approximation to the true posterior density of the model parameters. The process by which precise ABC tolerances and ABC distance functions can be obtained is often referred to as ‘ABC calibration’. The calibrations that we describe here apply to the binary ABC accept/reject kernel, which evaluates to zero or one. They use decision-theoretic arguments to construct the ABC accept/reject step, so that the ABC approximation to the posterior density enjoys certain desirable properties. This book chapter aims to give an introduction to ABC calibrations based on hypothesis testing, and to the abc.star R package that implements them for the most commonly occurring scenarios.
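The basic ABC accept/reject scheme described above can be sketched in a few lines. The chapter's abc.star package is written in R; the following is only an illustrative Python sketch of plain rejection ABC with a binary 0/1 kernel, not the calibrated procedure the chapter develops. The toy model (inferring a normal mean with known standard deviation), the summary statistic (sample mean), and the tolerance value are all assumptions chosen for illustration.

```python
import random
import statistics

def abc_rejection(observed_summary, prior_sampler, simulator, summary,
                  distance, tolerance, n_iter=5000, seed=0):
    """Plain rejection ABC with a binary accept/reject kernel:
    keep theta when the simulated summary lies within `tolerance`
    of the observed summary, otherwise discard it."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_iter):
        theta = prior_sampler(rng)            # draw from the prior
        sim_data = simulator(theta, rng)      # simulate from the model
        if distance(summary(sim_data), observed_summary) <= tolerance:
            accepted.append(theta)            # binary kernel evaluates to 1
    return accepted                           # samples from the ABC posterior

# Toy example: infer the mean of a normal distribution with known sd = 1.
rng0 = random.Random(42)
observed = [rng0.gauss(2.0, 1.0) for _ in range(100)]
obs_summary = statistics.mean(observed)

posterior_draws = abc_rejection(
    observed_summary=obs_summary,
    prior_sampler=lambda r: r.uniform(-5.0, 5.0),  # flat prior on the mean
    simulator=lambda th, r: [r.gauss(th, 1.0) for _ in range(100)],
    summary=statistics.mean,
    distance=lambda a, b: abs(a - b),
    tolerance=0.2,                                 # user-defined ABC tolerance
)
```

The tolerance trades off accuracy against acceptance rate: a smaller value gives a better approximation to the true posterior but rejects more simulations, which is precisely the trade-off that the calibration procedures in this chapter make principled.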
