Outperforming the Gibbs sampler empirical estimator for nearest-neighbor random fields

Priscilla E. Greenwood, Ian W. McKeague, Wolfgang Wefelmeyer

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

7 Citations (Scopus)

Abstract

Given a Markov chain sampling scheme, does the standard empirical estimator make best use of the data? We show that this is not so and construct better estimators. We restrict attention to nearest-neighbor random fields and to Gibbs samplers with deterministic sweep, but our approach applies to any sampler that uses reversible variable-at-a-time updating with deterministic sweep. The structure of the transition distribution of the sampler is exploited to construct further empirical estimators that are combined with the standard empirical estimator to reduce asymptotic variance. The extra computational cost is negligible. When the random field is spatially homogeneous, symmetrizations of our estimator lead to further variance reduction. The performance of the estimators is evaluated in a simulation study of the Ising model.
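To make the setting concrete, here is a minimal sketch of a deterministic-sweep Gibbs sampler for the Ising model together with the standard empirical estimator the abstract refers to. It also includes a Rao-Blackwellized (conditional-mean) variant as one classical illustration of the variance-reduction flavor; this is not the paper's construction, which instead combines several empirical estimators derived from the sampler's transition structure. All function names, parameters, and defaults below are illustrative assumptions, not taken from the paper.

```python
import math
import random


def gibbs_sweep(state, n, beta, rng):
    """One deterministic sweep: update every site in raster order
    from its full conditional, given the current neighbors."""
    for i in range(n):
        for j in range(n):
            # Sum of the four nearest neighbors (periodic boundary).
            s = (state[(i - 1) % n][j] + state[(i + 1) % n][j]
                 + state[i][(j - 1) % n] + state[i][(j + 1) % n])
            # Full-conditional probability that the site is +1.
            p = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
            state[i][j] = 1 if rng.random() < p else -1


def estimate_magnetization(n=8, beta=0.3, sweeps=2000, burn_in=200, seed=0):
    """Standard empirical estimator (plain sweep average) of E|m|,
    plus a Rao-Blackwellized variant that replaces each sampled spin
    by its conditional mean tanh(beta * neighbor_sum).

    Returns (standard_estimate, rao_blackwell_estimate)."""
    rng = random.Random(seed)
    state = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
    std_total = 0.0
    rb_total = 0.0
    for t in range(sweeps + burn_in):
        gibbs_sweep(state, n, beta, rng)
        if t >= burn_in:
            # Standard empirical estimator: average the observed spins.
            m = sum(sum(row) for row in state) / (n * n)
            std_total += abs(m)
            # Conditional-mean version: integrate out each spin given
            # its neighbors (illustrative variance-reduction device).
            rb = 0.0
            for i in range(n):
                for j in range(n):
                    s = (state[(i - 1) % n][j] + state[(i + 1) % n][j]
                         + state[i][(j - 1) % n] + state[i][(j + 1) % n])
                    rb += math.tanh(beta * s)
            rb_total += abs(rb / (n * n))
    return std_total / sweeps, rb_total / sweeps
```

Both estimators use the same simulated path, so the extra cost of the conditional version is one pass over the lattice per sweep, echoing the abstract's point that the improved estimators come at negligible extra computational cost.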
Original language: English
Pages (from-to): 1433-1456
Journal: Annals of Statistics
Volume: 24
Issue number: 4
DOIs
Publication status: Published - Aug 1996
Externally published: Yes

Research Keywords

  • Asymptotic relative efficiency
  • Ising model
  • Markov chain Monte Carlo
  • Metropolis-Hastings algorithm
  • Parallel updating
  • Variance reduction
