Bayesian Inductive Inference, Maximum Entropy, & Neutron Scattering
by Devinder Singh Sivia
"The result of this experiment was inconclusive, so we had to use statistics."
Such oft-heard statements reflect the "cookbook" approach to statistics that we are
taught as undergraduates. Not satisfied with the maze of seemingly ad hoc statistical
tests, many of us are inclined to avoid the subject as much as possible.
Fortunately, statistics does not have to be like that! A more logical and unified
approach to the whole subject is provided by the probability formulations of Bayes
and Laplace. Bayes' ideas (published in 1763) were used very successfully by Laplace
(1812) but were then allegedly discredited and largely forgotten until they were
rediscovered by Jeffreys (1939). In more recent times they have been expounded by Jaynes
and others. Here we present an introductory glimpse of the Bayesian approach. We
then illustrate how Bayesian ideas, and developments such as the maximum entropy
method, are affecting data analysis and thoughts on instrument design at the Manuel
Lujan, Jr. Neutron Scattering Center (LANSCE).
DIRECT PROBABILITIES

[Fig. 1. What is the probability of getting r heads in 10 flips of a fair coin,
Pr(r heads | 10 flips of a fair coin)? Deductive logic tells ...]

Everyday games of chance are governed by deductive logic. For example, if we
are told that a fair coin is flipped ten times, we can deduce accurately the chances
that all ten flips produced heads, or that nine produced heads and one produced tails,
..., or that all ten flips produced tails (Fig. 1). Turning to neutron scattering, let's
suppose we know the scattering law for a particular sample and the geometry of
the diffractometer, the efficiencies of the detectors, and so on. Then we can predict
the chances of observing a certain number of neutron counts in any given detector.
These examples are in the realm of deductive logic, or pure
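The direct probability in the coin example is just the binomial distribution,
Pr(r heads | n flips) = C(n, r) p^r (1-p)^(n-r) with p = 1/2. As a minimal sketch
(the article itself gives no code; the function name `pr_heads` is ours), the
probabilities plotted in Fig. 1 could be computed as:

```python
from math import comb

def pr_heads(r, n=10, p=0.5):
    """Direct (deductive) probability of exactly r heads in n flips,
    given a coin whose probability of heads is p."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

# Tabulate the chances for r = 0..10 heads in ten flips of a fair coin
for r in range(11):
    print(f"Pr({r:2d} heads | 10 flips) = {pr_heads(r):.6f}")
```

For a fair coin this gives, e.g., 1/1024 for all heads and 252/1024 for an even
five-five split; the eleven probabilities sum to one, as any direct probability
distribution must.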