OPTIMIZATION METHODS FOR APPLICATIONS IN STATISTICS
James E. Gentle
George Mason University
© 2004 by James E. Gentle. All Rights Reserved.
Preface
Optimization of functions, that is, minimization or maximization, is ubiquitous
in statistical methods. Many methods of inference are built on the principle
of maximum likelihood, in which an assumed probability density function or
a probability function is optimized with respect to its parameters, given the
observed realizations of the random variable whose distribution is described by
the function. Other methods of inference involve fitting a model to observed
data in such a way that the deviations of the observations from the model
are minimized. Many important methods of statistics are used before any
inferences are made. In fact, ideally, statistical methods are considered before
data are collected, and the sampling method or the design of the experiment
is determined so as to maximize the value of the data for making inferences,
usually by minimizing the variance of estimators.
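The idea of maximum likelihood as an optimization problem can be sketched concretely. The following is a minimal illustration in pure Python (the data values and starting point are made up for the example): the rate of an exponential distribution is estimated by applying Newton's method to the log-likelihood l(λ) = n log λ − λ Σxᵢ, whose maximizer has the closed form λ̂ = n / Σxᵢ against which the iteration can be checked.

```python
# Illustrative sketch: maximum-likelihood estimation of the rate of an
# exponential distribution via Newton's method on the log-likelihood
#   l(lam) = n*log(lam) - lam*sum(x).
# Data and starting value are arbitrary for the example.
data = [0.5, 1.2, 0.3, 2.0, 0.9]
n, s = len(data), sum(data)

lam = 1.0                       # starting value
for _ in range(50):
    grad = n / lam - s          # l'(lam)
    hess = -n / lam ** 2        # l''(lam)
    lam -= grad / hess          # Newton update

print(lam)                      # converges to the closed-form MLE n / sum(x)
print(n / s)
```

The iteration reproduces the closed-form estimator, which is the point: even when no closed form exists, the same optimization machinery applies.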
The first three chapters of this book are introductory. Chapter 1 describes
common statistical techniques as optimization problems. Chapter 2, on computer
arithmetic, is somewhat detailed, and perhaps can be skimmed so long as its
main point is understood and remembered: computer arithmetic is different
from arithmetic on real numbers. Chapter 3 provides some basic definitions
and discusses some important properties of continuous functions that are used
in subsequent chapters.
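The main point of Chapter 2 can be seen in one line of Python: binary floating-point arithmetic cannot represent 0.1 exactly, so an identity that holds for real numbers fails on the computer.

```python
# Computer arithmetic differs from arithmetic on real numbers:
# 0.1 and 0.2 have no exact binary floating-point representation.
a = 0.1 + 0.2
print(a == 0.3)        # False
print(abs(a - 0.3))    # a tiny but nonzero rounding error
```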
Chapters 4 and 5 continue with the focus on continuous functions, generally
twice-differentiable functions. These functions occur in many common optimization
problems, and the ideas underlying the methods that address optimization of
these functions have more general applicability. Because optimization of dif-
ferentiable functions generally involves the solution of a system of equations,
Chapter 4 covers basic methods for solving a system of equations, or fin