Friday, August 23, 2002
Haipeng Guo
KDD Research Group
Department of Computing and Information Sciences
Kansas State University
Dynamic Bayesian Networks
KDD Group Seminar
Presentation Outline
Introduction to State-space Models
Dynamic Bayesian Networks (DBNs)
Representation
Inference
Learning
Summary
Reference
The Problem of Modeling Sequential Data
Sequential Data Modeling is important in many areas
Time series generated by a dynamic system
Time series modeling
A sequence generated by a one-dimensional spatial process
Bio-sequences
The Solutions
Classic approaches to time-series prediction
Linear models: ARIMA (auto-regressive integrated moving average), ARMAX (autoregressive moving average with exogenous variables)
Nonlinear models: neural networks, decision trees
Problems with classic approaches
prediction of the future is based on only a finite window
it’s difficult to incorporate prior knowledge 
difficulties with multi-dimensional inputs and/or outputs
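The finite-window limitation above can be made concrete with a minimal autoregressive predictor. This sketch fits an AR(2) model by least squares on synthetic data (the coefficients 0.6 and -0.2 and the noise level are illustrative assumptions, not from the talk); the forecast depends only on the last two observations.

```python
import numpy as np

# Hedged sketch: an AR(2) predictor fit by ordinary least squares on
# synthetic data, illustrating the "finite window" nature of classic
# linear time-series models.
rng = np.random.default_rng(0)
n, p = 200, 2
x = np.zeros(n)
for t in range(p, n):                        # simulate an AR(2) process
    x[t] = 0.6 * x[t-1] - 0.2 * x[t-2] + rng.normal(scale=0.1)

# Lagged design matrix: row i holds (x[i+1], x[i]) to predict x[i+2]
X = np.column_stack([x[p-1:-1], x[p-2:-2]])
y = x[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead forecast uses only the last p = 2 values
next_pred = coef @ np.array([x[-1], x[-2]])
print(coef, next_pred)
```

The key point is that, no matter how much history exists, the prediction is a function of a fixed window of p past values; state-space models instead summarize the entire history in a belief state.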
State-space models
Assume there is some underlying hidden state of the world (query) that generates the observations (evidence), and that this hidden state evolves in time, possibly as a function of our inputs
The belief state: our belief about the hidden state of the world given the observations up to the current time, y1:t, and our inputs to the system, u1:t: P(Xt | y1:t, u1:t)
Two common state-space models: Hidden Markov Models (HMMs) and Kalman Filter Models (KFMs)
A more general state-space model: Dynamic Bayesian Networks (DBNs)
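The belief state defined above is maintained recursively: predict with the transition model, then condition on the new observation by Bayes' rule. A minimal sketch for a discrete hidden state (the two-state transition and observation tables below are illustrative assumptions):

```python
import numpy as np

# Prior P(X1) over two hidden states
prior = np.array([0.5, 0.5])
# State-transition function P(Xt | Xt-1): row = previous state
transition = np.array([[0.9, 0.1],
                       [0.2, 0.8]])
# Observation function P(Yt | Xt): row = hidden state, col = symbol
observation = np.array([[0.8, 0.2],
                        [0.3, 0.7]])

def update_belief(belief, y):
    """One filtering step: predict with the transition model, then
    condition on the new observation y via Bayes' rule."""
    predicted = transition.T @ belief            # P(Xt | y1:t-1)
    unnormalized = observation[:, y] * predicted
    return unnormalized / unnormalized.sum()     # P(Xt | y1:t)

belief = prior
for y in [0, 0, 1]:           # a short illustrative observation sequence
    belief = update_belief(belief, y)
print(belief)                 # belief state after three observations
```

This is exactly the forward (filtering) recursion of an HMM; KFMs perform the same predict/update cycle in closed form on Gaussian densities.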
State-space Models: Representation
Any state-space model must define a prior P(X1), a state-transition function P(Xt | Xt-1), and an observation function P(Yt | Xt).
Assumptions:
Models are first-order Markov, i.e., P(Xt | X1:t-1) = P(Xt | Xt-1)
observations are conditionally first-order Markov: P(Yt | Xt, Yt-1) = P(Yt | Xt)
Time-invariant or homogeneous
Representations:
HMMs: Xt is a discrete random variable
KFMs: Xt is a vector of continuous random variables
DBNs: a more general and expressive language for representing state-space models
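For the continuous case, the same three ingredients (prior, transition, observation) become Gaussian densities, and filtering reduces to the Kalman predict/update equations. A sketch for a scalar state (all coefficients and measurements below are illustrative assumptions, not from the talk):

```python
# Hedged sketch of a scalar KFM:
#   prior        P(X1)        = N(mu, P)
#   transition   Xt = a*Xt-1 + w,  w ~ N(0, q)
#   observation  Yt = h*Xt  + v,  v ~ N(0, r)
a, q = 1.0, 0.1      # transition coefficient and process noise
h, r = 1.0, 0.5      # observation coefficient and measurement noise
mu, P = 0.0, 1.0     # prior mean and variance

for y in [1.2, 0.9, 1.1]:          # illustrative measurements
    # Predict: push the belief through the transition model
    mu_pred = a * mu
    P_pred = a * P * a + q
    # Update: condition on the new observation (Kalman gain)
    K = P_pred * h / (h * P_pred * h + r)
    mu = mu_pred + K * (y - h * mu_pred)
    P = (1 - K * h) * P_pred

print(mu, P)   # Gaussian belief state after three measurements
```

Because the Gaussian family is closed under these linear updates, the belief state stays a mean/variance pair; a DBN generalizes both cases by letting Xt be any set of discrete and/or continuous variables with factored conditional distributions.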