Lecture 10: Introduction to Probabilistic Graphical Models

Chapter 10: Introduction to Probabilistic Graphical Models
Weike Pan and Congfu Xu
{panweike, xucongfu}***@
Institute of Artificial Intelligence, College of Computer Science, Zhejiang University
October 12, 2006
Courseware of "Introduction to Artificial Intelligence", College of Computer Science, Zhejiang University

References
- An Introduction to Probabilistic Graphical Models. Michael I. Jordan. /~jordan/

Outline
- Preparations
  - PGM "is" a universal model
  - Different thoughts of machine learning
  - Different training approaches
  - Different data types
  - Bayesian Framework
  - Chain rule of probability theory
  - Conditional Independence
- Probabilistic Graphical Models (PGM)
- Directed PGM
- Undirected PGM
- Insights of PGM

Different thoughts of machine learning
- Statistics (modeling uncertainty, detailed information) vs. Logic (modeling complexity, high-level information)
- Unifying Logical and Statistical AI. Pedro Domingos, University of Washington. AAAI 2006.
- Speech: statistical information (acoustic model + language model + affect model ...) + high-level information (experts/logic)

Different training approaches
- Maximum Likelihood training
- MAP (Maximum a Posteriori) training
- Maximum Margin (SVM)
- Speech: a combination of Maximum Likelihood and Discriminative Training

Different data types
- Directed acyclic graph (Bayesian Networks, BN)
  - Models asymmetric effects and dependencies: causal/temporal dependence (e.g. speech analysis, DNA sequence analysis ...)
- Undirected graph (Markov Random Fields, MRF)
  - Models symmetric effects and dependencies: spatial dependence (e.g. image analysis ...)

PGM "is" a universal model
- Models both temporal and spatial data, by unifying
  - Thoughts: Statistics + Logic
  - Approaches: Maximum Likelihood training + Discriminative Training
- Furthermore, the directed and undirected models together provide modeling power beyond that which either could provide alone.

Bayesian Framework

    P(c_i | O) = P(O | c_i) P(c_i) / P(O)

- What we care is the co
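The chain rule and conditional independence listed under "Preparations" are exactly what a directed PGM exploits: the full chain rule P(x1,...,xn) = Π P(xi | x1,...,x(i-1)) collapses, under a first-order Markov assumption, to P(x1) Π P(xi | x(i-1)). A minimal sketch in Python; the transition numbers are made up for illustration and are not from the lecture:

```python
# Chain rule: P(x1,...,xn) = prod_i P(xi | x1,...,x_{i-1}).
# With the first-order Markov conditional-independence assumption,
# this factorizes as P(x1) * prod_i P(xi | x_{i-1}), which is what a
# directed PGM over a chain encodes. Probabilities are illustrative.

p_first = {0: 0.5, 1: 0.5}            # P(x1)
p_trans = {0: {0: 0.9, 1: 0.1},       # P(x_i = j | x_{i-1} = k)
           1: {0: 0.3, 1: 0.7}}

def joint(states):
    """Joint probability of a state sequence under the Markov factorization."""
    p = p_first[states[0]]
    for prev, cur in zip(states, states[1:]):
        p *= p_trans[prev][cur]
    return p

print(joint([0, 0, 1]))  # 0.5 * 0.9 * 0.1 = 0.045
```

The payoff of the conditional-independence assumption is parameter count: a chain of n binary variables needs O(n) local tables instead of the 2^n - 1 entries of the full joint.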
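The contrast between Maximum Likelihood and MAP training on the "Different training approaches" slide can be made concrete with the standard Bernoulli example: ML maximizes P(data | p), while MAP also weighs a prior P(p). The data counts and the Beta(2, 2) prior below are hypothetical illustrations, not from the lecture:

```python
# Maximum Likelihood vs. MAP for a Bernoulli parameter p:
#   ML:  argmax_p P(data | p)          -> heads / n
#   MAP: argmax_p P(data | p) * P(p)   -> with a Beta(a, b) prior,
#        (heads + a - 1) / (n + a + b - 2)
# Counts and prior hyperparameters are made up for illustration.

def ml_estimate(heads, n):
    return heads / n

def map_estimate(heads, n, a=2.0, b=2.0):
    return (heads + a - 1) / (n + a + b - 2)

heads, n = 9, 10
print(ml_estimate(heads, n))   # 0.9
print(map_estimate(heads, n))  # 10/12 ~ 0.833: the prior pulls toward 0.5
```

With little data the prior noticeably regularizes the estimate; as n grows, the ML and MAP estimates converge.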
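The Bayes rule on the "Bayesian Framework" slide, P(c_i | O) = P(O | c_i) P(c_i) / P(O), can be computed directly once the evidence P(O) is expanded as a sum over classes. A minimal numerical sketch; the class names, priors, and likelihoods are made-up illustrative values:

```python
# Bayes rule: P(c_i | O) = P(O | c_i) * P(c_i) / P(O),
# where P(O) = sum_j P(O | c_j) * P(c_j).
# Priors and likelihoods are hypothetical numbers for illustration.

priors = {"speech": 0.6, "noise": 0.4}       # P(c_i)
likelihoods = {"speech": 0.9, "noise": 0.2}  # P(O | c_i) for one observation O

evidence = sum(likelihoods[c] * priors[c] for c in priors)               # P(O)
posteriors = {c: likelihoods[c] * priors[c] / evidence for c in priors}  # P(c_i | O)

print(posteriors)  # posteriors sum to 1; "speech" gets 0.54 / 0.62
```

Note that for picking the most probable class, P(O) is a constant and can be dropped: argmax over c_i of P(O | c_i) P(c_i) suffices.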