Finite Mixture and Markov Switching Models (Springer Series in Statistics) (Hardcover)
Sylvia Frühwirth-Schnatter (Author)
Description from buecher.de:
The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models.
For more than a hundred years, since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools have emerged for dealing with these models, combining a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods.
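The "Bayesian approach with MCMC" idea can be made concrete with a minimal Gibbs sampler (data augmentation) for a two-component normal mixture. This is only a sketch under simplifying assumptions that are not from the book: unit component variances, a uniform prior on the weight, and an N(0, 10^2) prior on the means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from a two-component normal mixture (known truth for checking).
n = 500
z_true = rng.random(n) < 0.4
y = np.where(z_true, rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n))

def gibbs_two_normal_mixture(y, iters=2000, burn=500):
    """Data-augmentation Gibbs sampler: alternate between the latent
    allocations z, the mixture weight, and the component means."""
    n = len(y)
    mu = np.array([-1.0, 1.0])   # initial component means (illustrative)
    eta = 0.5                    # initial weight of component 0
    mu_draws = []
    for it in range(iters):
        # 1) Sample allocations z | mu, eta via Bayes' rule per observation.
        p0 = eta * np.exp(-0.5 * (y - mu[0]) ** 2)
        p1 = (1 - eta) * np.exp(-0.5 * (y - mu[1]) ** 2)
        z = rng.random(n) < p1 / (p0 + p1)   # z=True -> component 1
        # 2) Sample weight eta | z from its Beta posterior (uniform prior).
        n1 = z.sum(); n0 = n - n1
        eta = rng.beta(1 + n0, 1 + n1)
        # 3) Sample means mu | z, y from conjugate normal posteriors
        #    (N(0, 10^2) prior, unit observation variance).
        for k, mask in enumerate([~z, z]):
            prec = mask.sum() + 1 / 100.0
            mu[k] = rng.normal(y[mask].sum() / prec, 1 / np.sqrt(prec))
        if it >= burn:
            mu_draws.append(np.sort(mu))   # sort as a crude label-switching fix
    return np.array(mu_draws)

draws = gibbs_two_normal_mixture(y)
print(draws.mean(axis=0))   # posterior means, roughly near (-2, 2)
```

Sorting each draw of the means is only a crude device against the label switching problem; permutation MCMC sampling, which the book develops, is the more principled treatment.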
This is the first time that the Bayesian perspective on finite mixture modelling has been systematically presented in book form. It is argued that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, the author also reviews several frequentist techniques, especially for selecting the number of components of a finite mixture model, and discusses some of their shortcomings compared to the Bayesian approach.
The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community. This includes not only statisticians, but also biologists, economists, engineers, financial agents, market researchers, medical researchers, and other frequent users of statistical models. This book should help newcomers to the field understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they can be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading this book. The presentation is rather informal without abandoning mathematical correctness. Prior knowledge of Bayesian inference and Monte Carlo simulation is useful but not required.
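To make the Markov switching idea concrete, here is a small simulation of a basic two-state Gaussian Markov switching model together with the forward filtering recursion that yields filtered state probabilities. The transition matrix, state means, and noise level are illustrative assumptions, not values from the book.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a basic two-state Markov switching model: a hidden Markov chain
# selects the mean of a Gaussian observation at each time step.
P = np.array([[0.95, 0.05],     # transition matrix: row s -> next-state probs
              [0.10, 0.90]])
means = np.array([0.0, 3.0])    # state-dependent means (illustrative)
T = 1000
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])
y = means[states] + rng.normal(0.0, 1.0, T)

def filter_states(y, P, means, sigma=1.0):
    """Forward (filtering) recursion: predict the state with P, then
    correct with the Gaussian likelihood of each observation."""
    probs = np.full(2, 0.5)            # initial state distribution
    filtered = np.empty((len(y), 2))
    for t, yt in enumerate(y):
        pred = probs @ P                # one-step-ahead state prediction
        lik = np.exp(-0.5 * ((yt - means) / sigma) ** 2)
        post = pred * lik
        probs = post / post.sum()
        filtered[t] = probs
    return filtered

filtered = filter_states(y, P, means)
acc = ((filtered[:, 1] > 0.5) == (states == 1)).mean()
print(f"filtered-state accuracy: {acc:.2f}")
```

With states this persistent and means three standard deviations apart, the filter recovers the hidden regime at most time points; Chapters 10–11 of the book develop this machinery, including smoothing and estimation when the states and parameters are unknown.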
Editorial Reviews
From the reviews:
"At first glance, the numerous equations and formulas may seem daunting for psychologists with limited statistical background; however, the descriptions and explanations of the various models are actually quite reader-friendly (more so than many advanced statistical textbooks). The author has done an excellent job of inviting newcomers to enter the world of mixture models; more impressively, the author did so without sacrificing mathematical and statistical rigor. Mixture models are appealing in many applications in social and psychological studies. This book not only offers a gentle introduction to mixture models but also provides more in-depth coverage for those who look beyond the surface. I believe that psychologists who are interested in related models (e.g., latent class models, latent Markov models, and latent class regression models) will benefit greatly from this book. I highly recommend this book to all psychologists who are interested in mixture models." (Hsiu-Ting Yu, Psychometrika, Vol. 74 (3), 559–560, September 2009)
"The book is impressive in its mathematical and formal correctness, in generality and in details. … It would be helpful as an additional reference among a wider range of available textbooks in the area. [I]t will find many friends among experts and newcomers to the world of mixture models." (Atanu Biswas, Biometrics, Issue 63, September 2007)
"Finite mixture distributions are important for many models. Therefore they constitute a very active field of research. This book gives an up-to-date overview of the various models of this kind. … The aim of this book is to impart the finite mixture and Markov switching approach to statistical modeling to a wide-ranging community. … For frequentists, it offers a good opportunity to explore the advantages of the Bayesian approach in the context of mixture models." (Gheorghe Pitis, Zentralblatt MATH, Vol. 1108 (10), 2007)
"Readership: Statisticians, biologists, economists, engineers, financial agents, market researchers, medical researchers or any other frequent user of statistical models. The first nine chapters of the book are concerned with static mixture models, and the last four with Markov switching models. … especially valuable for students, serving to demonstrate how different statistical techniques, which superficially appear to be unrelated, are in fact part of an integrated whole. This book struck me as being particularly clearly written – it is a pleasure to read." (David J. Hand, International Statistical Review, Vol. 75 (2), 2007)
"The book is excellent, giving a most readable overview of the topic of finite mixtures, aimed at a broad readership … . Students will like the text because of its pedagogical style; researchers will definitely welcome the broad treatment of the subject. Both will benefit from the extensive and up-to-date bibliography … as well as the well-organized index. No doubt, this book is a valuable addition to the field of statistics and will surely find its rightful place in many a statistician's library." (Valerie Chavez-Demoulin, Journal of the American Statistical Association, Vol. 104 (485), March, 2009)
Product Description
WINNER OF THE 2007 DEGROOT PRIZE!
(The description text here duplicates the blurb above.)
Product Details: Hardcover, 492 pages. Publisher: Springer; 1st edition (August 8, 2006). Language: English. ISBN-10: —. ISBN-13: 978-
1 Finite Mixture Modeling 1
2 Statistical Inference for a Finite Mixture Model with Known Number of Components 25
3 Practical Bayesian Inference for a Finite Mixture Model with Known Number of Components 57
4 Statistical Inference for Finite Mixture Models Under Model Specification Uncertainty 99
5 Computational Tools for Bayesian Inference for Finite Mixture Models Under Model Specification Uncertainty 125
6 Finite Mixture Models with Normal Components 169
7 Data Analysis Based on Finite Mixtures 203
8 Finite Mixtures of Regression Models 241
9 Finite Mixture Models with Nonnormal Components 277
10 Finite Markov Mixture Modeling 301
11 Statistical Inference for Markov Switching Models 319
12 Nonlinear Time Series Analysis Based on Markov Switching Models 357
13 Switching State Space Models 389
Attachments (price: 1 forum coin):
Finite Mixture and Markov Switching Models (493.71 KB)
A manual describing the use of this package, V1.0
A manual describing the use of this package, V2.0
Striving for happiness!
Thank you very much!
Contents
1 Finite Mixture Modeling 1
1.1 Introduction 1
1.2 Finite Mixture Distributions 3
1.2.1 Basic Definitions 3
1.2.2 Some Descriptive Features of Finite Mixture Distributions 5
1.2.3 Diagnosing Similarity of Mixture Components 9
1.2.4 Moments of a Finite Mixture Distribution 10
1.2.5 Statistical Modeling Based on Finite Mixture Distributions 11
1.3 Identifiability of a Finite Mixture Distribution 14
1.3.1 Nonidentifiability Due to Invariance to Relabeling the Components 15
1.3.2 Nonidentifiability Due to Potential Overfitting 17
1.3.3 Formal Identifiability Constraints 19
1.3.4 Generic Identifiability 21
2 Statistical Inference for a Finite Mixture Model with Known Number of Components 25
2.1 Introduction 25
2.2 Classification for Known Component Parameters 26
2.2.1 Bayes' Rule for Classifying a Single Observation 26
2.2.2 The Bayes' Classifier for a Whole Data Set 27
2.3 Parameter Estimation for Known Allocation 29
2.3.1 The Complete-Data Likelihood Function 29
2.3.2 Complete-Data Maximum Likelihood Estimation 30
2.3.3 Complete-Data Bayesian Estimation of the Component Parameters 31
2.3.4 Complete-Data Bayesian Estimation of the Weights 35
2.4 Parameter Estimation When the Allocations Are Unknown 41
2.4.1 Method of Moments 42
2.4.2 The Mixture Likelihood Function 43
2.4.3 A Helicopter Tour of the Mixture Likelihood Surface for Two Examples 44
2.4.4 Maximum Likelihood Estimation 49
2.4.5 Bayesian Parameter Estimation 53
2.4.6 Distance-Based Methods 54
2.4.7 Comparing Various Estimation Methods 54
3 Practical Bayesian Inference for a Finite Mixture Model with Known Number of Components 57
3.1 Introduction 57
3.2 Choosing the Prior for the Parameters of a Mixture Model 58
3.2.1 Objective and Subjective Priors 58
3.2.2 Improper Priors May Cause Improper Mixture Posteriors 59
3.2.3 Conditionally Conjugate Priors 60
3.2.4 Hierarchical Priors and Partially Proper Priors 61
3.2.5 Other Priors 62
3.2.6 Invariant Prior Distributions 62
3.3 Some Properties of the Mixture Posterior Density 63
3.3.1 Invariance of the Posterior Distribution 63
3.3.2 Invariance of Seemingly Component-Specific Functionals 64
3.3.3 The Marginal Posterior Distribution of the Allocations 65
3.3.4 Invariance of the Posterior Distribution of the Allocations 67
3.4 Classification Without Parameter Estimation 68
3.4.1 Single-Move Gibbs Sampling 69
3.4.2 The Metropolis–Hastings Algorithm 72
3.5 Parameter Estimation Through Data Augmentation and MCMC 73
3.5.1 Treating Mixture Models as a Missing Data Problem 73
3.5.2 Data Augmentation and MCMC for a Mixture of Poisson Distributions 74
3.5.3 Data Augmentation and MCMC for General Mixtures 76
3.5.4 MCMC Sampling Under Improper Priors 78
3.5.5 Label Switching 78
3.5.6 Permutation MCMC Sampling 81
3.6 Other Monte Carlo Methods Useful for Mixture Models 83
3.6.1 A Metropolis–Hastings Algorithm for the Parameters 83
3.6.2 Importance Sampling for the Allocations 84
3.6.3 Perfect Sampling 85
3.7 Bayesian Inference for Finite Mixture Models Using Posterior Draws 85
3.7.1 Sampling Representations of the Mixture Posterior Density 85
3.7.2 Using Posterior Draws for Bayesian Inference 87
3.7.3 Predictive Density Estimation 89
3.7.4 Individual Parameter Inference 91
3.7.5 Inference on the Hyperparameter of a Hierarchical Prior 92
3.7.6 Inference on Component Parameters 92
3.7.7 Model Identification 94
4 Statistical Inference for Finite Mixture Models Under Model Specification Uncertainty 99
4.1 Introduction 99
4.2 Parameter Estimation Under Model Specification Uncertainty 100
4.2.1 Maximum Likelihood Estimation Under Model Specification Uncertainty 100
4.2.2 Practical Bayesian Parameter Estimation for Overfitting Finite Mixture Models 103
4.2.3 Potential Overfitting 105
4.3 Informal Methods for Identifying the Number of Components 107
4.3.1 Mode Hunting in the Mixture Posterior 108
4.3.2 Mode Hunting in the Sample Histogram 109
4.3.3 Diagnosing Mixtures Through the Method of Moments 110
4.3.4 Diagnosing Mixtures Through Predictive Methods 112
4.3.5 Further Approaches 114
4.4 Likelihood-Based Methods 114
4.4.1 The Likelihood Ratio Statistic 114
4.4.2 AIC, BIC, and the Schwarz Criterion 116
4.4.3 Further Approaches 117
4.5 Bayesian Inference Under Model Uncertainty 117
4.5.1 Trans-Dimensional Bayesian Inference 117
4.5.2 Marginal Likelihoods 118
4.5.3 Bayes Factors for Model Comparison 119
4.5.4 Formal Bayesian Model Selection 121
4.5.5 Choosing Priors for Model Selection 122
4.5.6 Further Approaches 123
5 Computational Tools for Bayesian Inference for Finite Mixture Models Under Model Specification Uncertainty 125
5.1 Introduction 125
5.2 Trans-Dimensional Markov Chain Monte Carlo Methods 125
5.2.1 Product-Space MCMC 126
5.2.2 Reversible Jump MCMC 129
5.2.3 Birth and Death MCMC Methods 137
5.3 Marginal Likelihoods for Finite Mixture Models 139
5.3.1 Defining the Marginal Likelihood 139
5.3.2 Choosing Priors for Selecting the Number of Components 141
5.3.3 Computation of the Marginal Likelihood for Mixture Models 143
5.4 Simulation-Based Approximations of the Marginal Likelihood 143
5.4.1 Some Background on Monte Carlo Integration 143
5.4.2 Sampling-Based Approximations for Mixture Models 144
5.4.3 Importance Sampling 146
5.4.4 Reciprocal Importance Sampling 147
5.4.5 Harmonic Mean Estimator 148
5.4.6 Bridge Sampling Technique 150
5.4.7 Comparison of Different Simulation-Based Estimators 154
5.4.8 Dealing with Hierarchical Priors 159
5.5 Approximations to the Marginal Likelihood Based on Density Ratios 159
5.5.1 The Posterior Density Ratio 159
5.5.2 Chib's Estimator 160
5.5.3 Laplace Approximation 164
5.6 Reversible Jump MCMC Versus Marginal Likelihoods? 165
6 Finite Mixture Models with Normal Components 169
6.1 Finite Mixtures of Normal Distributions 169
6.1.1 Model Formulation 169
6.1.2 Parameter Estimation for Mixtures of Normals 171
6.1.3 The Kiefer–Wolfowitz Example 174
6.1.4 Applications of Mixture of Normal Distributions 176
6.2 Bayesian Estimation of Univariate Mixtures of Normals 177
6.2.1 Bayesian Inference When the Allocations Are Known 177
6.2.2 Standard Prior Distributions 179
6.2.3 The Influence of the Prior on the Variance Ratio 179
6.2.4 Bayesian Estimation Using MCMC 180
6.2.5 MCMC Estimation Under Standard Improper Priors 182
6.2.6 Introducing Prior Dependence Among the Components 185
6.2.7 Further Sampling-Based Approaches 187
6.2.8 Application to the Fishery Data 188
6.3 Bayesian Estimation of Multivariate Mixtures of Normals 190
6.3.1 Bayesian Inference When the Allocations Are Known 190
6.3.2 Prior Distributions 192
6.3.3 Bayesian Parameter Estimation Using MCMC 193
6.3.4 Application to Fisher's Iris Data 195
6.4 Further Issues 195
6.4.1 Parsimonious Finite Normal Mixtures 195
6.4.2 Model Selection Problems for Mixtures of Normals 199
7 Data Analysis Based on Finite Mixtures 203
7.1 Model-Based Clustering 203
7.1.1 Some Background on Cluster Analysis 203
7.1.2 Model-Based Clustering Using Finite Mixture Models 204
7.1.3 The Classification Likelihood and the Bayesian MAP Approach 207
7.1.4 Choosing Clustering Criteria and the Number of Components 210
7.1.5 Model Choice for the Fishery Data 216
7.1.6 Model Choice for Fisher's Iris Data 218
7.1.7 Bayesian Clustering Based on Loss Functions 220
7.1.8 Clustering for Fisher's Iris Data 224
7.2 Outlier Modeling 224
7.2.1 Outlier Modeling Using Finite Mixtures 224
7.2.2 Bayesian Inference for Outlier Models Based on Finite Mixtures 225
7.2.3 Outlier Modeling of Darwin's Data 226
7.2.4 Clustering Under Outliers and Noise 227
7.3 Robust Finite Mixtures Based on the Student-t Distribution 230
7.3.1 Parameter Estimation 230
7.3.2 Dealing with Unknown Number of Components 233
7.4 Further Issues 233
7.4.1 Clustering High-Dimensional Data 233
7.4.2 Discriminant Analysis 235
7.4.3 Combining Classified and Unclassified Observations 236
7.4.4 Density Estimation Using Finite Mixtures 237
7.4.5 Finite Mixtures as an Auxiliary Computational Tool in Bayesian Analysis 238
8 Finite Mixtures of Regression Models 241
8.1 Introduction 241
8.2 Finite Mixture of Multiple Regression Models 242
8.2.1 Model Definition 242
8.2.2 Identifiability 243
8.2.3 Statistical Modeling Based on Finite Mixture of Regression Models 246
8.2.4 Outliers in a Regression Model 249
8.3 Statistical Inference for Finite Mixtures of Multiple Regression Models 249
8.3.1 Maximum Likelihood Estimation 249
8.3.2 Bayesian Inference When the Allocations Are Known 250
8.3.3 Choosing Prior Distributions 252
8.3.4 Bayesian Inference When the Allocations Are Unknown 253
8.3.5 Bayesian Inference Using Posterior Draws 254
8.3.6 Dealing with Model Specification Uncertainty 255
8.4 Mixed-Effects Finite Mixtures of Regression Models 256
8.4.1 Model Definition 256
8.4.2 Choosing Priors for Bayesian Estimation 256
8.4.3 Bayesian Parameter Estimation When the Allocations Are Known 257
8.4.4 Bayesian Parameter Estimation When the Allocations Are Unknown 258
8.5 Finite Mixture Models for Repeated Measurements 259
8.5.1 Pooling Information Across Similar Units 260
8.5.2 Finite Mixtures of Random-Effects Models 260
8.5.3 Choosing the Prior for Bayesian Estimation 265
8.5.4 Bayesian Parameter Estimation When the Allocations Are Known 265
8.5.5 Practical Bayesian Estimation Using MCMC 267
8.5.6 Dealing with Model Specification Uncertainty 269
8.5.7 Application to the Marketing Data 270
8.6 Further Issues 273
8.6.1 Regression Modeling Based on Multivariate Mixtures of Normals 273
8.6.2 Modeling the Weight Distribution 274
8.6.3 Mixtures-of-Experts Models 274
9 Finite Mixture Models with Nonnormal Components 277
9.1 Finite Mixtures of Exponential Distributions 277
9.1.1 Model Formulation and Parameter Estimation 277
9.1.2 Bayesian Inference 278
9.2 Finite Mixtures of Poisson Distributions 279
9.2.1 Model Formulation and Estimation 279
9.2.2 Capturing Overdispersion in Count Data 280
9.2.3 Modeling Excess Zeros 282
9.2.4 Application to the Eye Tracking Data 283
9.3 Finite Mixture Models for Binary and Categorical Data 286
9.3.1 Finite Mixtures of Binomial Distributions 286
9.3.2 Finite Mixtures of Multinomial Distributions 288
9.4 Finite Mixtures of Generalized Linear Models 289
9.4.1 Finite Mixture Regression Models for Count Data 290
9.4.2 Finite Mixtures of Logit and Probit Regression Models 292
9.4.3 Parameter Estimation for Finite Mixtures of GLMs 293
9.4.4 Model Selection for Finite Mixtures of GLMs 294
9.5 Finite Mixture Models for Multivariate Binary and Categorical Data 294
9.5.1 The Basic Latent Class Model 295
9.5.2 Identification and Parameter Estimation 296
9.5.3 Extensions of the Basic Latent Class Model 297
9.6 Further Issues 298
9.6.1 Finite Mixture Modeling of Mixed-Mode Data 298
9.6.2 Finite Mixtures of GLMs with Random Effects 299
10 Finite Markov Mixture Modeling 301
10.1 Introduction 301
10.2 Finite Markov Mixture Distributions 301
10.2.1 Basic Definitions 302
10.2.2 Irreducible Aperiodic Markov Chains 304
10.2.3 Moments of a Markov Mixture Distribution 308
10.2.4 The Autocorrelation Function of a Process Generated by a Markov Mixture Distribution 310
10.2.5 The Autocorrelation Function of the Squared Process 311
10.2.6 The Standard Finite Mixture Distribution as a Limiting Case 312
10.2.7 Identifiability of a Finite Markov Mixture Distribution 313
10.3 Statistical Modeling Based on Finite Markov Mixture Distributions 314
10.3.1 The Basic Markov Switching Model 314
10.3.2 The Markov Switching Regression Model 315
10.3.3 Nonergodic Markov Chains 316
10.3.4 Relaxing the Assumptions of the Basic Markov Switching Model 316
11 Statistical Inference for Markov Switching Models 319
11.1 Introduction 319
11.2 State Estimation for Known Parameters 319
11.2.1 Statistical Inference About the States 320
11.2.2 Filtered State Probabilities 320
11.2.3 Filtering for Special Cases 323
11.2.4 Smoothing the States 324
11.2.5 Filtering and Smoothing for More General Models 326
11.3 Parameter Estimation for Known States 327
11.3.1 The Complete-Data Likelihood Function 327
11.3.2 Complete-Data Bayesian Parameter Estimation 329
11.3.3 Complete-Data Bayesian Estimation of the Transition Matrix 329
11.4 Parameter Estimation When the States Are Unknown 330
11.4.1 The Markov Mixture Likelihood Function 330
11.4.2 Maximum Likelihood Estimation 333
11.4.3 Bayesian Estimation 334
11.4.4 Alternative Estimation Methods 334
11.5 Bayesian Parameter Estimation with Known Number of States 335
11.5.1 Choosing the Prior for the Parameters of a Markov Mixture Model 335
11.5.2 Some Properties of the Posterior Distribution of a Markov Switching Model 336
11.5.3 Parameter Estimation Through Data Augmentation and MCMC 337
11.5.4 Permutation MCMC Sampling 340
11.5.5 Sampling the Unknown Transition Matrix 340
11.5.6 Sampling Posterior Paths of the Hidden Markov Chain 342
11.5.7 Other Sampling-Based Approaches 345
11.5.8 Bayesian Inference Using Posterior Draws 345
11.6 Statistical Inference Under Model Specification Uncertainty 346
11.6.1 Diagnosing Markov Switching Models 346
11.6.2 Likelihood-Based Methods 346
11.6.3 Marginal Likelihoods for Markov Switching Models 347
11.6.4 Model Space MCMC 348
11.6.5 Further Issues 348
11.7 Modeling Overdispersion and Autocorrelation in Time Series of Count Data 348
11.7.1 Motivating Example 348
11.7.2 Capturing Overdispersion and Autocorrelation Using Poisson Markov Mixture Models 349
11.7.3 Application to the Lamb Data 351
12 Nonlinear Time Series Analysis Based on Markov Switching Models 357
12.1 Introduction 357
12.2 The Markov Switching Autoregressive Model 358
12.2.1 Motivating Example 358
12.2.2 Model Definition 360
12.2.3 Features of the MSAR Model 362
12.2.4 Markov Switching Models for Nonstationary Time Series 363
12.2.5 Parameter Estimation and Model Selection 365
12.2.6 Application to Business Cycle Analysis of the U.S. GDP Data 365
12.3 Markov Switching Dynamic Regression Models 371
12.3.1 Model Definition 371
12.3.2 Bayesian Estimation 371
12.4 Prediction of Time Series Based on Markov Switching Models 372
12.4.1 Flexible Predictive Distributions 372
12.4.2 Forecasting of Markov Switching Models via Sampling-Based Methods 374
12.5 Markov Switching Conditional Heteroscedasticity 375
12.5.1 Motivating Example 375
12.5.2 Capturing Features of Financial Time Series Through Markov Switching Models 377
12.5.3 Switching ARCH Models 378
12.5.4 Statistical Inference for Switching ARCH Models 380
12.5.5 Switching GARCH Models 383
12.6 Some Extensions 384
12.6.1 Time-Varying Transition Matrices 384
12.6.2 Markov Switching Models for Longitudinal and Panel Data 385
12.6.3 Markov Switching Models for Multivariate Time Series 386
13 Switching State Space Models 389
13.1 State Space Modeling 389
13.1.1 The Local Level Model with and Without Switching 389
13.1.2 The Linear Gaussian State Space Form 391
13.1.3 Multiprocess Models 393
13.1.4 Switching Linear Gaussian State Space Models 393
13.1.5 The General State Space Form 394
13.2 Nonlinear Time Series Analysis Based on Switching State Space Models 396
13.2.1 ARMA Models with and Without Switching 396
13.2.2 Unobserved Component Time Series Models 397
13.2.3 Capturing Sudden Changes in Time Series 398
13.2.4 Switching Dynamic Factor Models 400
13.2.5 Switching State Space Models as a Semi-Parametric Smoothing Device 401
13.3 Filtering for Switching Linear Gaussian State Space Models 401
13.3.1 The Filtering Problem 402
13.3.2 Bayesian Inference for a General Linear Regression Model 402
13.3.3 Filtering for the Linear Gaussian State Space Model 404
13.3.4 Filtering for Multiprocess Models 406
13.3.5 Approximate Filtering for Switching Linear Gaussian State Space Models 406
13.4 Parameter Estimation for Switching State Space Models 410
13.4.1 The Likelihood Function of a State Space Model 411
13.4.2 Maximum Likelihood Estimation 412
13.4.3 Bayesian Inference 412
13.5 Practical Bayesian Estimation Using MCMC 415
13.5.1 Various Data Augmentation Schemes 416
13.5.2 Sampling the Continuous State Process from the Smoother Density 417
13.5.3 Sampling the Discrete States for a Switching State Space Model 420
13.6 Further Issues 421
13.6.1 Model Specification Uncertainty in Switching State Space Modeling 421
13.6.2 Auxiliary Mixture Sampling for Nonlinear and Nonnormal State Space Models 422
13.7 Illustrative Application to Modeling Exchange Rate Data 423
A Appendix 431
A.1 Summary of Probability Distributions 431
A.1.1 The Beta Distribution 431
A.1.2 The Binomial Distribution 432
A.1.3 The Dirichlet Distribution 432
A.1.4 The Exponential Distribution 433
A.1.5 The F-Distribution 433
A.1.6 The Gamma Distribution 434
A.1.7 The Geometric Distribution 435
A.1.8 The Multinomial Distribution 435
A.1.9 The Negative Binomial Distribution 435
A.1.10 The Normal Distribution 436
A.1.11 The Poisson Distribution 437
A.1.12 The Student-t Distribution 437
A.1.13 The Uniform Distribution 438
A.1.14 The Wishart Distribution 438
A.2 Software 439
References 441
Index 481