Normal Mixture
Let us study unsupervised learning using a normal mixture. A normal mixture is defined as a finite mixture of normal distributions.
An unknown data-generating distribution is estimated by a normal mixture.
We examine a normal mixture together with its prior distribution.
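The equations defining the model are not reproduced here; the following is a standard formulation of a $K$-component normal mixture, and the particular prior family (Dirichlet weights, normal means, gamma precisions) is an illustrative assumption, not necessarily the one used in the examples below:

$$
p(x \mid w, \mu, \sigma) \;=\; \sum_{k=1}^{K} w_k \,\mathcal{N}(x \mid \mu_k, \sigma_k^2),
\qquad w_k \ge 0,\; \sum_{k=1}^{K} w_k = 1,
$$

$$
\varphi(w, \mu, \sigma) \;=\; \mathrm{Dirichlet}(w \mid \alpha)\,
\prod_{k=1}^{K} \mathcal{N}(\mu_k \mid 0, \tau^2)\,
\mathrm{Gamma}(\sigma_k^{-2} \mid a, b).
$$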
This is an example of Bayesian estimation.
The data-generating distribution is a normal mixture with 3 components, whereas the learning machine is a normal mixture with 2 components.
The sample size increases gradually, and the generalization error, LOOCV, and WAIC are compared.
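A minimal sketch of this kind of comparison, assuming illustrative mixture parameters. Data are drawn from a 3-component normal mixture; to keep the example self-contained, the posterior draws are stand-ins obtained by jittering the true parameters (in the actual experiment they would come from MCMC over the mixture posterior). WAIC and importance-sampling LOOCV are then computed from the matrix of pointwise log-likelihoods:

```python
import numpy as np

rng = np.random.default_rng(0)

# Data-generating distribution: 3-component normal mixture
# (parameters here are illustrative assumptions).
w_true = np.array([0.3, 0.4, 0.3])
mu_true = np.array([-3.0, 0.0, 3.0])
sd_true = np.array([1.0, 1.0, 1.0])

n = 200
comp = rng.choice(3, size=n, p=w_true)
x = rng.normal(mu_true[comp], sd_true[comp])

# Stand-in "posterior draws": true parameters plus noise.
# In practice these come from posterior sampling (e.g. Gibbs/HMC).
S = 500
draws = [(w_true,
          mu_true + 0.1 * rng.normal(size=3),
          sd_true * np.exp(0.05 * rng.normal(size=3)))
         for _ in range(S)]

def log_lik(x, w, mu, sd):
    # log p(x_i | theta) for each data point, log-sum-exp over components
    z = (x[:, None] - mu) / sd
    logpdf = -0.5 * z**2 - np.log(sd * np.sqrt(2 * np.pi)) + np.log(w)
    m = logpdf.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(logpdf - m).sum(axis=1, keepdims=True)))[:, 0]

L = np.stack([log_lik(x, *d) for d in draws])   # shape (S, n)

# WAIC = Bayes training loss + functional variance
m = L.max(axis=0)
train_loss = -np.mean(m + np.log(np.mean(np.exp(L - m), axis=0)))
waic = train_loss + np.mean(L.var(axis=0, ddof=1))

# Importance-sampling LOOCV:
# mean over i of log posterior-mean of 1 / p(x_i | theta)
mm = (-L).max(axis=0)
loocv = np.mean(mm + np.log(np.mean(np.exp(-L - mm), axis=0)))

print(waic, loocv)
```

Both quantities estimate the generalization loss; in the experiments on this page they are compared against the true generalization error as the sample size grows.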
This is an example of Bayesian estimation.
The data-generating distribution is a normal mixture with 3 components, and the learning machine is also a normal mixture with 3 components.
The sample size increases gradually, and the generalization error, LOOCV, and WAIC are compared.
This is an example of Bayesian estimation.
The data-generating distribution is a normal mixture with 3 components, whereas the learning machine is a normal mixture with 5 components.
The sample size increases gradually, and the generalization error, LOOCV, and WAIC are compared.