See the notebooks in the [`8_Trees_Boosting`](https://gricad-gitlab.univ-grenobl
2. Examples of regression trees [`N2_a_Regression_tree.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/N2_a_Regression_tree.ipynb) and cost complexity pruning methods [`N2_b_Cost_Complexity_Pruning_Regressor.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/N2_b_Cost_Complexity_Pruning_Regressor.ipynb)
3. The next 3 notebooks illustrate the concept of bagging through the application of random forests. [`N3_a_Random_Forest_Regression.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/N3_a_Random_Forest_Regression.ipynb) must be completed; the next two notebooks (N3_b and N3_c) are optional and describe some applications on real data.
4. Implement your own version of EM for a Gaussian mixture model and apply it to the same example used for K-means in a preceding notebook. Compare with K-means and interpret the results.
5. Example of EM applied to the Iris data set.
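As a hedged illustration of the cost-complexity pruning idea from item 2 (not the notebook's own code): growing a regression tree and then pruning it with increasing `ccp_alpha` yields a nested sequence of ever-smaller subtrees. The synthetic data and seed below are illustrative assumptions.

```python
# Illustrative sketch of cost-complexity pruning with scikit-learn.
# Data, noise level, and seed are assumptions, not the course's setup.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.2, size=200)

# Path of effective alphas: larger ccp_alpha => more aggressive pruning.
path = DecisionTreeRegressor(random_state=0).cost_complexity_pruning_path(X, y)
sizes = [
    DecisionTreeRegressor(random_state=0, ccp_alpha=a).fit(X, y).tree_.node_count
    for a in path.ccp_alphas[::10]  # subsample the alphas for speed
]
print(sizes)  # node counts shrink as alpha grows
```

The node counts form a non-increasing sequence, reflecting that the pruned subtrees are nested.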
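As a hedged sketch of the bagging idea from item 3 (not the notebooks' code): averaging many decorrelated regression trees reduces variance compared to a single fully grown tree, which fits the noise. The data-generating process and seeds below are illustrative assumptions.

```python
# Illustrative comparison: one deep regression tree vs. a bagged ensemble
# (random forest). Data and hyperparameters are assumptions for the sketch.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(300, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeRegressor(random_state=0).fit(X_tr, y_tr)
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)

mse_tree = mean_squared_error(y_te, tree.predict(X_te))
mse_forest = mean_squared_error(y_te, forest.predict(X_te))
print(f"single tree MSE:   {mse_tree:.3f}")
print(f"random forest MSE: {mse_forest:.3f}")
```

On this noisy problem the averaged ensemble has a lower test error than the single unpruned tree, which is the variance-reduction effect of bagging.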
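For item 4, a minimal NumPy sketch of EM for a spherical Gaussian mixture may help as a starting point; it is an assumption-laden illustration (spherical covariances, deterministic initialization along the first coordinate), not the expected solution.

```python
import numpy as np

def em_gmm(X, k, n_iter=100):
    """Minimal EM for a spherical Gaussian mixture (illustrative sketch)."""
    n, d = X.shape
    # Deterministic init (an assumption): spread means along the first coordinate.
    mu = X[np.argsort(X[:, 0])[np.linspace(0, n - 1, k).astype(int)]]
    var = np.full(k, X.var())      # one scalar variance per component
    pi = np.full(k, 1.0 / k)       # mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] proportional to pi_j * N(x_i | mu_j, var_j I)
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)             # (n, k)
        log_r = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - sq / (2 * var)
        log_r -= log_r.max(axis=1, keepdims=True)                  # numerical stability
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * sq).sum(axis=0) / (d * nk)
    return pi, mu, var

# Demo on two well-separated spherical clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(6, 1, (200, 2))])
pi, mu, var = em_gmm(X, 2)
print(np.sort(mu.mean(axis=1)))  # component means close to 0 and 6
```

Unlike K-means' hard assignments, the responsibilities `r` are soft: each point contributes fractionally to every component, which is the key difference to interpret when comparing the two.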
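For item 5, a hedged sketch of what fitting a Gaussian mixture to the Iris data can look like with scikit-learn (the actual notebook may proceed differently):

```python
# Illustrative only: 3-component Gaussian mixture on the Iris features,
# compared to the true species labels via the adjusted Rand index.
from sklearn.datasets import load_iris
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

X, y = load_iris(return_X_y=True)
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
labels = gmm.predict(X)
print("converged:", gmm.converged_)
print("agreement with species (ARI): %.2f" % adjusted_rand_score(y, labels))
```

The full-covariance mixture recovers clusters that align well with the three species, noticeably better than a spherical model such as K-means on this data.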