Commit fa3ce8e0 authored by Florent Chatelain's avatar Florent Chatelain

Merge branch 'master' of gricad-gitlab.univ-grenoble-alpes.fr:chatelaf/ml-sicom3a

parents 7c8b465c ab16e392
@@ -2,12 +2,22 @@
## `News`
- For interested students, [here](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/tree/master/notebooks/X_deep_learning) are two demo/tutorial notebooks on deep learning for image classification (convolutional neural networks) with the TensorFlow 2.x platform and the Keras API.
- For students who have to stay at home for health reasons, and *only those who can't attend the face-to-face course*, there is a Zoom link (see the [chamilo page](https://chamilo.grenoble-inp.fr/courses/PHELMA5PMSAST6/index.php?) of the course) to join the class by videoconference every Monday from 15:45 to 17:45.
##### Homework for **Monday, October 19**
- Finish reading/understanding the notebooks you didn't cover in the previous lab session.
- **read the lesson** ([slides](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/slides/7_clustering.pdf)) on clustering (unsupervised classification): **read up to kernel K-means on slide 34**
- prepare your questions for the course/lab session!
##### Lab7 instructions (Friday, October 23)
- Lab7 statement on decision trees and random forests is [here](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/labs/lab7_statement.md)
- Upload **at the end of the session** your lab 7 *short report* in the [chamilo assignment task](https://chamilo.grenoble-inp.fr/main/work/work_list.php?cidReq=PHELMA5PMSAST6&id_session=0&gidReq=0&gradebook=0&origin=&id=123903) (pdf file from your editor, or scanned pdf file of a handwritten paper; code, figures or graphics are not required)
##### ~~Lab6 instructions (Monday, October 19)~~
- ~~Lab6 statement on clustering is [here](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/labs/lab6_statement.md)~~
- ~~Upload **at the end of the session** your lab 6 *short report* in the [chamilo assignment task](https://chamilo.grenoble-inp.fr/main/work/work_list.php?cidReq=PHELMA5PMSAST6&id_session=0&gidReq=0&gradebook=0&origin=&id=123902) (pdf file from your editor, or scanned pdf file of a handwritten paper; code, figures or graphics are not required)~~
##### ~~Homework for **Monday, October 19**~~
- ~~Finish reading/understanding the notebooks you didn't cover in the previous lab session.~~
- ~~**read the lesson** ([slides](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/slides/7_clustering.pdf)) on clustering (unsupervised classification): **read up to kernel K-means on slide 34**~~
- ~~prepare your questions for the course/lab session!~~
##### ~~Lab5 instructions~~
# Lab 6 statement
The objective of this lab is to illustrate basic concepts of clustering.
_Note: For each notebook, read the cells and run the code, then follow the instructions/questions in the `questions` or `Exercise` cells._
See the notebooks in the [`7_Clustering`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/7_Clustering/) folder:
1. Implement your own version of K-means in 1-D with the Euclidean distance [`N1_Kmeans_basic.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/7_Clustering/N1_Kmeans_basic.ipynb) and compare your results with the scikit-learn K-means implementation.
2. Apply the K-means algorithm (scikit-learn implementation) to the classical Iris dataset [`N2_KMeans_iris_data_example.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/7_Clustering/N2_KMeans_iris_data_example.ipynb)
3. Implement a kernelized version of K-means and test the importance of an adequate parametrization and choice of initial conditions
[`N3_Kernel_Kmeans_example.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/7_Clustering/N3_Kernel_Kmeans_example.ipynb)
4. Implement your own version of EM for a Gaussian mixture model and apply it to the same example used for K-means in a preceding notebook. Compare with K-means and interpret the results [`N4_EM_basic.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/7_Clustering/N4_EM_basic.ipynb)
5. Example of an EM application on the Iris dataset [`N5_EM_iris_data_example.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/7_Clustering/N5_EM_iris_data_example.ipynb)
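The notebooks themselves are not reproduced on this page. As a rough sketch of what item 1 asks for — a from-scratch 1-D K-means (Lloyd's algorithm) checked against scikit-learn — the following may help; the toy data and parameters are illustrative choices, not those of the notebook:

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_1d(x, k, n_iter=100, seed=0):
    """Lloyd's algorithm in 1-D with the Euclidean distance."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)  # initialize on data points
    for _ in range(n_iter):
        # assignment step: each point goes to its nearest center
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # update step: each center moves to the mean of its cluster
        new_centers = np.array([x[labels == j].mean() if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):  # converged
            break
        centers = new_centers
    return centers, labels

# two well-separated 1-D Gaussian blobs (illustrative data)
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 0.5, 100), rng.normal(5.0, 0.5, 100)])
centers, labels = kmeans_1d(x, k=2)

# sanity check against scikit-learn on the same data
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(x.reshape(-1, 1))
print(np.sort(centers), np.sort(km.cluster_centers_.ravel()))
```

With well-separated clusters, both implementations should recover centers near 0 and 5; on harder data, results depend on the initialization, which the notebook invites you to explore.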
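For item 3, a possible sketch of kernel K-means: points are assigned by their squared distance to cluster means in the feature space, computed from the Gram matrix alone. The two-circles toy data and the RBF bandwidth `gamma` are assumptions, not taken from the notebook; the outcome is quite sensitive to `gamma` and to the random initial labels, which is precisely what the notebook asks you to test:

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.metrics.pairwise import rbf_kernel

def kernel_kmeans(K, k, n_iter=50, seed=0):
    """Kernel K-means from a precomputed Gram matrix K."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=n)  # random initial assignment
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            mask = labels == c
            m = mask.sum()
            if m == 0:
                continue  # empty cluster: leave distance at +inf
            # ||phi(x_i) - mu_c||^2 = K_ii - (2/m) sum_j K_ij + (1/m^2) sum_jl K_jl
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, mask].sum(axis=1) / m
                          + K[np.ix_(mask, mask)].sum() / m**2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):  # converged
            break
        labels = new_labels
    return labels

X, y = make_circles(n_samples=300, factor=0.3, noise=0.05, random_state=0)
K = rbf_kernel(X, X, gamma=10.0)  # bandwidth choice matters a lot
labels = kernel_kmeans(K, k=2)
```

Try several values of `gamma` and several seeds: plain K-means cannot separate concentric circles, while kernel K-means can, but only under an adequate parametrization.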
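Item 4's EM recursion for a one-dimensional Gaussian mixture can be sketched as follows; the quantile-based initialization and the toy data are illustrative choices, not the notebook's:

```python
import numpy as np
from scipy.stats import norm

def em_gmm_1d(x, k=2, n_iter=200):
    """EM for a 1-D Gaussian mixture: returns means, variances, weights."""
    # deterministic initialization: means at spread-out quantiles of the data
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibilities, shape (n, k)
        dens = pi * norm.pdf(x[:, None], mu, np.sqrt(var))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, var, pi

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(6.0, 1.0, 300)])
mu, var, pi = em_gmm_1d(x)
```

Unlike K-means, EM returns soft assignments (the responsibilities) and estimates the cluster variances and weights, which is the basis of the comparison the notebook asks for.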
# Lab 7 statement
The objective of this lab is to illustrate tree-based classification and regression methods. Random forests are introduced as a natural bagging method.
_Note: For each notebook, read the cells and run the code, then follow the instructions/questions in the `questions` or `Exercise` cells._
See the notebooks in the [`8_Trees_Boosting`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/) folder:
1. First steps with classification trees [`N1_Classif_tree.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/N1_Classif_tree.ipynb)
2. Examples of regression trees [`N2_a_Regression_tree.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/N2_a_Regression_tree.ipynb) and cost-complexity pruning methods [`N2_b_Cost_Complexity_Pruning_Regressor.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/N2_b_Cost_Complexity_Pruning_Regressor.ipynb)
3. The notebooks below illustrate the concept of bagging through the application of random forests
[`N2_a_Regression_tree.ipynb`](https://gricad-gitlab.univ-grenoble-alpes.fr/chatelaf/ml-sicom3a/-/blob/master/notebooks/8_Trees_Boosting/N2_a_Regression_tree.ipynb)
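Cost-complexity pruning (item 2) selects a subtree by penalizing tree size with a parameter `ccp_alpha`. A minimal sketch using scikit-learn's pruning path, on a dataset chosen here for illustration only:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# the fully grown tree overfits; compute the pruning path of candidate alphas
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)

# fit one pruned tree per alpha and score it on held-out data
scores = []
for a in path.ccp_alphas[:-1]:  # the last alpha prunes down to the root
    tree = DecisionTreeClassifier(random_state=0, ccp_alpha=a).fit(X_tr, y_tr)
    scores.append(tree.score(X_te, y_te))
best_alpha = path.ccp_alphas[int(np.argmax(scores))]
print(best_alpha, max(scores))
```

In practice one would select `ccp_alpha` by cross-validation rather than on a single test split; this sketch only shows the mechanics of the path.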
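The bagging idea behind random forests (item 3) can be sketched by comparing a single decision tree with a forest of trees fit on bootstrap samples; the out-of-bag score gives a free generalization estimate. The dataset is again an illustrative choice, not the notebooks':

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# one fully grown tree: low bias, high variance
single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# bagging many decorrelated trees reduces the variance
forest = RandomForestClassifier(n_estimators=200, oob_score=True,
                                random_state=0).fit(X_tr, y_tr)
print(single.score(X_te, y_te), forest.score(X_te, y_te), forest.oob_score_)
```

Averaging over bootstrap replicates (plus random feature subsampling) is what makes the forest more stable than any individual tree.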