Commit 24266241 authored by Jean-Luc Parouty

Update README for sequence 6

parent bb7aa4e4
%% Cell type:code id:regional-acrobat tags:
``` python
from IPython.display import display,Markdown
display(Markdown(open('README.md', 'r').read()))
#
# This README is visible under Jupyter Lab! :-)
```
%% Output
<a name="top"></a>
[<img width="600px" src="fidle/img/00-Fidle-titre-01.svg"></img>](#top)
<!-- --------------------------------------------------- -->
<!-- To correctly view this README under Jupyter Lab -->
<!-- Open the notebook: README.ipynb! -->
<!-- --------------------------------------------------- -->
## Fidle à distance sessions (NEW!)
Since we cannot organize in-person sessions,\
we are offering a remote version, **Fidle à distance** :-)
**- Watch or re-watch online -**
https://www.youtube.com/channel/UC4Sukzudhbwr6fs10cXrJsQ
**- Next session -**
|**Thursday 25 March, 2 pm, Sequence 6:<br>From the autoencoder (AE) to the Variational Autoencoder (VAE),<br>or how to explore the imagination of our networks.**|
|--|
|Principles and architecture of a Variational Autoencoder (VAE).<br>Issues involved in handling "large" datasets.<br>Gaussian projection - Data generation - Morphing in the latent space.<br>Advanced programming with Keras - Clustered datasets.<br>Proposed example:<br>Implementation of a VAE, data generation and morphing in the latent space.<br>![VAE](./fidle/img/VAE.jpg)|
|Duration: 2h - The streaming details will be [posted the day before in the wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Fidle%20%C3%A0%20distance/En%20bref) |
About **[Fidle à distance](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Fidle%20%C3%A0%20distance/En%20bref)**\
See the [program](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Fidle%20%C3%A0%20distance/Pr%C3%A9sentation#programme-)\
Watch or re-watch the [videos](https://www.youtube.com/channel/UC4Sukzudhbwr6fs10cXrJsQ)
## About Fidle
This repository contains all the documents and links for the **Fidle Training**.
Fidle (for Formation Introduction au Deep Learning) is a 2-day training session
co-organized by the Formation Permanente CNRS and the Resinfo/SARI and DevLOG CNRS networks.
The objectives of this training are:
- Understanding the **bases of Deep Learning** neural networks
- Developing a **first experience** through simple and representative examples
- Understanding **Tensorflow/Keras** and **Jupyter lab** technologies
- Getting to know the **academic computing environments**, Tier-2 or Tier-1, with powerful GPUs
For more information, you can contact the [Fidle team](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Fidle%20team) at:
[<img width="200px" style="vertical-align:middle" src="fidle/img/00-Mail_contact.svg"></img>](#top)\
Don't forget to look at the [Wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/home)
Current version: <!-- VERSION_BEGIN -->
**2.0.20**
<!-- VERSION_END -->
## Course materials
| | | |
|:--:|:--:|:--:|
| **[<img width="50px" src="fidle/img/00-Fidle-pdf.svg"></img><br>Course slides](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>The course in pdf format<br>(12 MB)| **[<img width="50px" src="fidle/img/00-Notebooks.svg"></img><br>Notebooks](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/archive/master/fidle-master.zip)**<br> &nbsp;&nbsp;&nbsp;&nbsp;Get a Zip or clone this repository &nbsp;&nbsp;&nbsp;&nbsp;<br>(40 MB)| **[<img width="50px" src="fidle/img/00-Datasets-tar.svg"></img><br>Datasets](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>All the needed datasets<br>(1.2 GB)|
Have a look at **[How to get and install](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Using%20Fidle/install%20fidle)** these notebooks and datasets.
## Jupyter notebooks
<!-- INDEX_BEGIN -->
### Linear and logistic regression
- **[LINR1](LinearReg/01-Linear-Regression.ipynb)** - [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb)
Low-level implementation, using numpy, of a direct resolution for a linear regression
- **[GRAD1](LinearReg/02-Gradient-descent.ipynb)** - [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb)
Low-level implementation of a solution by gradient descent. Basic and stochastic approaches.
- **[POLR1](LinearReg/03-Polynomial-Regression.ipynb)** - [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb)
Illustration of the problem of complexity with polynomial regression
- **[LOGR1](LinearReg/04-Logistic-Regression.ipynb)** - [Logistic regression](LinearReg/04-Logistic-Regression.ipynb)
Simple example of logistic regression with a sklearn solution
### Perceptron Model 1957
- **[PER57](IRIS/01-Simple-Perceptron.ipynb)** - [Perceptron Model 1957](IRIS/01-Simple-Perceptron.ipynb)
Example of use of a Perceptron, with sklearn and the IRIS dataset of 1936!
### Basic regression using DNN
- **[BHPD1](BHPD/01-DNN-Regression.ipynb)** - [Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb)
Simple example of a regression with the Boston Housing Prices Dataset (BHPD)
- **[BHPD2](BHPD/02-DNN-Regression-Premium.ipynb)** - [Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb)
A more advanced implementation of the previous example
### Basic classification using a DNN
- **[MNIST1](MNIST/01-DNN-MNIST.ipynb)** - [Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb)
An example of classification using a dense neural network for the famous MNIST dataset
- **[MNIST2](MNIST/02-CNN-MNIST.ipynb)** - [Simple classification with CNN](MNIST/02-CNN-MNIST.ipynb)
An example of classification using a convolutional neural network for the famous MNIST dataset
### Image classification with Convolutional Neural Networks (CNN)
- **[GTSRB1](GTSRB/01-Preparation-of-data.ipynb)** - [Dataset analysis and preparation](GTSRB/01-Preparation-of-data.ipynb)
Episode 1: Analysis of the GTSRB dataset and creation of an enhanced dataset
- **[GTSRB2](GTSRB/02-First-convolutions.ipynb)** - [First convolutions](GTSRB/02-First-convolutions.ipynb)
Episode 2: First convolutions and first classification of our traffic signs
- **[GTSRB3](GTSRB/03-Tracking-and-visualizing.ipynb)** - [Training monitoring](GTSRB/03-Tracking-and-visualizing.ipynb)
Episode 3: Monitoring, analysis and checkpoints during a training session
- **[GTSRB4](GTSRB/04-Data-augmentation.ipynb)** - [Data augmentation](GTSRB/04-Data-augmentation.ipynb)
Episode 4: Adding data by data augmentation when we lack it, to improve our results
- **[GTSRB5](GTSRB/05-Full-convolutions.ipynb)** - [Full convolutions](GTSRB/05-Full-convolutions.ipynb)
Episode 5: A lot of models, a lot of datasets and a lot of results.
- **[GTSRB6](GTSRB/06-Notebook-as-a-batch.ipynb)** - [Full convolutions as a batch](GTSRB/06-Notebook-as-a-batch.ipynb)
Episode 6: To compute bigger, use your notebook in batch mode
- **[GTSRB7](GTSRB/07-Show-report.ipynb)** - [Batch reports](GTSRB/07-Show-report.ipynb)
Episode 7: Displaying our jobs report, and the winner is...
- **[GTSRB10](GTSRB/batch_oar.sh)** - [OAR batch script submission](GTSRB/batch_oar.sh)
Bash script for an OAR batch submission of an ipython script
- **[GTSRB11](GTSRB/batch_slurm.sh)** - [SLURM batch script](GTSRB/batch_slurm.sh)
Bash script for a Slurm batch submission of an ipython script
### Sentiment analysis with word embedding
- **[IMDB1](IMDB/01-One-hot-encoding.ipynb)** - [Sentiment analysis with one-hot encoding](IMDB/01-One-hot-encoding.ipynb)
A basic example of sentiment analysis with sparse encoding, using a dataset from the Internet Movie Database (IMDB)
- **[IMDB2](IMDB/02-Keras-embedding.ipynb)** - [Sentiment analysis with text embedding](IMDB/02-Keras-embedding.ipynb)
A very classical example of word embedding with a dataset from the Internet Movie Database (IMDB)
- **[IMDB3](IMDB/03-Prediction.ipynb)** - [Reload and reuse a saved model](IMDB/03-Prediction.ipynb)
Retrieving a saved model to perform a sentiment analysis (movie review)
- **[IMDB4](IMDB/04-Show-vectors.ipynb)** - [Reload embedded vectors](IMDB/04-Show-vectors.ipynb)
Retrieving embedded vectors from our trained model
- **[IMDB5](IMDB/05-LSTM-Keras.ipynb)** - [Sentiment analysis with an RNN](IMDB/05-LSTM-Keras.ipynb)
Still the same problem, but with a network combining embedding and RNN
### Time series with Recurrent Neural Network (RNN)
- **[LADYB1](SYNOP/LADYB1-Ladybug.ipynb)** - [Prediction of a 2D trajectory via RNN](SYNOP/LADYB1-Ladybug.ipynb)
Artificial dataset generation and prediction attempt via a recurrent network
- **[SYNOP1](SYNOP/SYNOP1-Preparation-of-data.ipynb)** - [Preparation of data](SYNOP/SYNOP1-Preparation-of-data.ipynb)
Episode 1: Data analysis and preparation of a usable meteorological dataset (SYNOP)
- **[SYNOP2](SYNOP/SYNOP2-First-predictions.ipynb)** - [First predictions at 3h](SYNOP/SYNOP2-First-predictions.ipynb)
Episode 2: RNN training session for a weather prediction attempt at 3h
- **[SYNOP3](SYNOP/SYNOP3-12h-predictions.ipynb)** - [12h predictions](SYNOP/SYNOP3-12h-predictions.ipynb)
Episode 3: Attempt to predict over a longer term
### Unsupervised learning with an autoencoder neural network (AE)
- **[AE1](AE/01-Prepare-MNIST-dataset.ipynb)** - [Prepare a noisy MNIST dataset](AE/01-Prepare-MNIST-dataset.ipynb)
Episode 1: Preparation of a noisy MNIST dataset
- **[AE2](AE/02-AE-with-MNIST.ipynb)** - [Building and training an AE denoiser model](AE/02-AE-with-MNIST.ipynb)
Episode 2: Construction of a denoising autoencoder and training it with a noisy MNIST dataset.
- **[AE3](AE/03-AE-with-MNIST-post.ipynb)** - [Playing with our denoiser model](AE/03-AE-with-MNIST-post.ipynb)
Episode 3: Using the previously trained autoencoder to denoise data
- **[AE4](AE/04-ExtAE-with-MNIST.ipynb)** - [Denoiser and classifier model](AE/04-ExtAE-with-MNIST.ipynb)
Episode 4: Construction of a denoiser and classifier model
- **[AE5](AE/05-ExtAE-with-MNIST.ipynb)** - [Advanced denoiser and classifier model](AE/05-ExtAE-with-MNIST.ipynb)
Episode 5: Construction of an advanced denoiser and classifier model
### Generative network with Variational Autoencoder (VAE)
- **[VAE1](VAE/01-VAE-with-MNIST.ipynb)** - [First VAE, with a small dataset (MNIST)](VAE/01-VAE-with-MNIST.ipynb)
Construction and training of a VAE with a latent space of small dimension.
- **[VAE2](VAE/02-VAE-with-MNIST-post.ipynb)** - [Analysis of the associated latent space](VAE/02-VAE-with-MNIST-post.ipynb)
Visualization and analysis of the VAE's latent space
- **[VAE5](VAE/05-About-CelebA.ipynb)** - [Another playground: the CelebA dataset](VAE/05-About-CelebA.ipynb)
Episode 1: Presentation of the CelebA dataset and problems related to its size
- **[VAE6](VAE/06-Prepare-CelebA-datasets.ipynb)** - [Generation of a clustered dataset](VAE/06-Prepare-CelebA-datasets.ipynb)
Episode 2: Analysis of the CelebA dataset and creation of a clustered and usable dataset
- **[VAE7](VAE/07-Check-CelebA.ipynb)** - [Checking the clustered dataset](VAE/07-Check-CelebA.ipynb)
Episode 3: Clustered dataset verification and testing of our data generator
- **[VAE8](VAE/08-VAE-with-CelebA.ipynb)** - [Training session for our VAE](VAE/08-VAE-with-CelebA.ipynb)
Episode 4: Training with our clustered datasets in notebook or batch mode
- **[VAE9](VAE/09-VAE-with-CelebA-post.ipynb)** - [Data generation from latent space](VAE/09-VAE-with-CelebA-post.ipynb)
Episode 5: Exploring the latent space to generate new data
- **[VAE10](VAE/batch_slurm.sh)** - [SLURM batch script](VAE/batch_slurm.sh)
Bash script for a SLURM batch submission of the VAE8 notebook
### Miscellaneous
- **[ACTF1](Misc/Activation-Functions.ipynb)** - [Activation functions](Misc/Activation-Functions.ipynb)
Some activation functions, with their derivatives.
- **[NP1](Misc/Numpy.ipynb)** - [A short introduction to Numpy](Misc/Numpy.ipynb)
Numpy is an essential tool for scientific Python.
- **[SCRATCH1](Misc/Scratchbook.ipynb)** - [Scratchbook](Misc/Scratchbook.ipynb)
A scratchbook for small examples
- **[TSB1](Misc/Using-Tensorboard.ipynb)** - [Tensorboard with/from Jupyter](Misc/Using-Tensorboard.ipynb)
4 ways to use Tensorboard from the Jupyter environment
<!-- INDEX_END -->
## Installation
Have a look at **[How to get and install](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Using%20Fidle/install%20fidle)** these notebooks and datasets.
## Licence
[<img width="100px" src="fidle/img/00-fidle-CC BY-NC-SA.svg"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/)
\[en\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
\[Fr\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode).
See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#).
----
[<img width="80px" src="fidle/img/00-Fidle-logo-01.svg"></img>](#top)
%% Cell type:markdown id: tags:
<img width="800px" src="../fidle/img/00-Fidle-header-01.svg"></img>
## Updating the notebook catalog and the READMEs
- Generation of the **notebook catalog**: [./logs/catalog.json](./logs/catalog.json)
  This file contains a detailed list of all the notebooks and scripts (see the sketch of an entry below).
- Automatic generation of the table of contents and update of the **README** files
  - [README.md](../README.md)
  - [README.ipynb](../README.ipynb)
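For reference, a catalog entry roughly follows the shape consumed in step 3.2 below (keys `id`, `dirname`, `basename`, `title`, `description`). The entry shown here is illustrative only; the real file is produced by `fidle.cookindex.get_catalog()`.
``` python
# Illustrative example of a catalog.json entry (values taken from the index above);
# the actual entries are generated by fidle.cookindex, not by this snippet.
example_entry = {
    "LINR1": {
        "id"         : "LINR1",
        "dirname"    : "LinearReg",
        "basename"   : "01-Linear-Regression.ipynb",
        "title"      : "Linear regression with direct resolution",
        "description": "Low-level implementation, using numpy, of a direct resolution for a linear regression"
    }
}
```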
%% Cell type:markdown id: tags:
## Step 1 - Load modules and init
%% Cell type:code id: tags:
``` python
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor
from IPython.display import display,Image,Markdown,HTML
import re
import sys, os, glob
import datetime, time
import json
from collections import OrderedDict
from importlib import reload
sys.path.append('..')
import fidle.pwk as pwk
import fidle.config as config
import fidle.cookindex as cookindex
```
%% Cell type:markdown id: tags:
## Step 2 - List of folders containing notebooks to be indexed:
The order here will be the index order (see the quick check after the cell below).
%% Cell type:code id: tags:
``` python
directories_to_index = {'LinearReg':'Linear and logistic regression',
                        'IRIS':'Perceptron Model 1957',
                        'BHPD':'Basic regression using DNN',
                        'MNIST':'Basic classification using a DNN',
                        'GTSRB':'Image classification with Convolutional Neural Networks (CNN)',
                        'IMDB':'Sentiment analysis with word embedding',
                        'SYNOP':'Time series with Recurrent Neural Network (RNN)',
                        'AE':'Unsupervised learning with an autoencoder neural network (AE)',
                        'VAE':'Generative network with Variational Autoencoder (VAE)',
                        'Misc':'Miscellaneous'
                        }
```
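A quick, optional check of that ordering: a plain `dict` preserves insertion order in Python 3.7+, so the index sections will appear exactly as declared above. This snippet simply assumes the cell above has been run.
``` python
# Sections will be indexed in the order in which the folders were declared
for directory, title in directories_to_index.items():
    print(f'{directory:10s} -> {title}')
```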
%% Cell type:markdown id: tags:
## Step 3 - Catalog of notebooks
### 3.1 - Build catalog
%% Cell type:code id: tags:
``` python
# ---- Get the notebook list
#
files_list = cookindex.get_files(directories_to_index.keys())
# ---- Get a detailed catalog for this list
#
catalog = cookindex.get_catalog(files_list)
with open(config.CATALOG_FILE,'wt') as fp:
    json.dump(catalog,fp,indent=4)
print(f'Catalog saved as {config.CATALOG_FILE}')
print('Entries : ',len(catalog))
```
%% Output
Read : LinearReg/01-Linear-Regression.ipynb
Read : LinearReg/02-Gradient-descent.ipynb
Read : LinearReg/03-Polynomial-Regression.ipynb
Read : LinearReg/04-Logistic-Regression.ipynb
Read : IRIS/01-Simple-Perceptron.ipynb
Read : BHPD/01-DNN-Regression.ipynb
Read : BHPD/02-DNN-Regression-Premium.ipynb
Read : MNIST/01-DNN-MNIST.ipynb
Read : MNIST/02-CNN-MNIST.ipynb
Read : GTSRB/01-Preparation-of-data.ipynb
Read : GTSRB/02-First-convolutions.ipynb
Read : GTSRB/03-Tracking-and-visualizing.ipynb
Read : GTSRB/04-Data-augmentation.ipynb
Read : GTSRB/05-Full-convolutions.ipynb
Read : GTSRB/06-Notebook-as-a-batch.ipynb
Read : GTSRB/07-Show-report.ipynb
Read : IMDB/01-One-hot-encoding.ipynb
Read : IMDB/02-Keras-embedding.ipynb
Read : IMDB/03-Prediction.ipynb
Read : IMDB/04-Show-vectors.ipynb
Read : IMDB/05-LSTM-Keras.ipynb
Read : SYNOP/LADYB1-Ladybug.ipynb
Read : SYNOP/SYNOP1-Preparation-of-data.ipynb
Read : SYNOP/SYNOP2-First-predictions.ipynb
Read : SYNOP/SYNOP3-12h-predictions.ipynb
Read : AE/01-Prepare-MNIST-dataset.ipynb
Read : AE/02-AE-with-MNIST.ipynb
Read : AE/03-AE-with-MNIST-post.ipynb
Read : AE/04-ExtAE-with-MNIST.ipynb
Read : AE/05-ExtAE-with-MNIST.ipynb
Read : VAE/01-VAE-with-MNIST.ipynb
Read : VAE/02-VAE-with-MNIST-post.ipynb
Read : VAE/05-About-CelebA.ipynb
Read : VAE/06-Prepare-CelebA-datasets.ipynb
Read : VAE/07-Check-CelebA.ipynb
Read : VAE/08-VAE-with-CelebA.ipynb
Read : VAE/09-VAE-with-CelebA-post.ipynb
Read : Misc/Activation-Functions.ipynb
Read : Misc/Numpy.ipynb
Read : Misc/Scratchbook.ipynb
Read : Misc/Using-Tensorboard.ipynb
Catalog saved as ../fidle/logs/catalog.json
Entries : 44
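`fidle.cookindex` is part of this repository and its code is not shown in this notebook. As a rough, hypothetical idea of what `get_catalog()` might do (assuming the id comes from the filename and the title/description from each notebook's first markdown cell), one could write something like the sketch below; this is not the actual Fidle implementation.
``` python
# Minimal sketch, NOT the real fidle.cookindex implementation.
# Assumptions: the notebook id is derived from the filename, and the first
# markdown cell holds a title line followed by a short description.
import os, glob
import nbformat

def sketch_get_catalog(directories):
    catalog = {}
    for dirname in directories:
        for path in sorted(glob.glob(f'{dirname}/*.ipynb')):
            nb = nbformat.read(path, nbformat.NO_CONVERT)
            md = next((c.source for c in nb.cells if c.cell_type == 'markdown'), '')
            lines = [l for l in md.splitlines() if l.strip()]
            basename = os.path.basename(path)
            catalog[basename] = {
                'id'         : basename.split('-')[0],                       # hypothetical rule
                'dirname'    : dirname,
                'basename'   : basename,
                'title'      : lines[0].lstrip('# ').strip() if lines else '',
                'description': ' '.join(lines[1:]).strip(),
            }
    return catalog
```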
%% Cell type:markdown id: tags:
### 3.2 - Build index
%% Cell type:code id: tags:
``` python
styles = open('css/readme.css', "r").read()
lines_md=[]
lines_html=[styles]
# ---- Build a markdown and an html version of the index, section by section
#
for directory,title in directories_to_index.items():
    lines_md.append(f'\n### {title}')
    lines_html.append( f'<div class="fid_section">{title}</div>')
    # ---- Catalog entries belonging to this folder
    entries = { k:v for k,v in catalog.items() if v['dirname']==directory }
    for id, about in entries.items():
        id = about['id']
        dirname = about['dirname']
        basename = about['basename']
        title = about['title']
        description = about['description']
        link=f'{dirname}/{basename}'.replace(' ','%20')
        md = f'- **[{id}]({link})** - [{title}]({link}) \n'
        md += f'{description}'
        html = f"""<div class="fid_line">
             <span class="fid_id">
                 <a href="{link}">{id}</a>
             </span> <a href="{link}">{title}</a><br>
             <span class="fid_desc">{description}</span>
          </div>
"""
        lines_md.append(md)
        lines_html.append(html)
index_md = '\n'.join(lines_md)
index_html = '\n'.join(lines_html)
display(Markdown('**Index is :**'))
display(Markdown(index_md))
# display(HTML(index_html))
```
%% Output
**Index is :**
### Linear and logistic regression
- **[LINR1](LinearReg/01-Linear-Regression.ipynb)** - [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb)
Low-level implementation, using numpy, of a direct resolution for a linear regression
- **[GRAD1](LinearReg/02-Gradient-descent.ipynb)** - [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb)
Low-level implementation of a solution by gradient descent. Basic and stochastic approaches.
- **[POLR1](LinearReg/03-Polynomial-Regression.ipynb)** - [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb)
Illustration of the problem of complexity with polynomial regression
- **[LOGR1](LinearReg/04-Logistic-Regression.ipynb)** - [Logistic regression](LinearReg/04-Logistic-Regression.ipynb)
Simple example of logistic regression with a sklearn solution
### Perceptron Model 1957
- **[PER57](IRIS/01-Simple-Perceptron.ipynb)** - [Perceptron Model 1957](IRIS/01-Simple-Perceptron.ipynb)
Example of use of a Perceptron, with sklearn and the IRIS dataset of 1936!
### Basic regression using DNN
- **[BHPD1](BHPD/01-DNN-Regression.ipynb)** - [Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb)
Simple example of a regression with the Boston Housing Prices Dataset (BHPD)
- **[BHPD2](BHPD/02-DNN-Regression-Premium.ipynb)** - [Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb)
A more advanced implementation of the previous example
### Basic classification using a DNN
- **[MNIST1](MNIST/01-DNN-MNIST.ipynb)** - [Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb)
An example of classification using a dense neural network for the famous MNIST dataset
- **[MNIST2](MNIST/02-CNN-MNIST.ipynb)** - [Simple classification with CNN](MNIST/02-CNN-MNIST.ipynb)
An example of classification using a convolutional neural network for the famous MNIST dataset
### Image classification with Convolutional Neural Networks (CNN)
- **[GTSRB1](GTSRB/01-Preparation-of-data.ipynb)** - [Dataset analysis and preparation](GTSRB/01-Preparation-of-data.ipynb)
Episode 1: Analysis of the GTSRB dataset and creation of an enhanced dataset
- **[GTSRB2](GTSRB/02-First-convolutions.ipynb)** - [First convolutions](GTSRB/02-First-convolutions.ipynb)
Episode 2: First convolutions and first classification of our traffic signs
- **[GTSRB3](GTSRB/03-Tracking-and-visualizing.ipynb)** - [Training monitoring](GTSRB/03-Tracking-and-visualizing.ipynb)
Episode 3: Monitoring, analysis and checkpoints during a training session
- **[GTSRB4](GTSRB/04-Data-augmentation.ipynb)** - [Data augmentation](GTSRB/04-Data-augmentation.ipynb)
Episode 4: Adding data by data augmentation when we lack it, to improve our results
- **[GTSRB5](GTSRB/05-Full-convolutions.ipynb)** - [Full convolutions](GTSRB/05-Full-convolutions.ipynb)
Episode 5: A lot of models, a lot of datasets and a lot of results.
- **[GTSRB6](GTSRB/06-Notebook-as-a-batch.ipynb)** - [Full convolutions as a batch](GTSRB/06-Notebook-as-a-batch.ipynb)
Episode 6: To compute bigger, use your notebook in batch mode
- **[GTSRB7](GTSRB/07-Show-report.ipynb)** - [Batch reports](GTSRB/07-Show-report.ipynb)
Episode 7: Displaying our jobs report, and the winner is...
- **[GTSRB10](GTSRB/batch_oar.sh)** - [OAR batch script submission](GTSRB/batch_oar.sh)
Bash script for an OAR batch submission of an ipython script
- **[GTSRB11](GTSRB/batch_slurm.sh)** - [SLURM batch script](GTSRB/batch_slurm.sh)
Bash script for a Slurm batch submission of an ipython script
### Sentiment analysis with word embedding
- **[IMDB1](IMDB/01-One-hot-encoding.ipynb)** - [Sentiment analysis with one-hot encoding](IMDB/01-One-hot-encoding.ipynb)
A basic example of sentiment analysis with sparse encoding, using a dataset from the Internet Movie Database (IMDB)
- **[IMDB2](IMDB/02-Keras-embedding.ipynb)** - [Sentiment analysis with text embedding](IMDB/02-Keras-embedding.ipynb)
A very classical example of word embedding with a dataset from the Internet Movie Database (IMDB)
- **[IMDB3](IMDB/03-Prediction.ipynb)** - [Reload and reuse a saved model](IMDB/03-Prediction.ipynb)
Retrieving a saved model to perform a sentiment analysis (movie review)
- **[IMDB4](IMDB/04-Show-vectors.ipynb)** - [Reload embedded vectors](IMDB/04-Show-vectors.ipynb)
Retrieving embedded vectors from our trained model
- **[IMDB5](IMDB/05-LSTM-Keras.ipynb)** - [Sentiment analysis with an RNN](IMDB/05-LSTM-Keras.ipynb)
Still the same problem, but with a network combining embedding and RNN
### Time series with Recurrent Neural Network (RNN)
- **[LADYB1](SYNOP/LADYB1-Ladybug.ipynb)** - [Prediction of a 2D trajectory via RNN](SYNOP/LADYB1-Ladybug.ipynb)
Artificial dataset generation and prediction attempt via a recurrent network
- **[SYNOP1](SYNOP/SYNOP1-Preparation-of-data.ipynb)** - [Preparation of data](SYNOP/SYNOP1-Preparation-of-data.ipynb)
Episode 1: Data analysis and preparation of a usable meteorological dataset (SYNOP)
- **[SYNOP2](SYNOP/SYNOP2-First-predictions.ipynb)** - [First predictions at 3h](SYNOP/SYNOP2-First-predictions.ipynb)
Episode 2: RNN training session for a weather prediction attempt at 3h
- **[SYNOP3](SYNOP/SYNOP3-12h-predictions.ipynb)** - [12h predictions](SYNOP/SYNOP3-12h-predictions.ipynb)
Episode 3: Attempt to predict over a longer term
### Unsupervised learning with an autoencoder neural network (AE)
- **[AE1](AE/01-Prepare-MNIST-dataset.ipynb)** - [Prepare a noisy MNIST dataset](AE/01-Prepare-MNIST-dataset.ipynb)
Episode 1: Preparation of a noisy MNIST dataset
- **[AE2](AE/02-AE-with-MNIST.ipynb)** - [Building and training an AE denoiser model](AE/02-AE-with-MNIST.ipynb)
Episode 2: Construction of a denoising autoencoder and training it with a noisy MNIST dataset.
- **[AE3](AE/03-AE-with-MNIST-post.ipynb)** - [Playing with our denoiser model](AE/03-AE-with-MNIST-post.ipynb)
Episode 3: Using the previously trained autoencoder to denoise data
- **[AE4](AE/04-ExtAE-with-MNIST.ipynb)** - [Denoiser and classifier model](AE/04-ExtAE-with-MNIST.ipynb)
Episode 4: Construction of a denoiser and classifier model
- **[AE5](AE/05-ExtAE-with-MNIST.ipynb)** - [Advanced denoiser and classifier model](AE/05-ExtAE-with-MNIST.ipynb)
Episode 5: Construction of an advanced denoiser and classifier model
### Generative network with Variational Autoencoder (VAE)
- **[VAE1](VAE/01-VAE-with-MNIST.ipynb)** - [First VAE, with a small dataset (MNIST)](VAE/01-VAE-with-MNIST.ipynb)
Construction and training of a VAE with a latent space of small dimension.
- **[VAE2](VAE/02-VAE-with-MNIST-post.ipynb)** - [Analysis of the associated latent space](VAE/02-VAE-with-MNIST-post.ipynb)
Visualization and analysis of the VAE's latent space
- **[VAE5](VAE/05-About-CelebA.ipynb)** - [Another playground: the CelebA dataset](VAE/05-About-CelebA.ipynb)
Episode 1: Presentation of the CelebA dataset and problems related to its size
- **[VAE6](VAE/06-Prepare-CelebA-datasets.ipynb)** - [Generation of a clustered dataset](VAE/06-Prepare-CelebA-datasets.ipynb)
Episode 2: Analysis of the CelebA dataset and creation of a clustered and usable dataset
- **[VAE7](VAE/07-Check-CelebA.ipynb)** - [Checking the clustered dataset](VAE/07-Check-CelebA.ipynb)
Episode 3: Clustered dataset verification and testing of our data generator
- **[VAE8](VAE/08-VAE-with-CelebA.ipynb)** - [Training session for our VAE](VAE/08-VAE-with-CelebA.ipynb)
Episode 4: Training with our clustered datasets in notebook or batch mode
- **[VAE9](VAE/09-VAE-with-CelebA-post.ipynb)** - [Data generation from latent space](VAE/09-VAE-with-CelebA-post.ipynb)
Episode 5: Exploring the latent space to generate new data
- **[VAE10](VAE/batch_slurm.sh)** - [SLURM batch script](VAE/batch_slurm.sh)
Bash script for a SLURM batch submission of the VAE8 notebook
### Miscellaneous
- **[ACTF1](Misc/Activation-Functions.ipynb)** - [Activation functions](Misc/Activation-Functions.ipynb)
Some activation functions, with their derivatives.
- **[NP1](Misc/Numpy.ipynb)** - [A short introduction to Numpy](Misc/Numpy.ipynb)
Numpy is an essential tool for scientific Python.
- **[SCRATCH1](Misc/Scratchbook.ipynb)** - [Scratchbook](Misc/Scratchbook.ipynb)
A scratchbook for small examples
- **[TSB1](Misc/Using-Tensorboard.ipynb)** - [Tensorboard with/from Jupyter](Misc/Using-Tensorboard.ipynb)
4 ways to use Tensorboard from the Jupyter environment
%% Cell type:markdown id: tags:
## Step 4 - Update README.md
%% Cell type:code id: tags:
``` python
# ---- Load README.md
#
with open('../README.md','r') as fp:
    readme=fp.read()
# ---- Update index, version
#
readme = cookindex.tag('INDEX', index_md, readme)
readme = cookindex.tag('VERSION', f'**{config.VERSION}**', readme)
# ---- Save it
#
with open('../README.md','wt') as fp:
    fp.write(readme)
print('README.md is updated.')
```
%% Output
README.md is updated.
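`cookindex.tag()` itself is not shown in this notebook. Given the `<!-- INDEX_BEGIN -->` / `<!-- INDEX_END -->` and `<!-- VERSION_BEGIN -->` / `<!-- VERSION_END -->` markers visible in the README above, a minimal re-implementation could look like the sketch below; this is an assumption about its behaviour, not the actual Fidle code.
``` python
import re

def tag(name, content, text):
    # Replace whatever currently lies between <!-- NAME_BEGIN --> and <!-- NAME_END -->,
    # keeping the markers so the operation can be repeated at each release.
    pattern     = rf'<!-- {name}_BEGIN -->.*?<!-- {name}_END -->'
    replacement = f'<!-- {name}_BEGIN -->\n{content}\n<!-- {name}_END -->'
    return re.sub(pattern, lambda _: replacement, text, flags=re.DOTALL)
```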
%% Cell type:markdown id: tags:
## Step 5 - README.ipynb
Just execute README.ipynb
%% Cell type:raw id: tags:
# ---- Load notebook
#
notebook = nbformat.read('../README.ipynb', nbformat.NO_CONVERT)
# new_cell = nbformat.v4.new_markdown_cell(source=readme)
# notebook.cells.append(new_cell)
# ---- Execute it
#
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(notebook, {'metadata': {'path': '..'}})
# ---- Save it
with open('../READMEv2.ipynb', mode="w", encoding='utf-8') as fp:
    nbformat.write(notebook, fp)
%% Cell type:markdown id: tags:
## Step 6 - More fun: Create and execute it :-)
%% Cell type:markdown id: tags:
Even more fun: we will build README.ipynb from scratch and execute it :-)
%% Cell type:code id: tags:
``` python
# ---- Create Notebook from scratch
#
notebook = nbformat.v4.new_notebook()
# ---- Add a code cell
#
code = "from IPython.display import display,Markdown\n"
code+= "display(Markdown(open('README.md', 'r').read()))\n"
code+= "#\n"
code+= "# This README is visible under Jupyter Lab! :-)"
new_cell = nbformat.v4.new_code_cell(source=code)
new_cell['metadata']= { "jupyter": { "source_hidden": True} }
notebook.cells.append(new_cell)
# --- To avoid a modification when the notebook is opened
#     (not blocking, but it would require re-saving the document when it is closed...)
notebook['metadata']["kernelspec"] = {"display_name": "Python 3", "language": "python", "name": "python3" }
# ---- Run it
#
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(notebook, {'metadata': {'path': '..'}})
# ---- Save it
#
with open('../README.ipynb', mode="w", encoding='utf-8') as fp:
    nbformat.write(notebook, fp)
print('README.ipynb built and saved')
```
%% Output
README.ipynb built and saved
%% Cell type:code id: tags:
``` python
now = datetime.datetime.now()
print('Completed on : ', now.strftime("%A %d %B %Y, %H:%M:%S"))
```
%% Output
Completed on : Friday 19 March 2021, 15:01:12
%% Cell type:markdown id: tags:
---
<img width="80px" src="../fidle/img/00-Fidle-logo-01.svg"></img>
New file: fidle/img/VAE.jpg (19.4 KiB)