"id": "1f828036",
"iopub.execute_input": "2024-01-21T16:21:09.860108Z",
"iopub.status.busy": "2024-01-21T16:21:09.859792Z",
"iopub.status.idle": "2024-01-21T16:21:09.870962Z",
"shell.execute_reply": "2024-01-21T16:21:09.870075Z"
"jupyter": {
"source_hidden": true
}
},
"outputs": [
{
"data": {
"text/markdown": [
"<a name=\"top\"></a>\n",
"\n",
"[<img width=\"600px\" src=\"fidle/img/title.svg\"></img>](#top)\n",
"<!-- --------------------------------------------------- -->\n",
"<!-- To correctly view this README under Jupyter Lab -->\n",
"<!-- Open the notebook: README.ipynb! -->\n",
"<!-- --------------------------------------------------- -->\n",
"\n",
"This repository contains all the documents and links of the **Fidle Training** . \n",
"Fidle (for Formation Introduction au Deep Learning) is a 3-day training session co-organized \n",
"by the 3IA MIAI institute, the CNRS, via the Mission for Transversal and Interdisciplinary \n",
"Initiatives (MITI) and the University of Grenoble Alpes (UGA). \n",
"The objectives of this training are :\n",
" - Understanding the **bases of Deep Learning** neural networks\n",
" - Develop a **first experience** through simple and representative examples\n",
" - Understanding **Tensorflow/Keras** and **Jupyter lab** technologies\n",
" - Apprehend the **academic computing environments** Tier-2 or Tier-1 with powerfull GPU\n",
"\n",
"For more information, see **https://fidle.cnrs.fr** :\n",
"- **[Presentation of the training](https://fidle.cnrs.fr/presentation)**\n",
"- **[Detailed program](https://fidle.cnrs.fr/programme)**\n",
"- [Subscribe to the list](https://fidle.cnrs.fr/listeinfo), to stay informed !\n",
"- [Find us on youtube](https://fidle.cnrs.fr/youtube)\n",
"- [Corrected notebooks](https://fidle.cnrs.fr/done)\n",
"[<img width=\"200px\" style=\"vertical-align:middle\" src=\"fidle/img/00-Mail_contact.svg\"></img>](#top)\n",
"Current Version : <!-- VERSION_BEGIN -->2.5.4<!-- VERSION_END -->\n",
"| **[<img width=\"50px\" src=\"fidle/img/00-Fidle-pdf.svg\"></img><br>Course slides](https://fidle.cnrs.fr/supports)**<br>The course in pdf format<br>| **[<img width=\"50px\" src=\"fidle/img/00-Notebooks.svg\"></img><br>Notebooks](https://fidle.cnrs.fr/notebooks)**<br> Get a Zip or clone this repository <br>| **[<img width=\"50px\" src=\"fidle/img/00-Datasets-tar.svg\"></img><br>Datasets](https://fidle.cnrs.fr/datasets-fidle.tar)**<br>All the needed datasets<br>|**[<img width=\"50px\" src=\"fidle/img/00-Videos.svg\"></img><br>Videos](https://fidle.cnrs.fr/youtube)**<br> Our Youtube channel <br> |\n",
"Have a look about **[How to get and install](https://fidle.cnrs.fr/installation)** these notebooks and datasets.\n",
"\n",
"\n",
"## Jupyter notebooks\n",
"\n",
"**NOTE :** The examples marked **\"obsolete\"** are still functional under Keras2/Tensorflow, \n",
"but cannot be run in the proposed environment, now based on Keras3, PyTorch and Lightning. \n",
"We have decided to consider Keras2/Tensorflow as pedagogically obsolete, although Keras2 and Tensorflow are still perfectly usable (January 2024). \n",
"For these reason, they are kept as examples, while we develop the Keras3/PyTorch versions. \n",
"The world of Deep Learning is changing very fast !\n",
"\n",
"<!-- Automatically generated on : 21/01/24 17:21:08 -->\n",
"### Linear and logistic regression\n",
"- **[LINR1](LinearReg/01-Linear-Regression.ipynb)** - [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb) \n",
"Low-level implementation, using numpy, of a direct resolution for a linear regression\n",
"- **[GRAD1](LinearReg/02-Gradient-descent.ipynb)** - [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb) \n",
"Low level implementation of a solution by gradient descent. Basic and stochastic approach.\n",
"- **[POLR1](LinearReg/03-Polynomial-Regression.ipynb)** - [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb) \n",
"Illustration of the problem of complexity with the polynomial regression\n",
"- **[LOGR1](LinearReg/04-Logistic-Regression.ipynb)** - [Logistic regression](LinearReg/04-Logistic-Regression.ipynb) \n",
"Simple example of logistic regression with a sklearn solution\n",
"\n",
"### Perceptron Model 1957\n",
"- **[PER57](Perceptron/01-Simple-Perceptron.ipynb)** - [Perceptron Model 1957](Perceptron/01-Simple-Perceptron.ipynb) \n",
"Example of use of a Perceptron, with sklearn and IRIS dataset of 1936 !\n",
"\n",
"### BHPD regression (DNN), using Keras3/PyTorch\n",
"- **[K3BHPD1](BHPD.Keras3/01-DNN-Regression.ipynb)** - [Regression with a Dense Network (DNN)](BHPD.Keras3/01-DNN-Regression.ipynb) \n",
"Simple example of a regression with the dataset Boston Housing Prices Dataset (BHPD)\n",
"- **[K3BHPD2](BHPD.Keras3/02-DNN-Regression-Premium.ipynb)** - [Regression with a Dense Network (DNN) - Advanced code](BHPD.Keras3/02-DNN-Regression-Premium.ipynb) \n",
"A more advanced implementation of the precedent example, using Keras3\n",
"\n",
"### BHPD regression (DNN), using PyTorch\n",
"- **[PBHPD1](BHPD.PyTorch/01-DNN-Regression_PyTorch.ipynb)** - [Regression with a Dense Network (DNN)](BHPD.PyTorch/01-DNN-Regression_PyTorch.ipynb) \n",
"A Simple regression with a Dense Neural Network (DNN) using Pytorch - BHPD dataset\n",
"\n",
"### Wine Quality prediction (DNN), using Keras3/PyTorch\n",
"- **[K3WINE1](Wine.Keras3/01-DNN-Wine-Regression.ipynb)** - [Wine quality prediction with a Dense Network (DNN)](Wine.Keras3/01-DNN-Wine-Regression.ipynb) \n",
"Another example of regression, with a wine quality prediction, using Keras 3 and PyTorch\n",
"\n",
"### Wine Quality prediction (DNN), using PyTorch/Lightning\n",
"- **[LWINE1](Wine.Lightning/01-DNN-Wine-Regression-lightning.ipynb)** - [Wine quality prediction with a Dense Network (DNN)](Wine.Lightning/01-DNN-Wine-Regression-lightning.ipynb) \n",
"Another example of regression, with a wine quality prediction, using PyTorch Lightning\n",
"### MNIST classification (DNN,CNN), using Keras3/PyTorch\n",
"- **[K3MNIST1](MNIST.Keras3/01-DNN-MNIST.ipynb)** - [Simple classification with DNN](MNIST.Keras3/01-DNN-MNIST.ipynb) \n",
"An example of classification using a dense neural network for the famous MNIST dataset\n",
"- **[K3MNIST2](MNIST.Keras3/02-CNN-MNIST.ipynb)** - [Simple classification with CNN](MNIST.Keras3/02-CNN-MNIST.ipynb) \n",
"An example of classification using a convolutional neural network for the famous MNIST dataset\n",
"\n",
"### MNIST classification (DNN,CNN), using PyTorch\n",
"- **[PMNIST1](MNIST.PyTorch/01-DNN-MNIST_PyTorch.ipynb)** - [Simple classification with DNN](MNIST.PyTorch/01-DNN-MNIST_PyTorch.ipynb) \n",
"Example of classification with a fully connected neural network, using Pytorch\n",
"\n",
"### MNIST classification (DNN,CNN), using PyTorch/Lightning\n",
"- **[LMNIST2](MNIST.Lightning/01-DNN-MNIST_Lightning.ipynb)** - [Simple classification with DNN](MNIST.Lightning/01-DNN-MNIST_Lightning.ipynb) \n",
"An example of classification using a dense neural network for the famous MNIST dataset, using PyTorch Lightning\n",
"- **[LMNIST2](MNIST.Lightning/02-CNN-MNIST_Lightning.ipynb)** - [Simple classification with CNN](MNIST.Lightning/02-CNN-MNIST_Lightning.ipynb) \n",
"An example of classification using a convolutional neural network for the famous MNIST dataset, using PyTorch Lightning\n",
"### Images classification GTSRB with Convolutional Neural Networks (CNN), using Keras3/PyTorch\n",
"- **[K3GTSRB1](GTSRB.Keras3/01-Preparation-of-data.ipynb)** - [Dataset analysis and preparation](GTSRB.Keras3/01-Preparation-of-data.ipynb) \n",
"Episode 1 : Analysis of the GTSRB dataset and creation of an enhanced dataset\n",
"- **[K3GTSRB2](GTSRB.Keras3/02-First-convolutions.ipynb)** - [First convolutions](GTSRB.Keras3/02-First-convolutions.ipynb) \n",
"Episode 2 : First convolutions and first classification of our traffic signs, using Keras3\n",
"- **[K3GTSRB3](GTSRB.Keras3/03-Better-convolutions.ipynb)** - [Training monitoring](GTSRB.Keras3/03-Better-convolutions.ipynb) \n",
"Episode 3 : Monitoring, analysis and check points during a training session, using Keras3\n",
"- **[K3GTSRB4](GTSRB.Keras3/04-Keras-cv.ipynb)** - [Hight level example (Keras-cv)](GTSRB.Keras3/04-Keras-cv.ipynb) \n",
"An example of using a pre-trained model with Keras-cv\n",
"- **[K3GTSRB10](GTSRB.Keras3/batch_oar.sh)** - [OAR batch script submission](GTSRB.Keras3/batch_oar.sh) \n",
"Bash script for an OAR batch submission of an ipython code\n",
"- **[K3GTSRB11](GTSRB.Keras3/batch_slurm.sh)** - [SLURM batch script](GTSRB.Keras3/batch_slurm.sh) \n",
"Bash script for a Slurm batch submission of an ipython code\n",
"### Sentiment analysis with word embedding, using Keras3/PyTorch\n",
"- **[K3IMDB1](Embedding.Keras3/01-One-hot-encoding.ipynb)** - [Sentiment analysis with hot-one encoding](Embedding.Keras3/01-One-hot-encoding.ipynb) \n",
"A basic example of sentiment analysis with sparse encoding, using a dataset from Internet Movie Database (IMDB), using Keras 3 on PyTorch\n",
"- **[K3IMDB2](Embedding.Keras3/02-Keras-embedding.ipynb)** - [Sentiment analysis with text embedding](Embedding.Keras3/02-Keras-embedding.ipynb) \n",
"A very classical example of word embedding with a dataset from Internet Movie Database (IMDB), using Keras 3 on PyTorch\n",
"- **[K3IMDB3](Embedding.Keras3/03-Prediction.ipynb)** - [Reload and reuse a saved model](Embedding.Keras3/03-Prediction.ipynb) \n",
"Retrieving a saved model to perform a sentiment analysis (movie review), using Keras 3 and PyTorch\n",
"- **[K3IMDB4](Embedding.Keras3/04-Show-vectors.ipynb)** - [Reload embedded vectors](Embedding.Keras3/04-Show-vectors.ipynb) \n",
"Retrieving embedded vectors from our trained model, using Keras 3 and PyTorch\n",
"- **[K3IMDB5](Embedding.Keras3/05-LSTM-Keras.ipynb)** - [Sentiment analysis with a RNN network](Embedding.Keras3/05-LSTM-Keras.ipynb) \n",
"Still the same problem, but with a network combining embedding and RNN, using Keras 3 and PyTorch\n",
"\n",
"### Time series with Recurrent Neural Network (RNN), using Keras3/PyTorch\n",
"- **[K3LADYB1](RNN.Keras3/01-Ladybug.ipynb)** - [Prediction of a 2D trajectory via RNN](RNN.Keras3/01-Ladybug.ipynb) \n",
"Artificial dataset generation and prediction attempt via a recurrent network, using Keras 3 and PyTorch\n",
"\n",
"### Sentiment analysis with transformer, using PyTorch\n",
"- **[TRANS1](Transformers.PyTorch/01-Distilbert.ipynb)** - [IMDB, Sentiment analysis with Transformers ](Transformers.PyTorch/01-Distilbert.ipynb) \n",
"Using a Tranformer to perform a sentiment analysis (IMDB) - Jean Zay version\n",
"- **[TRANS2](Transformers.PyTorch/02-distilbert_colab.ipynb)** - [IMDB, Sentiment analysis with Transformers ](Transformers.PyTorch/02-distilbert_colab.ipynb) \n",
"Using a Tranformer to perform a sentiment analysis (IMDB) - Colab version\n",
"\n",
"### Unsupervised learning with an autoencoder neural network (AE), using Keras2 (obsolete)\n",
"- **[K2AE1](AE.Keras2/01-Prepare-MNIST-dataset.ipynb)** - [Prepare a noisy MNIST dataset](AE.Keras2/01-Prepare-MNIST-dataset.ipynb) \n",
"Episode 1: Preparation of a noisy MNIST dataset, using Keras 2 and Tensorflow (obsolete)\n",
"- **[K2AE2](AE.Keras2/02-AE-with-MNIST.ipynb)** - [Building and training an AE denoiser model](AE.Keras2/02-AE-with-MNIST.ipynb) \n",
"Episode 1 : Construction of a denoising autoencoder and training of it with a noisy MNIST dataset, using Keras 2 and Tensorflow (obsolete)\n",
"- **[K2AE3](AE.Keras2/03-AE-with-MNIST-post.ipynb)** - [Playing with our denoiser model](AE.Keras2/03-AE-with-MNIST-post.ipynb) \n",
"Episode 2 : Using the previously trained autoencoder to denoise data, using Keras 2 and Tensorflow (obsolete)\n",
"- **[K2AE4](AE.Keras2/04-ExtAE-with-MNIST.ipynb)** - [Denoiser and classifier model](AE.Keras2/04-ExtAE-with-MNIST.ipynb) \n",
"Episode 4 : Construction of a denoiser and classifier model, using Keras 2 and Tensorflow (obsolete)\n",
"- **[K2AE5](AE.Keras2/05-ExtAE-with-MNIST.ipynb)** - [Advanced denoiser and classifier model](AE.Keras2/05-ExtAE-with-MNIST.ipynb) \n",
"Episode 5 : Construction of an advanced denoiser and classifier model, using Keras 2 and Tensorflow (obsolete)\n",
"\n",
"### Generative network with Variational Autoencoder (VAE), using Keras2 (obsolete)\n",
"- **[K2VAE1](VAE.Keras2/01-VAE-with-MNIST.ipynb)** - [First VAE, using functional API (MNIST dataset)](VAE.Keras2/01-VAE-with-MNIST.ipynb) \n",
"Construction and training of a VAE, using functional APPI, with a latent space of small dimension, using Keras 2 and Tensorflow (obsolete)\n",
"- **[K2VAE2](VAE.Keras2/02-VAE-with-MNIST.ipynb)** - [VAE, using a custom model class (MNIST dataset)](VAE.Keras2/02-VAE-with-MNIST.ipynb) \n",
"Construction and training of a VAE, using model subclass, with a latent space of small dimension, using Keras 2 and Tensorflow (obsolete)\n",
"- **[K2VAE3](VAE.Keras2/03-VAE-with-MNIST-post.ipynb)** - [Analysis of the VAE's latent space of MNIST dataset](VAE.Keras2/03-VAE-with-MNIST-post.ipynb) \n",
"Visualization and analysis of the VAE's latent space of the dataset MNIST, using Keras 2 and Tensorflow (obsolete)\n",
"\n",
"### Generative network with Variational Autoencoder (VAE), using PyTorch Lightning\n",
"- **[LVAE1](VAE.Lightning/01-VAE-lightning-with-MNIST.ipynb)** - [First VAE, using Lightning API (MNIST dataset)](VAE.Lightning/01-VAE-lightning-with-MNIST.ipynb) \n",
"Construction and training of a VAE, using Lightning API, with a latent space of small dimension, using PyTorch Lightning\n",
"- **[LVAE2](VAE.Lightning/02-VAE-with-Lightning-MNIST.ipynb)** - [VAE, using a custom model class (MNIST dataset)](VAE.Lightning/02-VAE-with-Lightning-MNIST.ipynb) \n",
"Construction and training of a VAE, using model subclass, with a latent space of small dimension, using PyTorch Lightninh\n",
"- **[LVAE3](VAE.Lightning/03-VAE-Lightning-with-MNIST-post.ipynb)** - [Analysis of the VAE's latent space of MNIST dataset](VAE.Lightning/03-VAE-Lightning-with-MNIST-post.ipynb) \n",
"Visualization and analysis of the VAE's latent space of the dataset MNIST, using PyTorch Lightning\n",
"\n",
"### Generative Adversarial Networks (GANs), using Lightning\n",
"- **[LSHEEP3](DCGAN.Lightning/01-DCGAN-PL.ipynb)** - [A DCGAN to Draw a Sheep, using Pytorch Lightning](DCGAN.Lightning/01-DCGAN-PL.ipynb) \n",
"\"Draw me a sheep\", revisited with a DCGAN, using Pytorch Lightning\n",
"### Diffusion Model (DDPM) using PyTorch\n",
"- **[DDPM1](DDPM.PyTorch/01-ddpm.ipynb)** - [Fashion MNIST Generation with DDPM](DDPM.PyTorch/01-ddpm.ipynb) \n",
"Diffusion Model example, to generate Fashion MNIST images.\n",
"- **[DDPM2](DDPM.PyTorch/model.py)** - [DDPM Python classes](DDPM.PyTorch/model.py) \n",
"Python classes used by DDMP Example\n",
"\n",
"### Training optimization, using PyTorch\n",
"- **[OPT1](Optimization.PyTorch/01-Apprentissages-rapides-et-Optimisations.ipynb)** - [Training setup optimization](Optimization.PyTorch/01-Apprentissages-rapides-et-Optimisations.ipynb) \n",
"The goal of this notebook is to go through a typical deep learning model training\n",
"\n",
"### Deep Reinforcement Learning (DRL), using PyTorch\n",
"- **[DRL1](DRL.PyTorch/FIDLE_DQNfromScratch.ipynb)** - [Solving CartPole with DQN](DRL.PyTorch/FIDLE_DQNfromScratch.ipynb) \n",
"Using a a Deep Q-Network to play CartPole - an inverted pendulum problem (PyTorch)\n",
"- **[DRL2](DRL.PyTorch/FIDLE_rl_baselines_zoo.ipynb)** - [RL Baselines3 Zoo: Training in Colab](DRL.PyTorch/FIDLE_rl_baselines_zoo.ipynb) \n",
"Demo of Stable baseline3 with Colab\n",
"\n",
"### Miscellaneous things, but very important!\n",
"- **[NP1](Misc/00-Numpy.ipynb)** - [A short introduction to Numpy](Misc/00-Numpy.ipynb) \n",
"Numpy is an essential tool for the Scientific Python.\n",
"- **[ACTF1](Misc/01-Activation-Functions.ipynb)** - [Activation functions](Misc/01-Activation-Functions.ipynb) \n",
"Some activation functions, with their derivatives.\n",
"- **[PANDAS1](Misc/02-Using-pandas.ipynb)** - [Quelques exemples avec Pandas](Misc/02-Using-pandas.ipynb) \n",
"pandas is another essential tool for the Scientific Python.\n",
"- **[PYTORCH1](Misc/03-Using-Pytorch.ipynb)** - [Practical Lab : PyTorch](Misc/03-Using-Pytorch.ipynb) \n",
"PyTorch est l'un des principaux framework utilisé dans le Deep Learning\n",
"- **[TSB1](Misc/04-Using-Tensorboard.ipynb)** - [Tensorboard with/from Jupyter ](Misc/04-Using-Tensorboard.ipynb) \n",
"4 ways to use Tensorboard from the Jupyter environment\n",
"- **[SCRATCH1](Misc/99-Scratchbook.ipynb)** - [Scratchbook](Misc/99-Scratchbook.ipynb) \n",
"A scratchbook for small examples\n",
"**NOTE :** The examples marked **\"obsolete\"** are still functional under Keras2/Tensorflow, \n",
"but cannot be run in the proposed environment, now based on Keras3, PyTorch and Lightning. \n",
"We have decided to consider Keras2/Tensorflow as pedagogically obsolete, although Keras2 and Tensorflow are still perfectly usable (January 2024). \n",
"For these resaon, they are kept as examples, while we develop the Keras3/PyTorch versions. \n",
"The world of Deep Learning is changing very fast !\n",
"Have a look about **[How to get and install](https://fidle.cnrs.fr/installation)** these notebooks and datasets.\n",
"\n",
"## Licence\n",
"\n",
"[<img width=\"100px\" src=\"fidle/img/00-fidle-CC BY-NC-SA.svg\"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/) \n",
"\\[en\\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0) \n",
"\\[Fr\\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International \n",
"See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode). \n",
"See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#). \n",
"\n",
"\n",
"----\n",
"[<img width=\"80px\" src=\"fidle/img/logo-paysage.svg\"></img>](#top)\n"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"from IPython.display import display,Markdown\n",
"display(Markdown(open('README.md', 'r').read()))\n",
"#\n",
"# This README is visible under Jupiter Lab ;-)# Automatically generated on : 21/01/24 17:21:08"
]
}
],
"metadata": {
"kernelspec": {
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",