"id": "173d25d2",
"iopub.execute_input": "2023-11-06T14:13:54.813170Z",
"iopub.status.busy": "2023-11-06T14:13:54.812440Z",
"iopub.status.idle": "2023-11-06T14:13:54.823230Z",
"shell.execute_reply": "2023-11-06T14:13:54.822337Z"
"jupyter": {
"source_hidden": true
}
},
"outputs": [
{
"data": {
"text/markdown": [
"<a name=\"top\"></a>\n",
"\n",
"[<img width=\"600px\" src=\"fidle/img/title.svg\"></img>](#top)\n",
"<!-- --------------------------------------------------- -->\n",
"<!-- To correctly view this README under Jupyter Lab -->\n",
"<!-- Open the notebook: README.ipynb! -->\n",
"<!-- --------------------------------------------------- -->\n",
"\n",
"This repository contains all the documents and links of the **Fidle Training** . \n",
"Fidle (for Formation Introduction au Deep Learning) is a 3-day training session co-organized \n",
"by the 3IA MIAI institute, the CNRS, via the Mission for Transversal and Interdisciplinary \n",
"Initiatives (MITI) and the University of Grenoble Alpes (UGA). \n",
"The objectives of this training are :\n",
" - Understanding the **bases of Deep Learning** neural networks\n",
" - Develop a **first experience** through simple and representative examples\n",
" - Understanding **Tensorflow/Keras** and **Jupyter lab** technologies\n",
" - Apprehend the **academic computing environments** Tier-2 or Tier-1 with powerfull GPU\n",
"\n",
"For more information, see **https://fidle.cnrs.fr** :\n",
"- **[Presentation of the training](https://fidle.cnrs.fr/presentation)**\n",
"- **[Detailed program](https://fidle.cnrs.fr/programme)**\n",
"- [Subscribe to the list](https://fidle.cnrs.fr/listeinfo), to stay informed !\n",
"- [Find us on youtube](https://fidle.cnrs.fr/youtube)\n",
"- [Corrected notebooks](https://fidle.cnrs.fr/done)\n",
"[<img width=\"200px\" style=\"vertical-align:middle\" src=\"fidle/img/00-Mail_contact.svg\"></img>](#top)\n",
"Current Version : <!-- VERSION_BEGIN -->2.4.1<!-- VERSION_END -->\n",
"| **[<img width=\"50px\" src=\"fidle/img/00-Fidle-pdf.svg\"></img><br>Course slides](https://fidle.cnrs.fr/supports)**<br>The course in pdf format<br>| **[<img width=\"50px\" src=\"fidle/img/00-Notebooks.svg\"></img><br>Notebooks](https://fidle.cnrs.fr/notebooks)**<br> Get a Zip or clone this repository <br>| **[<img width=\"50px\" src=\"fidle/img/00-Datasets-tar.svg\"></img><br>Datasets](https://fidle.cnrs.fr/datasets-fidle.tar)**<br>All the needed datasets<br>|**[<img width=\"50px\" src=\"fidle/img/00-Videos.svg\"></img><br>Videos](https://fidle.cnrs.fr/youtube)**<br> Our Youtube channel <br> |\n",
"Have a look about **[How to get and install](https://fidle.cnrs.fr/installation)** these notebooks and datasets.\n",
"\n",
"\n",
"## Jupyter notebooks\n",
"\n",
"<!-- Automatically generated on : 06/11/23 15:13:53 -->\n",
"### Linear and logistic regression\n",
"- **[LINR1](LinearReg/01-Linear-Regression.ipynb)** - [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb) \n",
"Low-level implementation, using numpy, of a direct resolution for a linear regression\n",
"- **[GRAD1](LinearReg/02-Gradient-descent.ipynb)** - [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb) \n",
"Low level implementation of a solution by gradient descent. Basic and stochastic approach.\n",
"- **[POLR1](LinearReg/03-Polynomial-Regression.ipynb)** - [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb) \n",
"Illustration of the problem of complexity with the polynomial regression\n",
"- **[LOGR1](LinearReg/04-Logistic-Regression.ipynb)** - [Logistic regression](LinearReg/04-Logistic-Regression.ipynb) \n",
"Simple example of logistic regression with a sklearn solution\n",
"\n",
"### Perceptron Model 1957\n",
"- **[PER57](Perceptron/01-Simple-Perceptron.ipynb)** - [Perceptron Model 1957](Perceptron/01-Simple-Perceptron.ipynb) \n",
"Example of use of a Perceptron, with sklearn and IRIS dataset of 1936 !\n",
"\n",
"### BHPD regression (DNN), using Keras\n",
"- **[KBHPD1](BHPD.Keras/01-DNN-Regression.ipynb)** - [Regression with a Dense Network (DNN)](BHPD.Keras/01-DNN-Regression.ipynb) \n",
"Simple example of a regression with the dataset Boston Housing Prices Dataset (BHPD)\n",
"- **[KBHPD2](BHPD.Keras/02-DNN-Regression-Premium.ipynb)** - [Regression with a Dense Network (DNN) - Advanced code](BHPD.Keras/02-DNN-Regression-Premium.ipynb) \n",
"A more advanced implementation of the precedent example\n",
"\n",
"### BHPD regression (DNN), using PyTorch\n",
"- **[PBHPD1](BHPD.PyTorch/01-DNN-Regression_PyTorch.ipynb)** - [Regression with a Dense Network (DNN)](BHPD.PyTorch/01-DNN-Regression_PyTorch.ipynb) \n",
"A Simple regression with a Dense Neural Network (DNN) using Pytorch - BHPD dataset\n",
"\n",
"### Wine Quality prediction (DNN), using Keras\n",
"- **[KWINE1](Wine.Keras/01-DNN-Wine-Regression.ipynb)** - [Wine quality prediction with a Dense Network (DNN)](Wine.Keras/01-DNN-Wine-Regression.ipynb) \n",
"Another example of regression, with a wine quality prediction!\n",
"\n",
"### Wine Quality prediction (DNN), using PyTorch\n",
"- **[WINE1](Wine.Lightning/01-DNN-Wine-Regression-lightning.ipynb)** - [Wine quality prediction with a Dense Network (DNN) using Lightning](Wine.Lightning/01-DNN-Wine-Regression-lightning.ipynb) \n",
"Another example of regression, with a wine quality prediction!\n",
"### MNIST classification (DNN,CNN), using Keras\n",
"- **[KMNIST1](MNIST.Keras/01-DNN-MNIST.ipynb)** - [Simple classification with DNN](MNIST.Keras/01-DNN-MNIST.ipynb) \n",
"An example of classification using a dense neural network for the famous MNIST dataset\n",
"- **[KMNIST2](MNIST.Keras/02-CNN-MNIST.ipynb)** - [Simple classification with CNN](MNIST.Keras/02-CNN-MNIST.ipynb) \n",
"An example of classification using a convolutional neural network for the famous MNIST dataset\n",
"\n",
"### MNIST classification (DNN,CNN), using PyTorch\n",
"- **[PMNIST1](MNIST.PyTorch/01-DNN-MNIST_PyTorch.ipynb)** - [Simple classification with DNN](MNIST.PyTorch/01-DNN-MNIST_PyTorch.ipynb) \n",
"Example of classification with a fully connected neural network, using Pytorch\n",
"\n",
"### MNIST classification (DNN,CNN), using Lightning\n",
"- **[MNIST2](MNIST.Lightning/01-DNN-MNIST_Lightning.ipynb)** - [Simple classification with DNN using pytorch lightning](MNIST.Lightning/01-DNN-MNIST_Lightning.ipynb) \n",
"An example of classification using a dense neural network for the famous MNIST dataset\n",
"- **[MNIST2](MNIST.Lightning/02-CNN-MNIST_Lightning.ipynb)** - [Simple classification with CNN using lightning](MNIST.Lightning/02-CNN-MNIST_Lightning.ipynb) \n",
"An example of classification using a convolutional neural network for the famous MNIST dataset\n",
"### Images classification with Convolutional Neural Networks (CNN)\n",
"- **[GTSRB1](GTSRB/01-Preparation-of-data.ipynb)** - [Dataset analysis and preparation](GTSRB/01-Preparation-of-data.ipynb) \n",
"Episode 1 : Analysis of the GTSRB dataset and creation of an enhanced dataset\n",
"- **[GTSRB2](GTSRB/02-First-convolutions.ipynb)** - [First convolutions](GTSRB/02-First-convolutions.ipynb) \n",
"Episode 2 : First convolutions and first classification of our traffic signs\n",
"- **[GTSRB3](GTSRB/03-Tracking-and-visualizing.ipynb)** - [Training monitoring](GTSRB/03-Tracking-and-visualizing.ipynb) \n",
"Episode 3 : Monitoring, analysis and check points during a training session\n",
"- **[GTSRB4](GTSRB/04-Data-augmentation.ipynb)** - [Data augmentation ](GTSRB/04-Data-augmentation.ipynb) \n",
"Episode 4 : Adding data by data augmentation when we lack it, to improve our results\n",
"- **[GTSRB5](GTSRB/05-Full-convolutions.ipynb)** - [Full convolutions](GTSRB/05-Full-convolutions.ipynb) \n",
"Episode 5 : A lot of models, a lot of datasets and a lot of results.\n",
"- **[GTSRB6](GTSRB/06-Notebook-as-a-batch.ipynb)** - [Full convolutions as a batch](GTSRB/06-Notebook-as-a-batch.ipynb) \n",
"Episode 6 : To compute bigger, use your notebook in batch mode\n",
"- **[GTSRB7](GTSRB/07-Show-report.ipynb)** - [Batch reports](GTSRB/07-Show-report.ipynb) \n",
"Episode 7 : Displaying our jobs report, and the winner is...\n",
"- **[GTSRB10](GTSRB/batch_oar.sh)** - [OAR batch script submission](GTSRB/batch_oar.sh) \n",
"Bash script for an OAR batch submission of an ipython code\n",
"- **[GTSRB11](GTSRB/batch_slurm.sh)** - [SLURM batch script](GTSRB/batch_slurm.sh) \n",
"Bash script for a Slurm batch submission of an ipython code\n",
"### Sentiment analysis with word embedding\n",
"- **[IMDB1](IMDB/01-One-hot-encoding.ipynb)** - [Sentiment analysis with hot-one encoding](IMDB/01-One-hot-encoding.ipynb) \n",
"A basic example of sentiment analysis with sparse encoding, using a dataset from Internet Movie Database (IMDB)\n",
"- **[IMDB2](IMDB/02-Keras-embedding.ipynb)** - [Sentiment analysis with text embedding](IMDB/02-Keras-embedding.ipynb) \n",
"A very classical example of word embedding with a dataset from Internet Movie Database (IMDB)\n",
"- **[IMDB3](IMDB/03-Prediction.ipynb)** - [Reload and reuse a saved model](IMDB/03-Prediction.ipynb) \n",
"Retrieving a saved model to perform a sentiment analysis (movie review)\n",
"- **[IMDB4](IMDB/04-Show-vectors.ipynb)** - [Reload embedded vectors](IMDB/04-Show-vectors.ipynb) \n",
"Retrieving embedded vectors from our trained model\n",
"- **[IMDB5](IMDB/05-LSTM-Keras.ipynb)** - [Sentiment analysis with a RNN network](IMDB/05-LSTM-Keras.ipynb) \n",
"Still the same problem, but with a network combining embedding and RNN\n",
"### Time series with Recurrent Neural Network (RNN)\n",
"- **[LADYB1](SYNOP/LADYB1-Ladybug.ipynb)** - [Prediction of a 2D trajectory via RNN](SYNOP/LADYB1-Ladybug.ipynb) \n",
"Artificial dataset generation and prediction attempt via a recurrent network\n",
"- **[SYNOP1](SYNOP/SYNOP1-Preparation-of-data.ipynb)** - [Preparation of data](SYNOP/SYNOP1-Preparation-of-data.ipynb) \n",
"Episode 1 : Data analysis and preparation of a usuable meteorological dataset (SYNOP)\n",
"- **[SYNOP2](SYNOP/SYNOP2-First-predictions.ipynb)** - [First predictions at 3h](SYNOP/SYNOP2-First-predictions.ipynb) \n",
"Episode 2 : RNN training session for weather prediction attempt at 3h\n",
"- **[SYNOP3](SYNOP/SYNOP3-12h-predictions.ipynb)** - [12h predictions](SYNOP/SYNOP3-12h-predictions.ipynb) \n",
"Episode 3: Attempt to predict in a more longer term \n",
"### Sentiment analysis with transformer\n",
"- **[TRANS1](Transformers/01-Distilbert.ipynb)** - [IMDB, Sentiment analysis with Transformers ](Transformers/01-Distilbert.ipynb) \n",
"Using a Tranformer to perform a sentiment analysis (IMDB) - Jean Zay version\n",
"- **[TRANS2](Transformers/02-distilbert_colab.ipynb)** - [IMDB, Sentiment analysis with Transformers ](Transformers/02-distilbert_colab.ipynb) \n",
"Using a Tranformer to perform a sentiment analysis (IMDB) - Colab version\n",
"\n",
"### Unsupervised learning with an autoencoder neural network (AE)\n",
"- **[AE1](AE/01-Prepare-MNIST-dataset.ipynb)** - [Prepare a noisy MNIST dataset](AE/01-Prepare-MNIST-dataset.ipynb) \n",
"Episode 1: Preparation of a noisy MNIST dataset\n",
"- **[AE2](AE/02-AE-with-MNIST.ipynb)** - [Building and training an AE denoiser model](AE/02-AE-with-MNIST.ipynb) \n",
"Episode 1 : Construction of a denoising autoencoder and training of it with a noisy MNIST dataset.\n",
"- **[AE3](AE/03-AE-with-MNIST-post.ipynb)** - [Playing with our denoiser model](AE/03-AE-with-MNIST-post.ipynb) \n",
"Episode 2 : Using the previously trained autoencoder to denoise data\n",
"- **[AE4](AE/04-ExtAE-with-MNIST.ipynb)** - [Denoiser and classifier model](AE/04-ExtAE-with-MNIST.ipynb) \n",
"Episode 4 : Construction of a denoiser and classifier model\n",
"- **[AE5](AE/05-ExtAE-with-MNIST.ipynb)** - [Advanced denoiser and classifier model](AE/05-ExtAE-with-MNIST.ipynb) \n",
"Episode 5 : Construction of an advanced denoiser and classifier model\n",
"### Generative network with Variational Autoencoder (VAE)\n",
"- **[VAE1](VAE/01-VAE-with-MNIST.ipynb)** - [First VAE, using functional API (MNIST dataset)](VAE/01-VAE-with-MNIST.ipynb) \n",
"Construction and training of a VAE, using functional APPI, with a latent space of small dimension.\n",
"- **[VAE2](VAE/02-VAE-with-MNIST.ipynb)** - [VAE, using a custom model class (MNIST dataset)](VAE/02-VAE-with-MNIST.ipynb) \n",
"Construction and training of a VAE, using model subclass, with a latent space of small dimension.\n",
"- **[VAE3](VAE/03-VAE-with-MNIST-post.ipynb)** - [Analysis of the VAE's latent space of MNIST dataset](VAE/03-VAE-with-MNIST-post.ipynb) \n",
"Visualization and analysis of the VAE's latent space of the dataset MNIST\n",
"### Generative Adversarial Networks (GANs), using Keras\n",
"- **[SHEEP1](DCGAN.Keras/01-DCGAN-Draw-me-a-sheep.ipynb)** - [A first DCGAN to Draw a Sheep](DCGAN.Keras/01-DCGAN-Draw-me-a-sheep.ipynb) \n",
"\"Draw me a sheep\", revisited with a DCGAN\n",
"- **[SHEEP2](DCGAN.Keras/02-WGANGP-Draw-me-a-sheep.ipynb)** - [A WGAN-GP to Draw a Sheep](DCGAN.Keras/02-WGANGP-Draw-me-a-sheep.ipynb) \n",
"\"Draw me a sheep\", revisited with a WGAN-GP\n",
"\n",
"### Generative Adversarial Networks (GANs), using Lightning\n",
"- **[SHEEP3](DCGAN.Lightning/01-DCGAN-PL.ipynb)** - [A DCGAN to Draw a Sheep, using Pytorch Lightning](DCGAN.Lightning/01-DCGAN-PL.ipynb) \n",
"\"Draw me a sheep\", revisited with a DCGAN, using Pytorch Lightning\n",
"### Diffusion Model (DDPM)\n",
"- **[DDPM1](DDPM/01-ddpm.ipynb)** - [Fashion MNIST Generation with DDPM](DDPM/01-ddpm.ipynb) \n",
"Diffusion Model example, to generate Fashion MNIST images.\n",
"- **[DDPM2](DDPM/model.py)** - [DDPM Python classes](DDPM/model.py) \n",
"Python classes used by DDMP Example\n",
"\n",
"### Training optimization\n",
"- **[OPT1](Optimization/01-Apprentissages-rapides-et-Optimisations.ipynb)** - [Training setup optimization](Optimization/01-Apprentissages-rapides-et-Optimisations.ipynb) \n",
"The goal of this notebook is to go through a typical deep learning model training\n",
"\n",
"### Deep Reinforcement Learning (DRL)\n",
"- **[DRL1](DRL/FIDLE_DQNfromScratch.ipynb)** - [Solving CartPole with DQN](DRL/FIDLE_DQNfromScratch.ipynb) \n",
"Using a a Deep Q-Network to play CartPole - an inverted pendulum problem (PyTorch)\n",
"- **[DRL2](DRL/FIDLE_rl_baselines_zoo.ipynb)** - [RL Baselines3 Zoo: Training in Colab](DRL/FIDLE_rl_baselines_zoo.ipynb) \n",
"Demo of Stable baseline3 with Colab\n",
"\n",
"### Miscellaneous things, but very important!\n",
"- **[NP1](Misc/00-Numpy.ipynb)** - [A short introduction to Numpy](Misc/00-Numpy.ipynb) \n",
"Numpy is an essential tool for the Scientific Python.\n",
"- **[ACTF1](Misc/01-Activation-Functions.ipynb)** - [Activation functions](Misc/01-Activation-Functions.ipynb) \n",
"Some activation functions, with their derivatives.\n",
"- **[PANDAS1](Misc/02-Using-pandas.ipynb)** - [Quelques exemples avec Pandas](Misc/02-Using-pandas.ipynb) \n",
"pandas is another essential tool for the Scientific Python.\n",
"- **[PYTORCH1](Misc/03-Using-Pytorch.ipynb)** - [Practical Lab : PyTorch](Misc/03-Using-Pytorch.ipynb) \n",
"PyTorch est l'un des principaux framework utilisé dans le Deep Learning\n",
"- **[TSB1](Misc/04-Using-Tensorboard.ipynb)** - [Tensorboard with/from Jupyter ](Misc/04-Using-Tensorboard.ipynb) \n",
"4 ways to use Tensorboard from the Jupyter environment\n",
"- **[SCRATCH1](Misc/99-Scratchbook.ipynb)** - [Scratchbook](Misc/99-Scratchbook.ipynb) \n",
"A scratchbook for small examples\n",
"\n",
"\n",
"## Installation\n",
"\n",
"Have a look about **[How to get and install](https://fidle.cnrs.fr/installation)** these notebooks and datasets.\n",
"\n",
"## Licence\n",
"\n",
"[<img width=\"100px\" src=\"fidle/img/00-fidle-CC BY-NC-SA.svg\"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/) \n",
"\\[en\\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0) \n",
"\\[Fr\\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International \n",
"See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode). \n",
"See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#). \n",
"\n",
"\n",
"----\n",
"[<img width=\"80px\" src=\"fidle/img/logo-paysage.svg\"></img>](#top)\n"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"from IPython.display import display,Markdown\n",
"display(Markdown(open('README.md', 'r').read()))\n",
"#\n",
"# This README is visible under Jupiter Lab ;-)# Automatically generated on : 06/11/23 15:13:53"
]
}
],
"metadata": {
"kernelspec": {
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",