"iopub.execute_input": "2021-01-08T21:53:24.359601Z",
"iopub.status.busy": "2021-01-08T21:53:24.359126Z",
"iopub.status.idle": "2021-01-08T21:53:24.367752Z",
"shell.execute_reply": "2021-01-08T21:53:24.368077Z"
"jupyter": {
"source_hidden": true
}
},
"outputs": [
{
"data": {
"text/markdown": [
"<a name=\"top\"></a>\n",
"\n",
"[<img width=\"600px\" src=\"fidle/img/00-Fidle-titre-01.svg\"></img>](#top)\n",
"<!-- --------------------------------------------------- -->\n",
"<!-- To correctly view this README under Jupyter Lab -->\n",
"<!-- Open the notebook: README.ipynb! -->\n",
"<!-- --------------------------------------------------- -->\n",
"\n",
"This repository contains all the documents and links of the **Fidle Training** . \n",
"Fidle (for Formation Introduction au Deep Learning) is a 2-day training session \n",
"co-organized by the Formation Permanente CNRS and the SARI and DEVLOG networks. \n",
"The objectives of this training are :\n",
" - Understanding the **bases of Deep Learning** neural networks\n",
" - Develop a **first experience** through simple and representative examples\n",
" - Understanding **Tensorflow/Keras** and **Jupyter lab** technologies\n",
" - Apprehend the **academic computing environments** Tier-2 or Tier-1 with powerfull GPU\n",
"\n",
"For more information, you can contact us at : \n",
"[<img width=\"200px\" style=\"vertical-align:middle\" src=\"fidle/img/00-Mail_contact.svg\"></img>](#top) \n",
"Current Version : <!-- VERSION_BEGIN -->\n",
"| | | |\n",
"|:--:|:--:|:--:|\n",
"| **[<img width=\"50px\" src=\"fidle/img/00-Fidle-pdf.svg\"></img><br>Course slides](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>The course in pdf format<br>(12 Mo)| **[<img width=\"50px\" src=\"fidle/img/00-Notebooks.svg\"></img><br>Notebooks](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/archive/master/fidle-master.zip)**<br> Get a Zip or clone this repository <br>(10 Mo)| **[<img width=\"50px\" src=\"fidle/img/00-Datasets-tar.svg\"></img><br>Datasets](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>All the needed datasets<br>(1.2 Go)|\n",
"\n",
"Have a look about **[How to get and install](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)** these notebooks and datasets.\n",
"\n",
"\n",
"## Jupyter notebooks\n",
"\n",
"<!-- INDEX_BEGIN -->\n",
"### Linear and logistic regression\n",
"- **[LINR1](LinearReg/01-Linear-Regression.ipynb)** - [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb) \n",
"Low-level implementation, using numpy, of a direct resolution for a linear regression\n",
"- **[GRAD1](LinearReg/02-Gradient-descent.ipynb)** - [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb) \n",
"Low level implementation of a solution by gradient descent. Basic and stochastic approach.\n",
"- **[POLR1](LinearReg/03-Polynomial-Regression.ipynb)** - [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb) \n",
"Illustration of the problem of complexity with the polynomial regression\n",
"- **[LOGR1](LinearReg/04-Logistic-Regression.ipynb)** - [Logistic regression](LinearReg/04-Logistic-Regression.ipynb) \n",
"Simple example of logistic regression with a sklearn solution\n",
"\n",
"### Perceptron Model 1957\n",
"- **[PER57](IRIS/01-Simple-Perceptron.ipynb)** - [Perceptron Model 1957](IRIS/01-Simple-Perceptron.ipynb) \n",
"Example of use of a Perceptron, with sklearn and IRIS dataset of 1936 !\n",
"\n",
"### Basic regression using DNN\n",
"- **[BHPD1](BHPD/01-DNN-Regression.ipynb)** - [Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb) \n",
"Simple example of a regression with the dataset Boston Housing Prices Dataset (BHPD)\n",
"- **[BHPD2](BHPD/02-DNN-Regression-Premium.ipynb)** - [Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb) \n",
"A more advanced implementation of the precedent example\n",
"\n",
"### Basic classification using a DNN\n",
"- **[MNIST1](MNIST/01-DNN-MNIST.ipynb)** - [Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb) \n",
"An example of classification using a dense neural network for the famous MNIST dataset\n",
"\n",
"### Images classification with Convolutional Neural Networks (CNN)\n",
"- **[GTSRB1](GTSRB/01-Preparation-of-data.ipynb)** - [Dataset analysis and preparation](GTSRB/01-Preparation-of-data.ipynb) \n",
"Episode 1 : Analysis of the GTSRB dataset and creation of an enhanced dataset\n",
"- **[GTSRB2](GTSRB/02-First-convolutions.ipynb)** - [First convolutions](GTSRB/02-First-convolutions.ipynb) \n",
"Episode 2 : First convolutions and first classification of our traffic signs\n",
"- **[GTSRB3](GTSRB/03-Tracking-and-visualizing.ipynb)** - [Training monitoring](GTSRB/03-Tracking-and-visualizing.ipynb) \n",
"Episode 3 : Monitoring, analysis and check points during a training session\n",
"- **[GTSRB4](GTSRB/04-Data-augmentation.ipynb)** - [Data augmentation ](GTSRB/04-Data-augmentation.ipynb) \n",
"Episode 4 : Adding data by data augmentation when we lack it, to improve our results\n",
"- **[GTSRB5](GTSRB/05-Full-convolutions.ipynb)** - [Full convolutions](GTSRB/05-Full-convolutions.ipynb) \n",
"Episode 5 : A lot of models, a lot of datasets and a lot of results.\n",
"- **[GTSRB6](GTSRB/06-Notebook-as-a-batch.ipynb)** - [Full convolutions as a batch](GTSRB/06-Notebook-as-a-batch.ipynb) \n",
"Episode 6 : To compute bigger, use your notebook in batch mode\n",
"- **[GTSRB7](GTSRB/07-Show-report.ipynb)** - [Batch reportss](GTSRB/07-Show-report.ipynb) \n",
"Episode 7 : Displaying our jobs report, and the winner is...\n",
"- **[GTSRB10](GTSRB/batch_oar.sh)** - [OAR batch script submission](GTSRB/batch_oar.sh) \n",
"Bash script for an OAR batch submission of an ipython code\n",
"- **[GTSRB11](GTSRB/batch_slurm.sh)** - [SLURM batch script](GTSRB/batch_slurm.sh) \n",
"Bash script for a Slurm batch submission of an ipython code\n",
"\n",
"### Sentiment analysis with word embedding\n",
"- **[IMDB1](IMDB/01-Embedding-Keras.ipynb)** - [Sentiment alalysis with text embedding](IMDB/01-Embedding-Keras.ipynb) \n",
"A very classical example of word embedding with a dataset from Internet Movie Database (IMDB)\n",
"- **[IMDB2](IMDB/02-Prediction.ipynb)** - [Reload and reuse a saved model](IMDB/02-Prediction.ipynb) \n",
"Retrieving a saved model to perform a sentiment analysis (movie review)\n",
"- **[IMDB3](IMDB/03-LSTM-Keras.ipynb)** - [Sentiment analysis with a LSTM network](IMDB/03-LSTM-Keras.ipynb) \n",
"Still the same problem, but with a network combining embedding and LSTM\n",
"\n",
"### Time series with Recurrent Neural Network (RNN)\n",
"- **[SYNOP1](SYNOP/01-Preparation-of-data.ipynb)** - [Time series with RNN - Preparation of data](SYNOP/01-Preparation-of-data.ipynb) \n",
"Episode 1 : Data analysis and creation of a usable dataset\n",
"- **[SYNOP2](SYNOP/02-First-predictions.ipynb)** - [Time series with RNN - Try a prediction](SYNOP/02-First-predictions.ipynb) \n",
"Episode 2 : Training session and first predictions\n",
"- **[SYNOP3](SYNOP/03-12h-predictions.ipynb)** - [Time series with RNN - 12h predictions](SYNOP/03-12h-predictions.ipynb) \n",
"Episode 3: Attempt to predict in the longer term \n",
"\n",
"### Unsupervised learning with an autoencoder neural network (AE)\n",
"- **[AE1](AE/01-AE-with-MNIST.ipynb)** - [AutoEncoder (AE) with MNIST](AE/01-AE-with-MNIST.ipynb) \n",
"Episode 1 : Model construction and Training\n",
"- **[AE2](AE/02-AE-with-MNIST-post.ipynb)** - [AutoEncoder (AE) with MNIST - Analysis](AE/02-AE-with-MNIST-post.ipynb) \n",
"Episode 2 : Exploring our denoiser\n",
"\n",
"### Generative network with Variational Autoencoder (VAE)\n",
"- **[VAE1](VAE/01-VAE-with-MNIST.ipynb)** - [Variational AutoEncoder (VAE) with MNIST](VAE/01-VAE-with-MNIST.ipynb) \n",
"Building a simple model with the MNIST dataset\n",
"- **[VAE2](VAE/02-VAE-with-MNIST-post.ipynb)** - [Variational AutoEncoder (VAE) with MNIST - Analysis](VAE/02-VAE-with-MNIST-post.ipynb) \n",
"Visualization and analysis of latent space\n",
"- **[VAE3](VAE/05-About-CelebA.ipynb)** - [About the CelebA dataset](VAE/05-About-CelebA.ipynb) \n",
"Presentation of the CelebA dataset and problems related to its size\n",
"- **[VAE6](VAE/06-Prepare-CelebA-datasets.ipynb)** - [Preparation of the CelebA dataset](VAE/06-Prepare-CelebA-datasets.ipynb) \n",
"Preparation of a clustered dataset, batchable\n",
"- **[VAE7](VAE/07-Check-CelebA.ipynb)** - [Checking the clustered CelebA dataset](VAE/07-Check-CelebA.ipynb) \n",
"Check the clustered dataset\n",
"- **[VAE8](VAE/08-VAE-with-CelebA==1090048==.ipynb)** - [Variational AutoEncoder (VAE) with CelebA (small)](VAE/08-VAE-with-CelebA==1090048==.ipynb) \n",
"Variational AutoEncoder (VAE) with CelebA (small res. 128x128)\n",
"- **[VAE9](VAE/09-VAE-withCelebA-post.ipynb)** - [Variational AutoEncoder (VAE) with CelebA - Analysis](VAE/09-VAE-withCelebA-post.ipynb) \n",
"Exploring latent space of our trained models\n",
"- **[VAE10](VAE/batch_slurm.sh)** - [SLURM batch script](VAE/batch_slurm.sh) \n",
"Bash script for SLURM batch submission of VAE notebooks \n",
"\n",
"### Miscellaneous\n",
"- **[ACTF1](Misc/Activation-Functions.ipynb)** - [Activation functions](Misc/Activation-Functions.ipynb) \n",
"Some activation functions, with their derivatives.\n",
"- **[NP1](Misc/Numpy.ipynb)** - [A short introduction to Numpy](Misc/Numpy.ipynb) \n",
"Numpy is an essential tool for the Scientific Python.\n",
"- **[TSB1](Misc/Using-Tensorboard.ipynb)** - [Tensorboard with/from Jupyter ](Misc/Using-Tensorboard.ipynb) \n",
"4 ways to use Tensorboard from the Jupyter environment\n",
"<!-- INDEX_END -->\n",
"\n",
"\n",
"## Installation\n",
"\n",
"A procedure for **configuring** and **starting Jupyter** is available in the **[Wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)**.\n",
"\n",
"## Licence\n",
"\n",
"[<img width=\"100px\" src=\"fidle/img/00-fidle-CC BY-NC-SA.svg\"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/) \n",
"\\[en\\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0) \n",
"\\[Fr\\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International \n",
"See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode). \n",
"See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#). \n",
"\n",
"\n",
"----\n",
"[<img width=\"80px\" src=\"fidle/img/00-Fidle-logo-01.svg\"></img>](#top)\n"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from IPython.display import display, Markdown\n",
"display(Markdown(open('README.md', 'r').read()))\n",
"#\n",
"# This README is visible under Jupiter LAb ! :-)"
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
}
},
"nbformat": 4,
"nbformat_minor": 4
}