{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"jupyter": {
"source_hidden": true
}
},
"outputs": [
{
"data": {
"text/markdown": [
"[<img width=\"600px\" src=\"fidle/img/00-Fidle-titre-01.svg\"></img>](#)\n",
"\n",
"## A propos\n",
"\n",
"This repository contains all the documents and links of the **Fidle Training**. \n",
"\n",
"The objectives of this training, co-organized by the Formation Permanente CNRS and the SARI and DEVLOG networks, are :\n",
" - Understanding the **bases of deep learning** neural networks (Deep Learning)\n",
" - Develop a **first experience** through simple and representative examples\n",
" - Understand the different types of networks, their **architectures** and their **use cases**.\n",
" - Understanding **Tensorflow/Keras and Jupyter lab** technologies on the GPU\n",
" - Apprehend the **academic computing environments** Tier-2 (meso) and/or Tier-1 (national)\n",
"\n",
"## Course materials\n",
"**[<img width=\"50px\" src=\"fidle/img/00-Fidle-pdf.svg\"></img>\n",
"Get the course slides](https://cloud.univ-grenoble-alpes.fr/index.php/s/z7XZA36xKkMcaTS)** \n",
"\n",
"\n",
"\n",
"<!--  -->\n",
"Useful information is also available in the [wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/home)\n",
"\n",
"\n",
"## Jupyter notebooks\n",
"\n",
"[](https://mybinder.org/v2/git/https%3A%2F%2Fgricad-gitlab.univ-grenoble-alpes.fr%2Ftalks%2Fdeeplearning.git/master?urlpath=lab/tree/index.ipynb)\n",
"\n",
"\n",
"<!-- DO NOT REMOVE THIS TAG !!! -->\n",
"<!-- INDEX -->\n",
"<!-- INDEX_BEGIN -->\n",
"[[NP1] - A short introduction to Numpy](Prerequisites/Numpy.ipynb) \n",
" Numpy is an essential tool for the Scientific Python. \n",
"[[LINR1] - Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb) \n",
" Direct determination of linear regression \n",
"[[GRAD1] - Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb) \n",
" An example of gradient descent in the simple case of a linear regression. \n",
"[[POLR1] - Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb) \n",
" Illustration of the problem of complexity with the polynomial regression \n",
"[[LOGR1] - Logistic regression, in pure Tensorflow](LinearReg/04-Logistic-Regression.ipynb) \n",
" Logistic Regression with Mini-Batch Gradient Descent using pure TensorFlow. \n",
"[[MNIST1] - Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb) \n",
" Example of classification with a fully connected neural network \n",
"[[BHP1] - Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb) \n",
" A Simple regression with a Dense Neural Network (DNN) - BHPD dataset \n",
"[[BHP2] - Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb) \n",
" More advanced example of DNN network code - BHPD dataset \n",
"[[GTS1] - CNN with GTSRB dataset - Data analysis and preparation](GTSRB/01-Preparation-of-data.ipynb) \n",
" Episode 1: Data analysis and creation of a usable dataset \n",
"[[GTS2] - CNN with GTSRB dataset - First convolutions](GTSRB/02-First-convolutions.ipynb) \n",
" Episode 2 : First convolutions and first results \n",
"[[GTS3] - CNN with GTSRB dataset - Monitoring ](GTSRB/03-Tracking-and-visualizing.ipynb) \n",
" Episode 3: Monitoring and analysing training, managing checkpoints \n",
"[[GTS4] - CNN with GTSRB dataset - Data augmentation ](GTSRB/04-Data-augmentation.ipynb) \n",
" Episode 4: Improving the results with data augmentation \n",
"[[GTS5] - CNN with GTSRB dataset - Full convolutions ](GTSRB/05-Full-convolutions.ipynb) \n",
" Episode 5: A lot of models, a lot of datasets and a lot of results. \n",
"[[GTS6] - CNN with GTSRB dataset - Full convolutions as a batch](GTSRB/06-Full-convolutions-batch.ipynb) \n",
" Episode 6 : Run Full convolution notebook as a batch \n",
"[[GTS7] - Full convolutions Report](GTSRB/07-Full-convolutions-reports.ipynb) \n",
" Displaying the reports of the different jobs \n",
"[[TSB1] - Tensorboard with/from Jupyter ](GTSRB/99-Scripts-Tensorboard.ipynb) \n",
" 4 ways to use Tensorboard from the Jupyter environment \n",
"[[IMDB1] - Text embedding with IMDB](IMDB/01-Embedding-Keras.ipynb) \n",
" A very classical example of word embedding for text classification (sentiment analysis) \n",
"[[IMDB2] - Text embedding with IMDB - Reloaded](IMDB/02-Prediction.ipynb) \n",
" Example of reusing a previously saved model \n",
"[[IMDB3] - Text embedding/LSTM model with IMDB](IMDB/03-LSTM-Keras.ipynb) \n",
" Still the same problem, but with a network combining embedding and LSTM \n",
"[[VAE1] - Variational AutoEncoder (VAE) with MNIST](VAE/01-VAE-with-MNIST.ipynb) \n",
" First generative network experience with the MNIST dataset \n",
"[[VAE2] - Variational AutoEncoder (VAE) with MNIST - Analysis](VAE/02-VAE-with-MNIST-post.ipynb) \n",
" Use of the previously trained model, analysis of the results \n",
"[[VAE3] - About the CelebA dataset](VAE/03-Prepare-CelebA.ipynb) \n",
" New VAE experience, but with a larger and more fun dataset \n",
"[[VAE4] - Preparation of the CelebA dataset](VAE/04-Prepare-CelebA-batch.ipynb) \n",
" Preparation of a clustered dataset, batchable \n",
"[[VAE5] - Checking the clustered CelebA dataset](VAE/05-Check-CelebA.ipynb) \n",
" Verification of prepared data from CelebA dataset \n",
"[[VAE6] - Variational AutoEncoder (VAE) with CelebA (small)](VAE/06-VAE-with-CelebA-s.ipynb) \n",
" VAE with a more fun and realistic dataset - small resolution and batchable \n",
"[[VAE7] - Variational AutoEncoder (VAE) with CelebA (medium)](VAE/07-VAE-with-CelebA-m.ipynb) \n",
" VAE with a more fun and realistic dataset - medium resolution and batchable \n",
"[[VAE12] - Variational AutoEncoder (VAE) with CelebA - Analysis](VAE/12-VAE-withCelebA-post.ipynb) \n",
" Use of the previously trained model with CelebA, analysis of the results \n",
"[[BASH1] - OAR batch script](VAE/batch-oar.sh) \n",
" Bash script for OAR batch submission of a notebook \n",
"[[BASH2] - SLURM batch script](VAE/batch-slurm.sh) \n",
" Bash script for SLURM batch submission of a notebook \n",
"<!-- INDEX_END -->\n",
"\n",
"\n",
"## Installation\n",
"\n",
"A procedure for **configuring** and **starting Jupyter** is available in the **[Wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/howto-jupyter)**.\n",
"\n",
"## Licence\n",
"\n",
"[<img width=\"100px\" src=\"fidle/img/00-fidle-CC BY-NC-SA.svg\"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/) \n",
"\\[en\\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0) \n",
"\\[Fr\\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International \n",
"See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode). \n",
"See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#). \n",
"\n",
"\n",
"----\n",
"[<img width=\"80px\" src=\"fidle/img/00-Fidle-logo-01.svg\"></img>](#)"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from IPython.display import display, Markdown\n",
"display(Markdown(open('README.md', 'r').read()))"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.6"
}
},
"nbformat": 4,
"nbformat_minor": 4
}