About
This repository contains all the documents and links for the Fidle training.
The objectives of this training, co-organized by the Formation Permanente CNRS and the SARI and DEVLOG networks, are:
- Understand the basics of Deep Learning neural networks
- Develop a first hands-on experience through simple and representative examples
- Understand the TensorFlow/Keras and JupyterLab technologies
- Get to grips with Tier-2 and Tier-1 academic computing environments with powerful GPUs
Current version: 0.2
Course materials
Useful information is also available in the wiki.
Jupyter notebooks
[LINR1] - Linear regression with direct resolution
Direct analytical determination of a linear regression
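For a flavour of LINR1, here is a minimal sketch of the closed-form (normal equation) solution; the synthetic data and all values are made up for this example:

```python
import numpy as np

# Synthetic data: y = 3x + 2 plus noise (made up for this sketch)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 3 * x + 2 + rng.normal(0, 1, size=100)

# Design matrix with a bias column, then solve (X^T X) theta = X^T y
X = np.column_stack([np.ones_like(x), x])
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(f"intercept = {theta[0]:.2f}, slope = {theta[1]:.2f}")
```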
[GRAD1] - Linear regression with gradient descent
An example of gradient descent in the simple case of a linear regression.
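The same model can also be fitted iteratively; a minimal sketch of batch gradient descent on the MSE loss (learning rate and epoch count are arbitrary assumptions):

```python
import numpy as np

# Same linear model as above, fitted by gradient descent on the MSE loss
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 3 * x + 2 + rng.normal(0, 1, size=100)

w, b, lr = 0.0, 0.0, 0.01
for epoch in range(1000):
    y_pred = w * x + b
    # Gradients of MSE = mean((y_pred - y)^2) with respect to w and b
    grad_w = 2 * np.mean((y_pred - y) * x)
    grad_b = 2 * np.mean(y_pred - y)
    w -= lr * grad_w
    b -= lr * grad_b
print(f"w = {w:.2f}, b = {b:.2f}")
```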
[POLR1] - Complexity Syndrome
Illustration of the model-complexity problem, using polynomial regression
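To see the effect for yourself, a short sketch comparing polynomial fits of increasing degree (degrees and noise level are arbitrary choices): higher degrees drive the training error down by fitting the noise:

```python
import numpy as np

# Fit polynomials of increasing degree to the same noisy data
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 20)
y = np.sin(3 * x) + rng.normal(0, 0.2, size=20)

for degree in (1, 3, 12):
    coeffs = np.polyfit(x, y, degree)
    err = np.mean((np.polyval(coeffs, x) - y) ** 2)
    print(f"degree {degree:2d}: training MSE = {err:.4f}")
```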
[LOGR1] - Logistic regression, in pure TensorFlow
Logistic Regression with Mini-Batch Gradient Descent using pure TensorFlow.
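A hedged sketch of this approach, using tf.GradientTape and tf.data for the mini-batches (the toy data, learning rate, and batch size are assumptions, not the notebook's exact code):

```python
import numpy as np
import tensorflow as tf

# Toy binary classification data (made up for this sketch)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32").reshape(-1, 1)

w = tf.Variable(tf.zeros([2, 1]))
b = tf.Variable(tf.zeros([1]))
dataset = tf.data.Dataset.from_tensor_slices((X, y)).shuffle(1000).batch(32)

for epoch in range(5):
    for xb, yb in dataset:
        with tf.GradientTape() as tape:
            logits = xb @ w + b
            loss = tf.reduce_mean(
                tf.nn.sigmoid_cross_entropy_with_logits(labels=yb, logits=logits))
        dw, db = tape.gradient(loss, [w, b])
        w.assign_sub(0.1 * dw)   # plain mini-batch gradient descent step
        b.assign_sub(0.1 * db)
print("final loss:", float(loss))
```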
[PER57] - Perceptron Model 1957
A simple perceptron, with the IRIS dataset.
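For illustration, a compact sketch of Rosenblatt's update rule on the two linearly separable Iris classes (scikit-learn is used only to load the data; the epoch count is arbitrary):

```python
import numpy as np
from sklearn.datasets import load_iris

# Keep setosa vs. versicolor, relabelled as -1 / +1
X, y = load_iris(return_X_y=True)
mask = y < 2
X, y = X[mask], np.where(y[mask] == 0, -1, 1)

w = np.zeros(X.shape[1])
b = 0.0
for epoch in range(10):
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:   # misclassified sample: update
            w += yi * xi
            b += yi
print("weights:", w, "bias:", b)
```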
[BHP1] - Regression with a Dense Network (DNN)
A simple regression with a Dense Neural Network (DNN) - BHPD dataset
[BHP2] - Regression with a Dense Network (DNN) - Advanced code
A more advanced example of DNN code - BHPD dataset
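The skeleton of such a model in Keras might look like the following sketch; the layer sizes are assumptions, only the 13 input features are a property of the BHPD data:

```python
from tensorflow import keras

# A small Dense network for regression (layer sizes are illustrative)
model = keras.Sequential([
    keras.layers.Input(shape=(13,)),           # 13 features in the BHPD data
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1)                      # single regression output
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.summary()
```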
[MNIST1] - Simple classification with DNN
Example of classification with a fully connected neural network
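A minimal sketch of this kind of classifier (hyperparameters are assumptions, not the notebook's exact choices):

```python
from tensorflow import keras

# Fully connected classifier for MNIST
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax")    # 10 digit classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```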
[GTS1] - CNN with GTSRB dataset - Data analysis and preparation
Episode 1: Data analysis and creation of a usable dataset
[GTS2] - CNN with GTSRB dataset - First convolutions
Episode 2: First convolutions and first results (a minimal CNN sketch follows this series)
[GTS3] - CNN with GTSRB dataset - Monitoring
Episode 3: Monitoring and analysing training, managing checkpoints
[GTS4] - CNN with GTSRB dataset - Data augmentation
Episode 4: Improving the results with data augmentation
[GTS5] - CNN with GTSRB dataset - Full convolutions
Episode 5: A lot of models, a lot of datasets, and a lot of results
[GTS6] - CNN with GTSRB dataset - Full convolutions as a batch
Episode 6: Running the full convolutions notebook as a batch job
[GTS7] - Full convolutions Report
Episode 7: Displaying the reports of the different jobs
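As a taste of episodes 2-4, a hedged sketch of a small CNN plus a data-augmentation generator; the 24x24 RGB input size, layer sizes, and augmentation parameters are assumptions for this sketch (GTSRB itself has 43 classes):

```python
from tensorflow import keras

# A small CNN in the spirit of episodes 2-3 (shapes/sizes are assumptions)
model = keras.Sequential([
    keras.layers.Input(shape=(24, 24, 3)),        # assumed resized RGB images
    keras.layers.Conv2D(32, (3, 3), activation="relu"),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Conv2D(64, (3, 3), activation="relu"),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(43, activation="softmax")  # GTSRB has 43 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Data augmentation in the spirit of episode 4 (parameters are arbitrary)
datagen = keras.preprocessing.image.ImageDataGenerator(
    rotation_range=10, zoom_range=0.1,
    width_shift_range=0.1, height_shift_range=0.1)
# model.fit(datagen.flow(x_train, y_train, batch_size=64), epochs=10)
```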
[TSB1] - TensorBoard with/from Jupyter
Four ways to use TensorBoard from the Jupyter environment
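One of those ways, sketched: log training with the Keras TensorBoard callback, then display the dashboard inline with the notebook magics (the log directory is an assumption):

```python
from tensorflow import keras

# Write training logs that TensorBoard can read
tensorboard_cb = keras.callbacks.TensorBoard(log_dir="./run/logs")
# model.fit(x_train, y_train, epochs=5, callbacks=[tensorboard_cb])

# Then, in a notebook cell:
# %load_ext tensorboard
# %tensorboard --logdir ./run/logs
```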
[IMDB1] - Text embedding with IMDB
A very classical example of word embedding for text classification (sentiment analysis)
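A sketch of such a model: an Embedding layer followed by pooling and a small classifier head (vocabulary size, sequence length, and layer sizes are assumptions):

```python
from tensorflow import keras

# Embedding-based sentiment classifier (sizes are illustrative)
vocab_size, seq_len = 10000, 256
model = keras.Sequential([
    keras.layers.Input(shape=(seq_len,), dtype="int32"),  # padded word indices
    keras.layers.Embedding(vocab_size, 16),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid")           # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```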
[IMDB2] - Text embedding with IMDB - Reloaded
Example of reusing a previously saved model
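Reloading boils down to a single call; a sketch, where the model path is an assumption for the example:

```python
from tensorflow import keras

# Reload a previously saved model (the path is an assumption)
model = keras.models.load_model("run/models/best_model.h5")
model.summary()
```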
[IMDB3] - Text embedding/LSTM model with IMDB
Still the same problem, but with a network combining embedding and LSTM
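A sketch of the combined architecture, with the pooling stage replaced by an LSTM layer (sizes are assumptions):

```python
from tensorflow import keras

# Embedding followed by an LSTM for sentiment classification
model = keras.Sequential([
    keras.layers.Input(shape=(None,), dtype="int32"),  # variable-length reviews
    keras.layers.Embedding(10000, 32),
    keras.layers.LSTM(32),                             # sequence -> single vector
    keras.layers.Dense(1, activation="sigmoid")
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```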
[VAE1] - Variational AutoEncoder (VAE) with MNIST
First generative network experience with the MNIST dataset
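The heart of any VAE is the reparameterization trick, which makes the sampling step differentiable; one common way to express it as a Keras layer, sketched here:

```python
import tensorflow as tf
from tensorflow import keras

# Sample z = mu + sigma * eps, so gradients flow through mu and log(var)
class Sampling(keras.layers.Layer):
    def call(self, inputs):
        z_mean, z_log_var = inputs
        eps = tf.random.normal(shape=tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * eps
```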
[VAE2] - Variational AutoEncoder (VAE) with MNIST - Analysis
Use of the previously trained model, analysis of the results
[VAE3] - About the CelebA dataset
A new VAE experiment, with a larger and more fun dataset
[VAE4] - Preparation of the CelebA dataset
Preparation of a clustered, batch-ready dataset
[VAE5] - Checking the clustered CelebA dataset
Verification of prepared data from CelebA dataset
[VAE6] - Variational AutoEncoder (VAE) with CelebA (small)
VAE with a more fun and realistic dataset - small resolution and batchable
[VAE7] - Variational AutoEncoder (VAE) with CelebA (medium)
VAE with a more fun and realistic dataset - medium resolution and batchable
[VAE12] - Variational AutoEncoder (VAE) with CelebA - Analysis
Use of the previously trained model with CelebA, analysis of the results
[BASH1] - OAR batch script
Bash script for OAR batch submission of a notebook
[BASH2] - SLURM batch script
Bash script for SLURM batch submission of a notebook
[ACTF1] - Activation functions
Some activation functions, with their derivatives.
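A few of them, sketched in NumPy together with their derivatives:

```python
import numpy as np

# Classic activations and their derivatives, as studied in ACTF1
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1 - s)            # sigma'(x) = sigma(x) * (1 - sigma(x))

def relu(x):
    return np.maximum(0, x)

def d_relu(x):
    return (x > 0).astype(float)  # 0 for x < 0, 1 for x > 0

def d_tanh(x):
    return 1 - np.tanh(x) ** 2    # tanh'(x) = 1 - tanh^2(x)

x = np.linspace(-5, 5, 5)
print(sigmoid(x), d_sigmoid(x), d_relu(x), d_tanh(x))
```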
[NP1] - A short introduction to Numpy
NumPy is an essential tool of the scientific Python ecosystem.
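A taste of the vectorized style the notebook introduces, where whole-array operations replace Python loops:

```python
import numpy as np

a = np.arange(12).reshape(3, 4)   # a 3x4 matrix of 0..11
print(a.mean(axis=0))             # column means
print(a[a > 5])                   # boolean masking
print(a @ a.T)                    # matrix product
```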
Installation
A procedure for configuring and starting Jupyter is available in the Wiki.
Licence
[en] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
[fr] Attribution - Pas d'Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See License.
See Disclaimer.