
About

This repository contains all the documents and links for the Fidle training.
Fidle (for Formation Introduction au Deep Learning) is a 2-day training session
co-organized by the Formation Permanente CNRS and the SARI and DEVLOG networks.

The objectives of this training are:

  • Understand the basics of Deep Learning neural networks
  • Develop a first hands-on experience through simple and representative examples
  • Understand the TensorFlow/Keras and JupyterLab technologies
  • Become familiar with Tier-2 or Tier-1 academic computing environments with powerful GPUs

For more information, you can contact us at:
Current version: 1.2b1 DEV

Course materials


Course slides

The course in PDF format (12 MB)

Notebooks

Get a zip or clone this repository (10 MB)

Datasets

All the needed datasets (1.2 GB)

Have a look at how to get and install these notebooks and datasets.

Jupyter notebooks

Linear and logistic regression
LINR1 Linear regression with direct resolution
Direct determination of linear regression
GRAD1 Linear regression with gradient descent
An example of gradient descent in the simple case of a linear regression.
POLR1 Complexity Syndrome
An illustration of the complexity problem, using polynomial regression
LOGR1 Logistic regression, with sklearn
Logistic regression using scikit-learn
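
The first two approaches above can be sketched in a few lines. Below is a minimal, illustrative example (not the notebook code) of a linear regression solved directly via the normal equations and then by gradient descent, on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))              # 100 samples, 1 feature
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 100)    # y = 3x + 2 + noise

# Direct resolution: solve the normal equations (X^T X) theta = X^T y,
# with a column of ones for the bias term
Xb = np.c_[np.ones(len(X)), X]
theta_direct = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

# Gradient descent on the mean squared error
theta, lr = np.zeros(2), 0.01
for _ in range(5000):
    grad = 2 / len(y) * Xb.T @ (Xb @ theta - y)    # gradient of the MSE
    theta -= lr * grad

print(theta_direct, theta)    # both should be close to [2, 3]
```
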
Perceptron Model 1957
PER57 Perceptron Model 1957
A simple perceptron, with the IRIS dataset.
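
As a quick illustration of the perceptron idea, here is a minimal sketch using scikit-learn's Perceptron on the IRIS dataset (the notebook itself may proceed differently):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

# IRIS: 150 flowers, 4 features, 3 species
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = Perceptron(max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```
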
Basic regression using DNN
BHPD1 Regression with a Dense Network (DNN)
A simple regression with a Dense Neural Network (DNN) - BHPD dataset
BHPD2 Regression with a Dense Network (DNN) - Advanced code
A more advanced example of DNN code - BHPD dataset
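
A minimal sketch of such a regression in Keras, using the Boston Housing data that ships with Keras (the hyperparameters here are illustrative, not those of the notebooks):

```python
from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data()

# Normalize the 13 features using training-set statistics
mean, std = x_train.mean(axis=0), x_train.std(axis=0)
x_train, x_test = (x_train - mean) / std, (x_test - mean) / std

model = keras.Sequential([
    keras.layers.Input(shape=(13,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),                     # one continuous output: the price
])
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
model.fit(x_train, y_train, epochs=50, batch_size=16, verbose=0)
print(model.evaluate(x_test, y_test, verbose=0))   # [mse, mae]
```
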
Basic classification using a DNN
MNIST1 Simple classification with DNN
Example of classification with a fully connected neural network
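
For reference, a minimal fully connected MNIST classifier in Keras might look like this (a sketch only; the notebook's architecture may differ):

```python
from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),     # 28x28 image -> 784 inputs
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),   # 10 digit classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=64)
print(model.evaluate(x_test, y_test))
```
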
Images classification with Convolutional Neural Networks (CNN)
GTSRB1 CNN with GTSRB dataset - Data analysis and preparation
Episode 1: Data analysis and creation of a usable dataset
GTSRB2 CNN with GTSRB dataset - First convolutions
Episode 2: First convolutions and first results
GTSRB3 CNN with GTSRB dataset - Monitoring
Episode 3: Monitoring and analysing training, managing checkpoints
GTSRB4 CNN with GTSRB dataset - Data augmentation
Episode 4: Improving the results with data augmentation
GTSRB5 CNN with GTSRB dataset - Full convolutions
Episode 5: A lot of models, a lot of datasets and a lot of results
GTSRB6 Full convolutions as a batch
Episode 6: Run the full convolutions notebook as a batch
GTSRB7 CNN with GTSRB dataset - Show reports
Episode 7: Displaying the jobs report
GTSRB10 OAR batch submission
Bash script for OAR batch submission of GTSRB notebook
GTSRB11 SLURM batch script
Bash script for SLURM batch submission of GTSRB notebooks
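
To give the flavor of these episodes, here is a minimal CNN sketch in Keras. It assumes, for illustration only, that the GTSRB images have already been prepared as 32x32 RGB arrays x_train, y_train (the notebooks build several dataset variants), with the 43 traffic-sign classes of GTSRB:

```python
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Input(shape=(32, 32, 3)),
    keras.layers.Conv2D(32, (3, 3), activation="relu"),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Conv2D(64, (3, 3), activation="relu"),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(43, activation="softmax"),   # 43 sign classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Monitoring in the spirit of episode 3: keep the best model seen so far
ckpt = keras.callbacks.ModelCheckpoint("best_model.h5", save_best_only=True)
# model.fit(x_train, y_train, validation_split=0.1, epochs=10, callbacks=[ckpt])
```
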
Sentiment analysis with word embedding
IMDB1 Text embedding with IMDB
A very classical example of word embedding for text classification (sentiment analysis)
IMDB2 Text embedding with IMDB - Reloaded
Example of reusing a previously saved model
IMDB3 Text embedding/LSTM model with IMDB
Still the same problem, but with a network combining embedding and LSTM
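
A minimal embedding + LSTM sketch on the IMDB dataset shipped with Keras (the vocabulary and layer sizes are illustrative):

```python
from tensorflow import keras

vocab_size, maxlen = 10000, 256
(x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=vocab_size)

# Pad/truncate each review to a fixed length
x_train = keras.preprocessing.sequence.pad_sequences(x_train, maxlen=maxlen)
x_test = keras.preprocessing.sequence.pad_sequences(x_test, maxlen=maxlen)

model = keras.Sequential([
    keras.layers.Embedding(vocab_size, 32),        # word index -> 32-d vector
    keras.layers.LSTM(32),
    keras.layers.Dense(1, activation="sigmoid"),   # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=128, validation_split=0.2)
```
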
Time series with Recurrent Neural Network (RNN)
SYNOP1 Time series with RNN - Preparation of data
Episode 1: Data analysis and creation of a usable dataset
SYNOP2 Time series with RNN - Try a prediction
Episode 2: Training session and first predictions
SYNOP3 Time series with RNN - 12h predictions
Episode 3: Attempt to predict in the longer term
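
The general pattern of these episodes, a window of past values in and the next value out, can be sketched with a synthetic series standing in for the SYNOP weather data:

```python
import numpy as np
from tensorflow import keras

series = np.sin(np.linspace(0, 100, 2000))          # stand-in time series
window = 20

# Sliding windows: predict the value that follows each 20-step window
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]                               # (samples, steps, features)

model = keras.Sequential([
    keras.layers.Input(shape=(window, 1)),
    keras.layers.LSTM(32),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```
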
Unsupervised learning with an autoencoder neural network (AE)
AE1 AutoEncoder (AE) with MNIST
Episode 1: Model construction and training
AE2 AutoEncoder (AE) with MNIST - Analysis
Episode 2: Exploring our denoiser
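
A minimal dense autoencoder on MNIST, as a sketch of the idea (the notebooks train a denoiser, so noisy versions of x_train could be fed as inputs instead):

```python
from tensorflow import keras

(x_train, _), (x_test, _) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784) / 255.0
x_test = x_test.reshape(-1, 784) / 255.0

model = keras.Sequential([
    keras.layers.Input(shape=(784,)),
    keras.layers.Dense(64, activation="relu"),      # encoder -> 64-d latent code
    keras.layers.Dense(784, activation="sigmoid"),  # decoder -> reconstruction
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# The target is the input itself
model.fit(x_train, x_train, epochs=10, batch_size=128,
          validation_data=(x_test, x_test))
```
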
Generative network with Variational Autoencoder (VAE)
VAE1 Variational AutoEncoder (VAE) with MNIST
Building a simple model with the MNIST dataset
VAE2 Variational AutoEncoder (VAE) with MNIST - Analysis
Visualization and analysis of latent space
VAE3 About the CelebA dataset
Presentation of the CelebA dataset and problems related to its size
VAE6 Preparation of the CelebA dataset
Preparation of a clustered dataset, batchable
VAE7 Checking the clustered CelebA dataset
Check the clustered dataset
VAE8 Variational AutoEncoder (VAE) with CelebA (small)
Variational AutoEncoder (VAE) with CelebA (small res. 128x128)
VAE9 Variational AutoEncoder (VAE) with CelebA - Analysis
Exploring latent space of our trained models
VAE10 SLURM batch script
Bash script for SLURM batch submission of VAE notebooks
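
The core VAE mechanism, the reparameterization trick plus a KL penalty on the latent distribution, can be sketched with a small dense model on 784-pixel images (the CelebA notebooks use convolutional encoders/decoders, and in practice the two loss terms are weighted against each other):

```python
import tensorflow as tf
from tensorflow import keras

class Sampling(keras.layers.Layer):
    # Draw z = mean + sigma * eps (reparameterization trick) and register
    # the KL divergence to N(0, I) as an extra model loss.
    def call(self, inputs):
        z_mean, z_log_var = inputs
        kl = -0.5 * tf.reduce_mean(tf.reduce_sum(
            1 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=-1))
        self.add_loss(kl)
        eps = tf.random.normal(tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * eps

latent_dim = 2
inputs = keras.Input(shape=(784,))
h = keras.layers.Dense(256, activation="relu")(inputs)
z = Sampling()([keras.layers.Dense(latent_dim)(h),     # z_mean
                keras.layers.Dense(latent_dim)(h)])    # z_log_var
h = keras.layers.Dense(256, activation="relu")(z)
outputs = keras.layers.Dense(784, activation="sigmoid")(h)

vae = keras.Model(inputs, outputs)
vae.compile(optimizer="adam", loss="binary_crossentropy")
# Train to reconstruct the input: vae.fit(x_train, x_train, epochs=10)
```
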
Miscellaneous
ACTF1 Activation functions
Some activation functions, with their derivatives.
NP1 A short introduction to Numpy
NumPy is an essential tool for scientific Python.
TSB1 Tensorboard with/from Jupyter
Four ways to use TensorBoard from the Jupyter environment
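
In the spirit of ACTF1, here are two classic activation functions and their derivatives in NumPy:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1 - s)               # derivative expressed via sigmoid itself

def relu(x):
    return np.maximum(0.0, x)

def relu_prime(x):
    return (x > 0).astype(float)     # 0 for x <= 0, 1 for x > 0

x = np.linspace(-5, 5, 11)
print(sigmoid(x), sigmoid_prime(x))
print(relu(x), relu_prime(x))
```
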

Installation

A procedure for configuring and starting Jupyter is available in the Wiki.

Licence


[en] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
[fr] Attribution - Pas d'Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See License.
See Disclaimer.