Authored by Jean-Luc Parouty

About

This repository contains all the documents and links of the Fidle training.
Fidle (for "Formation Introduction au Deep Learning") is a 2-day training session
co-organized by the Formation Permanente CNRS and the SARI and DEVLOG networks.

The objectives of this training are:

  • Understand the basics of Deep Learning neural networks
  • Develop a first experience through simple and representative examples
  • Understand TensorFlow/Keras and Jupyter Lab technologies
  • Apprehend Tier-2 or Tier-1 academic computing environments with powerful GPUs

For more information, you can contact us at:
Current version: 0.6.1 DEV

Course materials


Course slides

The course in PDF format
(12 MB)

Notebooks

Get a zip archive or clone this repository
(10 MB)

Datasets

All the required datasets
(1.2 GB)

Have a look at How to get and install these notebooks and datasets.

Jupyter notebooks

LINR1 Linear regression with direct resolution
Direct determination of linear regression
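As a taste of what LINR1 covers, here is a minimal NumPy sketch of direct (closed-form) linear regression; the toy data is made up for illustration and is not taken from the notebook:

```python
import numpy as np

# Illustrative data for y = 2x + 1 (hypothetical, not from the notebook)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Prepend a bias column, then solve the least-squares problem directly
Xb = np.hstack([np.ones((X.shape[0], 1)), X])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

print(w)  # ≈ [1. 2.] : intercept 1, slope 2
```

`np.linalg.lstsq` solves the normal equations numerically, which is more stable than inverting Xᵀ X by hand.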
GRAD1 Linear regression with gradient descent
An example of gradient descent in the simple case of a linear regression.
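The idea behind GRAD1 can be sketched in a few lines of NumPy; the data, learning rate, and iteration count below are illustrative choices, not the notebook's:

```python
import numpy as np

# Toy data for y = 2x + 1 (hypothetical)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

a, b = 0.0, 0.0          # slope and intercept, initialised at zero
lr = 0.05                # learning rate
for _ in range(2000):
    y_pred = a * x + b
    # Gradients of the mean squared error with respect to a and b
    grad_a = 2.0 * np.mean((y_pred - y) * x)
    grad_b = 2.0 * np.mean(y_pred - y)
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 2), round(b, 2))  # ≈ 2.0 and 1.0
```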
POLR1 Complexity Syndrome
Illustration of the problem of complexity with the polynomial regression
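The complexity problem POLR1 illustrates can be reproduced with `np.polyfit`: training error keeps dropping as the polynomial degree grows, even once the model is fitting noise. The data and degrees below are illustrative, not the notebook's:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)  # noisy sine (made up)

# Fit polynomials of increasing degree and compare training error
errors = {}
for degree in (1, 3, 9):
    coeffs = np.polyfit(x, y, degree)
    errors[degree] = np.mean((np.polyval(coeffs, x) - y) ** 2)
    print(degree, round(errors[degree], 4))
```

Lower training error at degree 9 does not mean a better model: held-out error would tell a different story.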
LOGR1 Logistic regression, with sklearn
Logistic regression using scikit-learn
PER57 Perceptron Model 1957
A simple perceptron, with the IRIS dataset.
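The 1957-style perceptron of PER57 boils down to Rosenblatt's update rule. The sketch below uses a made-up, linearly separable 2-D toy set standing in for two Iris classes:

```python
import numpy as np

# Hypothetical linearly separable data: two 2-D clusters
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],   # class 0
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

w = np.zeros(2)
b = 0.0
for _ in range(20):                      # a few passes over the data
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        # Rosenblatt update: nudge weights only when the prediction is wrong
        w += (yi - pred) * xi
        b += (yi - pred)

preds = (X @ w + b > 0).astype(int)
print(preds)  # matches y once the classes are separated
```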
BHP1 Regression with a Dense Network (DNN)
A Simple regression with a Dense Neural Network (DNN) - BHPD dataset
BHP2 Regression with a Dense Network (DNN) - Advanced code
More advanced example of DNN network code - BHPD dataset
MNIST1 Simple classification with DNN
Example of classification with a fully connected neural network
GTS1 CNN with GTSRB dataset - Data analysis and preparation
Episode 1 : Data analysis and creation of a usable dataset
GTS2 CNN with GTSRB dataset - First convolutions
Episode 2 : First convolutions and first results
GTS3 CNN with GTSRB dataset - Monitoring
Episode 3 : Monitoring and analysing training, managing checkpoints
GTS4 CNN with GTSRB dataset - Data augmentation
Episode 4 : Improving the results with data augmentation
GTS5 CNN with GTSRB dataset - Full convolutions
Episode 5 : A lot of models, a lot of datasets and a lot of results.
GTS6 Full convolutions as a batch
Episode 6 : Run the full convolutions notebook as a batch job
GTS7 CNN with GTSRB dataset - Show reports
Episode 7 : Displaying a jobs report
TSB1 Tensorboard with/from Jupyter
Four ways to use TensorBoard from the Jupyter environment
GTS8 OAR batch submission
Bash script for OAR batch submission of GTSRB notebook
GTS9 Slurm batch submission
Bash script for Slurm batch submission of the GTSRB notebook
IMDB1 Text embedding with IMDB
A very classical example of word embedding for text classification (sentiment analysis)
IMDB2 Text embedding with IMDB - Reloaded
Example of reusing a previously saved model
IMDB3 Text embedding/LSTM model with IMDB
Still the same problem, but with a network combining embedding and LSTM
SYNOP1 Time series with RNN - Preparation of data
Episode 1 : Data analysis and creation of a usable dataset
SYNOP2 Time series with RNN - Try a prediction
Episode 2 : Training session and first predictions
SYNOP3 Time series with RNN - 12h predictions
Episode 3: Attempt to predict in the longer term
VAE1 Variational AutoEncoder (VAE) with MNIST
Episode 1 : Model construction and Training
VAE2 Variational AutoEncoder (VAE) with MNIST - Analysis
Episode 2 : Exploring our latent space
VAE3 About the CelebA dataset
Episode 3 : About the CelebA dataset, a more fun dataset ;-)
VAE4 Preparation of the CelebA dataset
Episode 4 : Preparation of a clustered dataset, batchable
VAE5 Checking the clustered CelebA dataset
Episode 5 : Checking the clustered dataset
VAE6 Variational AutoEncoder (VAE) with CelebA (small)
Episode 6 : Variational AutoEncoder (VAE) with CelebA (small res.)
VAE7 Variational AutoEncoder (VAE) with CelebA (medium)
Episode 7 : Variational AutoEncoder (VAE) with CelebA (medium res.)
VAE8 Variational AutoEncoder (VAE) with CelebA - Analysis
Episode 8 : Exploring latent space of our trained models
BASH1 OAR batch script
Bash script for OAR batch submission of VAE notebook
SH2 SLURM batch script
Bash script for SLURM batch submission of VAE notebooks
ACTF1 Activation functions
Some activation functions, with their derivatives.
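Two of the activation functions ACTF1 looks at, with their derivatives, can be written directly in NumPy (the sample inputs are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)        # the derivative reuses sigmoid itself

def relu(x):
    return np.maximum(0.0, x)

def relu_prime(x):
    return (x > 0).astype(float)  # 0 for negative inputs, 1 for positive

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x), sigmoid_prime(x))
print(relu(x), relu_prime(x))
```

Note that sigmoid'(0) = 0.25 is its maximum, which is one reason deep sigmoid networks suffer from vanishing gradients.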
NP1 A short introduction to Numpy
NumPy is an essential tool for scientific Python.
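A few of the NumPy basics NP1 introduces, shown on a small made-up array:

```python
import numpy as np

a = np.arange(6).reshape(2, 3)       # a 2x3 array: [[0 1 2], [3 4 5]]
print(a.shape, a.dtype)

# Vectorised arithmetic: no Python loop needed
b = a * 2 + 1
print(b)

# Broadcasting: a row vector is added to every row of the matrix
row = np.array([10, 20, 30])
print(a + row)

# Reductions along an axis
print(a.sum(), a.sum(axis=0), a.mean(axis=1))
```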

Installation

A procedure for configuring and starting Jupyter is available in the Wiki.

Licence


[en] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
[Fr] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See License.
See Disclaimer.