{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img width=\"800px\" src=\"fidle/img/00-Fidle-header-01.svg\"></img>\n",
"\n",
"# Available notebooks"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<!-- INDEX_BEGIN -->\n",
"[[NP1] - A short introduction to Numpy](Prerequisites/Numpy.ipynb) \n",
" Numpy is an essential tool for the Scientific Python. \n",
"[[LINR1] - Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb) \n",
" Direct determination of linear regression \n",
"[[GRAD1] - Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb) \n",
" An example of gradient descent in the simple case of a linear regression. \n",
"[[POLR1] - Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb) \n",
" Illustration of the problem of complexity with the polynomial regression \n",
"[[LOGR1] - Logistic regression, in pure Tensorflow](LinearReg/04-Logistic-Regression.ipynb) \n",
" Logistic Regression with Mini-Batch Gradient Descent using pure TensorFlow. \n",
"[[MNIST1] - Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb) \n",
" Example of classification with a fully connected neural network \n",
"[[BHP1] - Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb) \n",
" A Simple regression with a Dense Neural Network (DNN) - BHPD dataset \n",
"[[BHP2] - Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb) \n",
" More advanced example of DNN network code - BHPD dataset \n",
"[[GTS1] - CNN with GTSRB dataset - Data analysis and preparation](GTSRB/01-Preparation-of-data.ipynb) \n",
" Episode 1: Data analysis and creation of a usable dataset \n",
"[[GTS2] - CNN with GTSRB dataset - First convolutions](GTSRB/02-First-convolutions.ipynb) \n",
" Episode 2 : First convolutions and first results \n",
"[[GTS3] - CNN with GTSRB dataset - Monitoring ](GTSRB/03-Tracking-and-visualizing.ipynb) \n",
" Episode 3: Monitoring and analysing training, managing checkpoints \n",
"[[GTS4] - CNN with GTSRB dataset - Data augmentation ](GTSRB/04-Data-augmentation.ipynb) \n",
" Episode 4: Improving the results with data augmentation \n",
"[[GTS5] - CNN with GTSRB dataset - Full convolutions ](GTSRB/05-Full-convolutions.ipynb) \n",
" Episode 5: A lot of models, a lot of datasets and a lot of results. \n",
"[[GTS6] - CNN with GTSRB dataset - Full convolutions as a batch](GTSRB/06-Full-convolutions-batch.ipynb) \n",
" Episode 6 : Run Full convolution notebook as a batch \n",
"[[GTS7] - Full convolutions Report](GTSRB/07-Full-convolutions-reports.ipynb) \n",
" Displaying the reports of the different jobs \n",
"[[TSB1] - Tensorboard with/from Jupyter ](GTSRB/99-Scripts-Tensorboard.ipynb) \n",
" 4 ways to use Tensorboard from the Jupyter environment \n",
"[[IMDB1] - Text embedding with IMDB](IMDB/01-Embedding-Keras.ipynb) \n",
" A very classical example of word embedding for text classification (sentiment analysis) \n",
"[[IMDB2] - Text embedding with IMDB - Reloaded](IMDB/02-Prediction.ipynb) \n",
" Example of reusing a previously saved model \n",
"[[IMDB3] - Text embedding/LSTM model with IMDB](IMDB/03-LSTM-Keras.ipynb) \n",
" Still the same problem, but with a network combining embedding and LSTM \n",
"[[VAE1] - Variational AutoEncoder (VAE) with MNIST](VAE/01-VAE-with-MNIST.ipynb) \n",
" First generative network experience with the MNIST dataset \n",
"[[VAE2] - Variational AutoEncoder (VAE) with MNIST - Analysis](VAE/02-VAE-with-MNIST-post.ipynb) \n",
" Use of the previously trained model, analysis of the results \n",
"[[VAE3] - About the CelebA dataset](VAE/03-Prepare-CelebA.ipynb) \n",
" New VAE experience, but with a larger and more fun dataset \n",
"[[VAE4] - Preparation of the CelebA dataset](VAE/04-Prepare-CelebA-batch.ipynb) \n",
" Preparation of a clustered dataset, batchable \n",
"[[VAE5] - Checking the clustered CelebA dataset](VAE/05-Check-CelebA.ipynb) \n",
" Verification of prepared data from CelebA dataset \n",
"[[VAE6] - Variational AutoEncoder (VAE) with CelebA (small)](VAE/06-VAE-with-CelebA-s.ipynb) \n",
" VAE with a more fun and realistic dataset - small resolution and batchable \n",
"[[VAE7] - Variational AutoEncoder (VAE) with CelebA (medium)](VAE/07-VAE-with-CelebA-m.ipynb) \n",
" VAE with a more fun and realistic dataset - medium resolution and batchable \n",
"[[VAE12] - Variational AutoEncoder (VAE) with CelebA - Analysis](VAE/12-VAE-withCelebA-post.ipynb) \n",
" Use of the previously trained model with CelebA, analysis of the results \n",
"[[BASH1] - OAR batch script](VAE/batch-oar.sh) \n",
" Bash script for OAR batch submission of a notebook \n",
"[[BASH2] - SLURM batch script](VAE/batch-slurm.sh) \n",
" Bash script for SLURM batch submission of a notebook \n",
"<!-- INDEX_END -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"---\n",
"<img width=\"80px\" src=\"fidle/img/00-Fidle-logo-01.svg\"></img>"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.6"
}
},
"nbformat": 4,
"nbformat_minor": 4
}