Compare revisions

Commits on Source (88)
Showing with 7152 additions and 616 deletions
@@ -2,5 +2,10 @@
*/.ipynb_checkpoints/*
__pycache__
*/__pycache__/*
/run/**
*/data/*
run/
GTSRB/data
IMDB/data
MNIST/data
VAE/data
BHPD/data/*
!BHPD/data/BostonHousing.csv
%% Cell type:markdown id: tags:
![header1](../fidle/img/00-Fidle-header-01.png)
Linear regression with direct resolution
========================================
An example of linear regression, solved directly with the closed-form normal equation.
## Objectives :
- Just one, the illustration of a direct resolution :-)
Equation : $ Y = X.\theta + N$
where $N$ is a noise vector
and $\theta = (a,b)$ the parameter vector; since the first column of $X$ is all ones, $X.\theta$ gives $y = a + b.x$
%% Cell type:markdown id: tags:
## Step 1 - Import and init
%% Cell type:code id: tags:
``` python
import numpy as np
import math
import matplotlib
import matplotlib.pyplot as plt
import sys
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
```
%% Output
FIDLE 2020 - Practical Work Module
Version : 0.2.9
Run time : Tuesday 18 February 2020, 16:02:39
TensorFlow version : 2.0.0
Keras version : 2.2.4-tf
%% Cell type:markdown id: tags:
## Step 2 - Retrieve a set of points
%% Cell type:code id: tags:
``` python
# ---- Parameters
nb = 100 # Number of points
xmin = 0 # Distribution / x
xmax = 10
a = 4 # Distribution / y
b = 2 # y = a + b.x (+ noise)
noise = 7 # noise amplitude
theta = np.array([[a],[b]])
# ---- Vector X : nb rows of (1,x)
# The first column is set to 1 so that X.theta <=> 1.a + x.b
Xc1 = np.ones((nb,1))
Xc2 = np.random.uniform(xmin,xmax,(nb,1))
X = np.c_[ Xc1, Xc2 ]
# ---- Noise
# N = np.random.uniform(-noise,noise,(nb,1))
N = noise * np.random.normal(0,1,(nb,1))
# ---- Vector Y
Y = (X @ theta) + N
# print("X:\n",X,"\nY:\n ",Y)
```
%% Cell type:markdown id: tags:
### Show it
%% Cell type:code id: tags:
``` python
width = 12
height = 6
fig, ax = plt.subplots()
fig.set_size_inches(width,height)
ax.plot(X[:,1], Y, ".")
ax.tick_params(axis='both', which='both', bottom=False, left=False, labelbottom=False, labelleft=False)
ax.set_xlabel('x axis')
ax.set_ylabel('y axis')
plt.show()
```
%% Output
%% Cell type:markdown id: tags:
## Step 3 - Direct calculation of the normal equation
We'll try to find an optimal value of $\theta$, minimizing a cost function.
The cost function classically used for linear regression is the **root mean square error**:
$RMSE(X,h_\theta)=\sqrt{\frac1n\sum_{i=1}^n\left[h_\theta(X^{(i)})-Y^{(i)}\right]^2}$
With the simplified variant : $MSE(X,h_\theta)=\frac1n\sum_{i=1}^n\left[h_\theta(X^{(i)})-Y^{(i)}\right]^2$
Setting the gradient of the MSE to zero gives the normal equation $X^T.X.\theta = X^T.Y$, whose solution is the optimal value : $ \hat{\theta} = (X^T.X)^{-1}.X^T.Y$
Derivation : https://eli.thegreenplace.net/2014/derivation-of-the-normal-equation-for-linear-regression
%% Cell type:code id: tags:
``` python
theta_hat = np.linalg.inv(X.T @ X) @ X.T @ Y
print("Theta :\n",theta,"\n\ntheta hat :\n",theta_hat)
```
%% Output
Theta :
[[4]
[2]]
theta hat :
[[6.81242007]
[1.56836316]]
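%% Cell type:markdown id: tags:
As a cross-check of the closed-form solution, the same estimate can be obtained with `np.linalg.lstsq`, which solves the least-squares problem without explicitly forming $(X^T.X)^{-1}$ and is numerically more robust. A minimal sketch, assuming `X`, `Y` and `theta_hat` from the cells above:
%% Cell type:code id: tags:
``` python
# Least-squares solve, mathematically equivalent to the normal equation
theta_hat_lstsq, residuals, rank, sv = np.linalg.lstsq(X, Y, rcond=None)
print("theta hat (lstsq) :\n", theta_hat_lstsq)
# Both estimates should agree to machine precision
print("max difference    :", np.abs(theta_hat - theta_hat_lstsq).max())
```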
%% Cell type:markdown id: tags:
### Show it
%% Cell type:code id: tags:
``` python
Xd = np.array([[1,xmin], [1,xmax]])
Yd = Xd @ theta_hat
fig, ax = plt.subplots()
fig.set_size_inches(width,height)
ax.plot(X[:,1], Y, ".")
ax.plot(Xd[:,1], Yd, "-")
ax.tick_params(axis='both', which='both', bottom=False, left=False, labelbottom=False, labelleft=False)
ax.set_xlabel('x axis')
ax.set_ylabel('y axis')
plt.show()
```
%% Output
%% Cell type:markdown id: tags:
---
![](../fidle/img/00-Fidle-logo-01_s.png)
%% Cell type:markdown id: tags:
![header1](../fidle/img/00-Fidle-header-01.png)
# Polynomial regression
An example of polynomial regression.
## Objectives :
- Just one, the illustration of a polynomial regression :-)
We are looking for a polynomial function to approximate the observed series :
$ y = a_n\cdot x^n + \dots + a_i\cdot x^i + \dots + a_1\cdot x + b $
## Step 1 - Import and init
%% Cell type:code id: tags:
``` python
import numpy as np
import math
import random
import matplotlib
import matplotlib.pyplot as plt
import sys
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
```
%% Output
FIDLE 2020 - Practical Work Module
Version : 0.2.9
Run time : Tuesday 18 February 2020, 17:23:05
TensorFlow version : 2.0.0
Keras version : 2.2.4-tf
%% Cell type:markdown id: tags:
## Step 2 - Preparation of learning data :
%% Cell type:code id: tags:
``` python
# ---- Parameters
n = 100
xob_min = -5
xob_max = 5
deg = 7
a_min = -2
a_max = 2
noise = 2000
# ---- Train data
# X,Y : data
# X_norm,Y_norm : normalized data
X = np.random.uniform(xob_min,xob_max,(n,1))
# N = np.random.uniform(-noise,noise,(n,1))
N = noise * np.random.normal(0,1,(n,1))
a = np.random.uniform(a_min,a_max, (deg,))
fy = np.poly1d( a )
Y = fy(X) + N
# ---- Data normalization
#
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)
Y_norm = (Y - Y.mean(axis=0)) / Y.std(axis=0)
# ---- Data visualization
width = 12
height = 6
nb_viz = min(2000,n)
def vector_infos(name,V):
    m=V.mean(axis=0).item()
    s=V.std(axis=0).item()
    print("{:8} : mean={:+12.4f} std={:+12.4f} min={:+12.4f} max={:+12.4f}".format(name,m,s,V.min(),V.max()))
print("Number of points : {} a={} deg={} noise={}".format(n,a,deg,noise))
ooo.display_md('#### Before normalization :')
print("\nRaw training data :")
print("({} points visible out of {})".format(nb_viz,n))
plt.figure(figsize=(width, height))
plt.plot(X[:nb_viz], Y[:nb_viz], '.')
plt.tick_params(axis='both', which='both', bottom=False, left=False, labelbottom=False, labelleft=False)
plt.xlabel('x axis')
plt.ylabel('y axis')
plt.show()
vector_infos('X',X)
vector_infos('Y',Y)
ooo.display_md('#### After normalization :')
print("\nNormalized training data :")
print("({} points visible out of {})".format(nb_viz,n))
plt.figure(figsize=(width, height))
plt.plot(X_norm[:nb_viz], Y_norm[:nb_viz], '.')
plt.tick_params(axis='both', which='both', bottom=False, left=False, labelbottom=False, labelleft=False)
plt.xlabel('x axis')
plt.ylabel('y axis')
plt.show()
vector_infos('X_norm',X_norm)
vector_infos('Y_norm',Y_norm)
```
%% Output
Number of points : 100 a=[-1.40023862 1.64009905 1.89987647 1.24972783 1.17765272 1.90935391
1.11259327] deg=7 noise=2000
#### Before normalization :
Raw training data :
(100 points visible out of 100)
X        : mean=     +0.2539 std=     +2.9283 min=     -4.9332 max=     +4.9177
Y        : mean=  -2914.8537 std=  +5532.3607 min= -23848.6013 max=  +5139.0627
#### After normalization :
Normalized training data :
(100 points visible out of 100)
X_norm   : mean=     +0.0000 std=     +1.0000 min=     -1.7714 max=     +1.5927
Y_norm   : mean=     -0.0000 std=     +1.0000 min=     -3.7839 max=     +1.4558
%% Cell type:markdown id: tags:
## Step 3 - Polynomial regression with NumPy
### 3.1 - Underfitting
%% Cell type:code id: tags:
``` python
def draw_reg(X_norm, Y_norm, fy_hat, size):
    plt.figure(figsize=size)
    plt.plot(X_norm, Y_norm, '.')
    x_hat = np.linspace(X_norm.min(), X_norm.max(), 100)
    plt.plot(x_hat, fy_hat(x_hat))
    plt.tick_params(axis='both', which='both', bottom=False, left=False, labelbottom=False, labelleft=False)
    plt.xlabel('x axis')
    plt.ylabel('y axis')
    plt.show()
```
%% Cell type:code id: tags:
``` python
reg_deg=1
a_hat = np.polyfit(X_norm.reshape(-1,), Y_norm.reshape(-1,), reg_deg)
fy_hat = np.poly1d( a_hat )
print("Nombre de degrés : {} a_hat={}".format(reg_deg, a_hat))
draw_reg(X_norm[:nb_viz],Y_norm[:nb_viz], x_hat,fy_hat, (width,height))
```
%% Output
Degree : 1 a_hat=[ 2.15635737e-01 -1.26046371e-16]
%% Cell type:markdown id: tags:
### 3.2 - Good fitting
%% Cell type:code id: tags:
``` python
reg_deg=5
a_hat = np.polyfit(X_norm.reshape(-1,), Y_norm.reshape(-1,), reg_deg)
fy_hat = np.poly1d( a_hat )
print("Nombre de degrés : {} a_hat={}".format(reg_deg, a_hat))
draw_reg(X_norm[:nb_viz],Y_norm[:nb_viz], x_hat,fy_hat, (width,height))
```
%% Output
Degree : 5 a_hat=[ 0.09676506 -0.49102546 -0.19674074 0.41539174 -0.00888844 0.51187173]
%% Cell type:markdown id: tags:
### 3.3 - Overfitting
%% Cell type:code id: tags:
``` python
reg_deg=24
a_hat = np.polyfit(X_norm.reshape(-1,), Y_norm.reshape(-1,), reg_deg)
fy_hat = np.poly1d( a_hat )
print("Nombre de degrés : {} a_hat={}".format(reg_deg, a_hat))
draw_reg(X_norm[:nb_viz],Y_norm[:nb_viz], x_hat,fy_hat, (width,height))
```
%% Output
Degree : 24 a_hat=[-3.39583761e-01 3.66524802e+00 1.35968152e+01 -5.33709389e+01
-1.54708597e+02 3.43661072e+02 8.76139324e+02 -1.29075459e+03
-2.93637308e+03 3.13537269e+03 6.25956959e+03 -5.14495063e+03
-8.72124478e+03 5.75688179e+03 7.93008239e+03 -4.30734159e+03
-4.57884398e+03 2.04404969e+03 1.58025869e+03 -5.55090657e+02
-2.89374022e+02 7.05282858e+01 2.12619645e+01 -2.67799255e+00
3.95718335e-01]
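%% Cell type:markdown id: tags:
To make the comparison between under-, good and over-fitting quantitative rather than purely visual, one can hold out part of the points and compare the training error with the validation error for each degree. A minimal sketch, assuming `n`, `X_norm` and `Y_norm` from Step 2 (the 80/20 split is an illustrative choice):
%% Cell type:code id: tags:
``` python
# Random 80/20 split of the normalized points
n_train    = int(0.8 * n)
idx        = np.random.permutation(n)
i_tr, i_va = idx[:n_train], idx[n_train:]
x_tr, y_tr = X_norm[i_tr].reshape(-1,), Y_norm[i_tr].reshape(-1,)
x_va, y_va = X_norm[i_va].reshape(-1,), Y_norm[i_va].reshape(-1,)
for d in (1, 5, 24):
    f      = np.poly1d( np.polyfit(x_tr, y_tr, d) )
    mse_tr = np.mean( (f(x_tr) - y_tr)**2 )   # error on the fitted points
    mse_va = np.mean( (f(x_va) - y_va)**2 )   # error on unseen points
    print("deg={:2d} train MSE={:8.4f} val MSE={:8.4f}".format(d, mse_tr, mse_va))
```
%% Cell type:markdown id: tags:
Too low a degree leaves both errors high; too high a degree keeps shrinking the training error while the validation error grows.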
%% Cell type:markdown id: tags:
---
![](../fidle/img/00-Fidle-logo-01_s.png)
%% Cell type:markdown id: tags:
![header1](../fidle/img/00-Fidle-header-01.png)
Deep Neural Network (DNN) - BHPD dataset
========================================
A very simple and classic example of **regression** :
## Objectives :
Predict **housing prices** from a set of house features.
The **[Boston Housing Dataset](https://www.cs.toronto.edu/~delve/data/boston/bostonDetail.html)** consists of prices of houses in various places in Boston.
Alongside the price, the dataset also provides information such as the crime rate, the proportion of non-retail business acres in the town,
the age of people who own the house and many other attributes...
What we're going to do:
- Retrieve data
- Prepare the data
- Build a model
- Train the model
- Evaluate the result
%% Cell type:markdown id: tags:
## Step 1 - Import and init
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import os,sys
from IPython.display import display, Markdown
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
os.makedirs('./run/models', mode=0o750, exist_ok=True)
```
%% Output
FIDLE 2020 - Practical Work Module
Version : 0.2.9
Run time : Tuesday 18 February 2020, 14:42:02
TensorFlow version : 2.0.0
Keras version : 2.2.4-tf
%% Cell type:markdown id: tags:
## Step 2 - Retrieve data
### 2.1 - Option 1 : From Keras
Boston housing is a famous historic dataset, so we can get it directly from [Keras datasets](https://www.tensorflow.org/api_docs/python/tf/keras/datasets)
%% Cell type:raw id: tags:
(x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data(test_split=0.2, seed=113)
%% Cell type:markdown id: tags:
### 2.2 - Option 2 : From a csv file
More fun !
%% Cell type:code id: tags:
``` python
data = pd.read_csv('./data/BostonHousing.csv', header=0)
display(data.head(5).style.format("{0:.2f}"))
print('Missing data : ',data.isna().sum().sum(), '  Shape is : ', data.shape)
```
%% Output
Missing data :  0   Shape is :  (506, 14)
%% Cell type:markdown id: tags:
## Step 3 - Preparing the data
### 3.1 - Split data
We will use 70% of the data for training and 30% for validation.
x will be the input data and y the expected output.
%% Cell type:code id: tags:
``` python
# ---- Split => train, test
#
data_train = data.sample(frac=0.7, axis=0)
data_test = data.drop(data_train.index)
# ---- Split => x,y (medv is price)
#
x_train = data_train.drop('medv', axis=1)
y_train = data_train['medv']
x_test = data_test.drop('medv', axis=1)
y_test = data_test['medv']
print('Original data shape was : ',data.shape)
print('x_train : ',x_train.shape, 'y_train : ',y_train.shape)
print('x_test : ',x_test.shape, 'y_test : ',y_test.shape)
```
%% Output
Original data shape was : (506, 14)
x_train : (354, 13) y_train : (354,)
x_test : (152, 13) y_test : (152,)
%% Cell type:markdown id: tags:
### 3.2 - Data normalization
**Note :**
- All input data must be normalized, train and test.
- To do this we will **subtract the mean** and **divide by the standard deviation**.
- But test data should not be used in any way, even for normalization.
- The mean and the standard deviation will therefore only be calculated with the train data.
%% Cell type:code id: tags:
``` python
display(x_train.describe().style.format("{0:.2f}").set_caption("Before normalization :"))
mean = x_train.mean()
std = x_train.std()
x_train = (x_train - mean) / std
x_test = (x_test - mean) / std
display(x_train.describe().style.format("{0:.2f}").set_caption("After normalization :"))
x_train, y_train = np.array(x_train), np.array(y_train)
x_test, y_test = np.array(x_test), np.array(y_test)
```
%% Output
%% Cell type:markdown id: tags:
## Step 4 - Build a model
More information about :
- [Optimizer](https://www.tensorflow.org/api_docs/python/tf/keras/optimizers)
- [Activation](https://www.tensorflow.org/api_docs/python/tf/keras/activations)
- [Loss](https://www.tensorflow.org/api_docs/python/tf/keras/losses)
- [Metrics](https://www.tensorflow.org/api_docs/python/tf/keras/metrics)
%% Cell type:code id: tags:
``` python
def get_model_v1(shape):
    model = keras.models.Sequential()
    model.add(keras.layers.Input(shape, name="InputLayer"))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n1'))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n2'))
    model.add(keras.layers.Dense(1, name='Output'))
    model.compile(optimizer = 'rmsprop',
                  loss      = 'mse',
                  metrics   = ['mae', 'mse'] )
    return model
```
%% Cell type:markdown id: tags:
## Step 5 - Train the model
### 5.1 - Get it
%% Cell type:code id: tags:
``` python
model=get_model_v1( (13,) )
model.summary()
keras.utils.plot_model( model, to_file='./run/model.png', show_shapes=True, show_layer_names=True, dpi=96)
```
%% Output
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
Dense_n1 (Dense) (None, 64) 896
_________________________________________________________________
Dense_n2 (Dense) (None, 64) 4160
_________________________________________________________________
Output (Dense) (None, 1) 65
=================================================================
Total params: 5,121
Trainable params: 5,121
Non-trainable params: 0
_________________________________________________________________
<IPython.core.display.Image object>
%% Cell type:markdown id: tags:
### 5.2 - Train it
%% Cell type:code id: tags:
``` python
history = model.fit(x_train,
                    y_train,
                    epochs          = 100,
                    batch_size      = 10,
                    verbose         = 1,
                    validation_data = (x_test, y_test))
```
%% Output
Train on 354 samples, validate on 152 samples
Epoch 1/100
354/354 [==============================] - 1s 2ms/sample - loss: 414.5603 - mae: 18.2577 - mse: 414.5602 - val_loss: 266.3728 - val_mae: 13.9913 - val_mse: 266.3728
Epoch 2/100
354/354 [==============================] - 0s 190us/sample - loss: 165.4507 - mae: 10.4618 - mse: 165.4507 - val_loss: 74.4125 - val_mae: 6.0372 - val_mse: 74.4125
Epoch 3/100
354/354 [==============================] - 0s 187us/sample - loss: 54.2313 - mae: 5.3763 - mse: 54.2313 - val_loss: 47.0203 - val_mae: 4.7399 - val_mse: 47.0203
Epoch 4/100
354/354 [==============================] - 0s 166us/sample - loss: 32.3303 - mae: 4.2632 - mse: 32.3303 - val_loss: 38.0120 - val_mae: 4.2484 - val_mse: 38.0120
Epoch 5/100
354/354 [==============================] - 0s 153us/sample - loss: 25.3763 - mae: 3.7745 - mse: 25.3763 - val_loss: 32.4707 - val_mae: 3.8465 - val_mse: 32.4707
Epoch 6/100
354/354 [==============================] - 0s 153us/sample - loss: 22.2331 - mae: 3.4720 - mse: 22.2331 - val_loss: 29.6142 - val_mae: 3.4844 - val_mse: 29.6142
Epoch 7/100
354/354 [==============================] - 0s 154us/sample - loss: 19.7834 - mae: 3.2245 - mse: 19.7834 - val_loss: 27.1649 - val_mae: 3.5465 - val_mse: 27.1649
Epoch 8/100
354/354 [==============================] - 0s 155us/sample - loss: 18.0991 - mae: 3.0669 - mse: 18.0991 - val_loss: 26.0093 - val_mae: 3.5617 - val_mse: 26.0093
Epoch 9/100
354/354 [==============================] - 0s 161us/sample - loss: 16.9247 - mae: 2.9184 - mse: 16.9247 - val_loss: 23.2549 - val_mae: 3.3243 - val_mse: 23.2549
Epoch 10/100
354/354 [==============================] - 0s 150us/sample - loss: 16.0827 - mae: 2.8116 - mse: 16.0827 - val_loss: 21.1365 - val_mae: 3.0248 - val_mse: 21.1365
Epoch 11/100
354/354 [==============================] - 0s 170us/sample - loss: 15.0334 - mae: 2.7214 - mse: 15.0334 - val_loss: 20.0163 - val_mae: 2.9800 - val_mse: 20.0163
Epoch 12/100
354/354 [==============================] - 0s 180us/sample - loss: 14.4011 - mae: 2.6949 - mse: 14.4011 - val_loss: 19.8958 - val_mae: 2.9262 - val_mse: 19.8958
Epoch 13/100
354/354 [==============================] - 0s 184us/sample - loss: 13.9168 - mae: 2.5674 - mse: 13.9168 - val_loss: 18.5729 - val_mae: 2.7302 - val_mse: 18.5729
Epoch 14/100
354/354 [==============================] - 0s 161us/sample - loss: 13.5575 - mae: 2.5442 - mse: 13.5575 - val_loss: 17.8812 - val_mae: 2.6748 - val_mse: 17.8812
Epoch 15/100
354/354 [==============================] - 0s 166us/sample - loss: 12.8689 - mae: 2.4779 - mse: 12.8689 - val_loss: 18.9649 - val_mae: 2.7560 - val_mse: 18.9649
Epoch 16/100
354/354 [==============================] - 0s 159us/sample - loss: 12.6470 - mae: 2.4670 - mse: 12.6470 - val_loss: 16.5834 - val_mae: 2.6016 - val_mse: 16.5834
Epoch 17/100
354/354 [==============================] - 0s 159us/sample - loss: 12.3566 - mae: 2.4280 - mse: 12.3566 - val_loss: 16.7371 - val_mae: 2.6670 - val_mse: 16.7371
Epoch 18/100
354/354 [==============================] - 0s 158us/sample - loss: 12.3328 - mae: 2.4060 - mse: 12.3328 - val_loss: 16.3754 - val_mae: 2.6027 - val_mse: 16.3754
Epoch 19/100
354/354 [==============================] - 0s 152us/sample - loss: 11.8357 - mae: 2.3106 - mse: 11.8357 - val_loss: 16.1015 - val_mae: 2.6255 - val_mse: 16.1015
Epoch 20/100
354/354 [==============================] - 0s 163us/sample - loss: 11.6722 - mae: 2.3482 - mse: 11.6722 - val_loss: 16.1405 - val_mae: 2.6889 - val_mse: 16.1405
Epoch 21/100
354/354 [==============================] - 0s 175us/sample - loss: 11.2774 - mae: 2.3344 - mse: 11.2774 - val_loss: 15.2110 - val_mae: 2.5038 - val_mse: 15.2110
Epoch 22/100
354/354 [==============================] - 0s 180us/sample - loss: 11.2491 - mae: 2.3055 - mse: 11.2491 - val_loss: 15.4745 - val_mae: 2.4494 - val_mse: 15.4744
Epoch 23/100
354/354 [==============================] - 0s 187us/sample - loss: 10.9102 - mae: 2.2171 - mse: 10.9102 - val_loss: 15.1145 - val_mae: 2.4282 - val_mse: 15.1145
Epoch 24/100
354/354 [==============================] - 0s 168us/sample - loss: 10.7952 - mae: 2.2533 - mse: 10.7952 - val_loss: 14.3789 - val_mae: 2.3683 - val_mse: 14.3789
Epoch 25/100
354/354 [==============================] - 0s 171us/sample - loss: 10.7250 - mae: 2.2489 - mse: 10.7250 - val_loss: 15.1102 - val_mae: 2.3422 - val_mse: 15.1102
Epoch 26/100
354/354 [==============================] - 0s 158us/sample - loss: 10.4010 - mae: 2.1702 - mse: 10.4010 - val_loss: 14.3260 - val_mae: 2.3176 - val_mse: 14.3260
Epoch 27/100
354/354 [==============================] - 0s 149us/sample - loss: 10.1442 - mae: 2.1797 - mse: 10.1442 - val_loss: 13.6694 - val_mae: 2.3864 - val_mse: 13.6694
Epoch 28/100
354/354 [==============================] - 0s 168us/sample - loss: 10.1391 - mae: 2.1809 - mse: 10.1391 - val_loss: 14.0177 - val_mae: 2.3467 - val_mse: 14.0177
Epoch 29/100
354/354 [==============================] - 0s 149us/sample - loss: 9.9119 - mae: 2.1267 - mse: 9.9119 - val_loss: 14.0739 - val_mae: 2.4617 - val_mse: 14.0739
Epoch 30/100
354/354 [==============================] - 0s 164us/sample - loss: 10.0176 - mae: 2.1669 - mse: 10.0176 - val_loss: 13.5116 - val_mae: 2.3158 - val_mse: 13.5116
Epoch 31/100
354/354 [==============================] - 0s 189us/sample - loss: 9.8259 - mae: 2.1407 - mse: 9.8259 - val_loss: 13.7364 - val_mae: 2.3531 - val_mse: 13.7364
Epoch 32/100
354/354 [==============================] - 0s 178us/sample - loss: 9.4495 - mae: 2.0922 - mse: 9.4495 - val_loss: 14.1936 - val_mae: 2.3887 - val_mse: 14.1936
Epoch 33/100
354/354 [==============================] - 0s 164us/sample - loss: 9.6721 - mae: 2.0870 - mse: 9.6721 - val_loss: 13.4267 - val_mae: 2.3508 - val_mse: 13.4267
Epoch 34/100
354/354 [==============================] - 0s 167us/sample - loss: 9.1042 - mae: 2.0644 - mse: 9.1042 - val_loss: 13.3821 - val_mae: 2.4709 - val_mse: 13.3821
Epoch 35/100
354/354 [==============================] - 0s 155us/sample - loss: 9.0129 - mae: 2.0482 - mse: 9.0129 - val_loss: 14.2184 - val_mae: 2.2754 - val_mse: 14.2184
Epoch 36/100
354/354 [==============================] - 0s 160us/sample - loss: 9.2470 - mae: 2.0661 - mse: 9.2470 - val_loss: 14.3466 - val_mae: 2.5561 - val_mse: 14.3466
Epoch 37/100
354/354 [==============================] - 0s 169us/sample - loss: 9.1695 - mae: 2.0766 - mse: 9.1695 - val_loss: 13.3818 - val_mae: 2.2373 - val_mse: 13.3818
Epoch 38/100
354/354 [==============================] - 0s 165us/sample - loss: 9.1663 - mae: 2.0617 - mse: 9.1663 - val_loss: 14.7461 - val_mae: 2.5061 - val_mse: 14.7461
Epoch 39/100
354/354 [==============================] - 0s 159us/sample - loss: 8.7273 - mae: 2.0208 - mse: 8.7273 - val_loss: 12.5890 - val_mae: 2.3037 - val_mse: 12.5890
Epoch 40/100
354/354 [==============================] - 0s 166us/sample - loss: 8.9038 - mae: 2.0352 - mse: 8.9038 - val_loss: 12.9754 - val_mae: 2.2079 - val_mse: 12.9754
Epoch 41/100
354/354 [==============================] - 0s 153us/sample - loss: 8.6155 - mae: 2.0267 - mse: 8.6155 - val_loss: 13.9239 - val_mae: 2.3525 - val_mse: 13.9239
Epoch 42/100
354/354 [==============================] - 0s 163us/sample - loss: 8.5479 - mae: 2.0170 - mse: 8.5479 - val_loss: 13.6362 - val_mae: 2.2694 - val_mse: 13.6362
Epoch 43/100
354/354 [==============================] - 0s 165us/sample - loss: 8.7087 - mae: 2.0062 - mse: 8.7087 - val_loss: 13.1138 - val_mae: 2.2386 - val_mse: 13.1138
Epoch 44/100
354/354 [==============================] - 0s 160us/sample - loss: 8.3942 - mae: 1.9622 - mse: 8.3942 - val_loss: 12.3461 - val_mae: 2.2337 - val_mse: 12.3461
Epoch 45/100
354/354 [==============================] - 0s 168us/sample - loss: 8.4101 - mae: 2.0098 - mse: 8.4101 - val_loss: 13.2116 - val_mae: 2.2682 - val_mse: 13.2116
Epoch 46/100
354/354 [==============================] - 0s 156us/sample - loss: 8.3264 - mae: 1.9483 - mse: 8.3264 - val_loss: 12.5519 - val_mae: 2.4063 - val_mse: 12.5519
Epoch 47/100
354/354 [==============================] - 0s 158us/sample - loss: 8.1445 - mae: 1.9549 - mse: 8.1445 - val_loss: 12.1838 - val_mae: 2.2591 - val_mse: 12.1838
Epoch 48/100
354/354 [==============================] - 0s 156us/sample - loss: 8.0389 - mae: 1.9304 - mse: 8.0389 - val_loss: 12.6978 - val_mae: 2.1907 - val_mse: 12.6978
Epoch 49/100
354/354 [==============================] - 0s 164us/sample - loss: 8.0705 - mae: 1.9493 - mse: 8.0705 - val_loss: 12.4833 - val_mae: 2.4720 - val_mse: 12.4833
Epoch 50/100
354/354 [==============================] - 0s 158us/sample - loss: 8.1872 - mae: 1.9630 - mse: 8.1872 - val_loss: 12.0043 - val_mae: 2.2610 - val_mse: 12.0043
Epoch 51/100
354/354 [==============================] - 0s 158us/sample - loss: 8.0357 - mae: 1.8946 - mse: 8.0357 - val_loss: 11.3982 - val_mae: 2.1770 - val_mse: 11.3982
Epoch 52/100
354/354 [==============================] - 0s 162us/sample - loss: 7.6882 - mae: 1.8951 - mse: 7.6882 - val_loss: 13.0714 - val_mae: 2.4109 - val_mse: 13.0714
Epoch 53/100
354/354 [==============================] - 0s 162us/sample - loss: 7.9639 - mae: 1.9103 - mse: 7.9639 - val_loss: 12.4297 - val_mae: 2.2996 - val_mse: 12.4297
Epoch 54/100
354/354 [==============================] - 0s 183us/sample - loss: 7.7929 - mae: 1.8971 - mse: 7.7929 - val_loss: 11.9751 - val_mae: 2.2491 - val_mse: 11.9751
Epoch 55/100
354/354 [==============================] - 0s 185us/sample - loss: 7.4411 - mae: 1.8631 - mse: 7.4411 - val_loss: 11.3761 - val_mae: 2.3416 - val_mse: 11.3761
Epoch 56/100
354/354 [==============================] - 0s 186us/sample - loss: 7.6105 - mae: 1.9111 - mse: 7.6105 - val_loss: 12.4939 - val_mae: 2.4095 - val_mse: 12.4939
Epoch 57/100
354/354 [==============================] - 0s 190us/sample - loss: 7.5013 - mae: 1.9146 - mse: 7.5013 - val_loss: 11.6668 - val_mae: 2.1468 - val_mse: 11.6668
Epoch 58/100
354/354 [==============================] - 0s 195us/sample - loss: 7.4096 - mae: 1.8515 - mse: 7.4096 - val_loss: 13.8000 - val_mae: 2.5222 - val_mse: 13.8000
Epoch 59/100
354/354 [==============================] - 0s 180us/sample - loss: 7.2263 - mae: 1.8241 - mse: 7.2263 - val_loss: 10.8964 - val_mae: 2.2130 - val_mse: 10.8964
Epoch 60/100
354/354 [==============================] - 0s 161us/sample - loss: 7.1773 - mae: 1.8526 - mse: 7.1773 - val_loss: 10.7862 - val_mae: 2.1088 - val_mse: 10.7862
Epoch 61/100
354/354 [==============================] - 0s 165us/sample - loss: 7.0812 - mae: 1.8308 - mse: 7.0812 - val_loss: 10.8147 - val_mae: 2.3209 - val_mse: 10.8147
Epoch 62/100
354/354 [==============================] - 0s 155us/sample - loss: 7.2235 - mae: 1.8367 - mse: 7.2235 - val_loss: 11.0399 - val_mae: 2.2583 - val_mse: 11.0399
Epoch 63/100
354/354 [==============================] - 0s 155us/sample - loss: 7.0341 - mae: 1.8172 - mse: 7.0341 - val_loss: 10.9894 - val_mae: 2.1429 - val_mse: 10.9894
Epoch 64/100
354/354 [==============================] - 0s 157us/sample - loss: 6.8729 - mae: 1.7492 - mse: 6.8729 - val_loss: 10.5465 - val_mae: 2.1532 - val_mse: 10.5465
Epoch 65/100
354/354 [==============================] - 0s 164us/sample - loss: 6.9345 - mae: 1.7837 - mse: 6.9345 - val_loss: 11.5379 - val_mae: 2.1963 - val_mse: 11.5379
Epoch 66/100
354/354 [==============================] - 0s 166us/sample - loss: 6.8218 - mae: 1.7714 - mse: 6.8218 - val_loss: 10.1486 - val_mae: 2.1617 - val_mse: 10.1486
Epoch 67/100
354/354 [==============================] - 0s 157us/sample - loss: 6.8711 - mae: 1.8045 - mse: 6.8711 - val_loss: 10.3196 - val_mae: 2.2297 - val_mse: 10.3196
Epoch 68/100
354/354 [==============================] - 0s 162us/sample - loss: 6.7281 - mae: 1.7762 - mse: 6.7281 - val_loss: 11.2361 - val_mae: 2.2046 - val_mse: 11.2361
Epoch 69/100
354/354 [==============================] - 0s 158us/sample - loss: 6.5518 - mae: 1.7292 - mse: 6.5518 - val_loss: 10.2378 - val_mae: 2.1494 - val_mse: 10.2378
Epoch 70/100
354/354 [==============================] - 0s 161us/sample - loss: 6.6489 - mae: 1.7383 - mse: 6.6489 - val_loss: 11.1613 - val_mae: 2.2212 - val_mse: 11.1613
Epoch 71/100
354/354 [==============================] - 0s 176us/sample - loss: 6.5827 - mae: 1.7564 - mse: 6.5827 - val_loss: 10.0177 - val_mae: 2.2440 - val_mse: 10.0177
Epoch 72/100
354/354 [==============================] - 0s 168us/sample - loss: 6.3411 - mae: 1.7463 - mse: 6.3411 - val_loss: 10.7929 - val_mae: 2.1946 - val_mse: 10.7929
Epoch 73/100
354/354 [==============================] - 0s 163us/sample - loss: 6.3621 - mae: 1.7466 - mse: 6.3621 - val_loss: 9.7344 - val_mae: 2.1441 - val_mse: 9.7344
Epoch 74/100
354/354 [==============================] - 0s 158us/sample - loss: 6.2298 - mae: 1.7411 - mse: 6.2298 - val_loss: 11.2495 - val_mae: 2.1948 - val_mse: 11.2495
Epoch 75/100
354/354 [==============================] - 0s 159us/sample - loss: 6.3037 - mae: 1.7169 - mse: 6.3037 - val_loss: 10.1339 - val_mae: 2.1716 - val_mse: 10.1339
Epoch 76/100
354/354 [==============================] - 0s 158us/sample - loss: 6.0780 - mae: 1.6686 - mse: 6.0780 - val_loss: 11.9975 - val_mae: 2.3317 - val_mse: 11.9975
Epoch 77/100
354/354 [==============================] - 0s 165us/sample - loss: 6.3311 - mae: 1.7082 - mse: 6.3311 - val_loss: 11.6433 - val_mae: 2.2756 - val_mse: 11.6433
Epoch 78/100
354/354 [==============================] - 0s 155us/sample - loss: 6.0620 - mae: 1.6765 - mse: 6.0620 - val_loss: 13.0159 - val_mae: 2.5073 - val_mse: 13.0159
Epoch 79/100
354/354 [==============================] - 0s 167us/sample - loss: 6.1819 - mae: 1.7157 - mse: 6.1819 - val_loss: 10.1000 - val_mae: 2.1462 - val_mse: 10.1000
Epoch 80/100
354/354 [==============================] - 0s 158us/sample - loss: 5.9085 - mae: 1.6720 - mse: 5.9085 - val_loss: 11.7867 - val_mae: 2.5045 - val_mse: 11.7866
Epoch 81/100
354/354 [==============================] - 0s 168us/sample - loss: 6.0201 - mae: 1.6678 - mse: 6.0201 - val_loss: 10.8789 - val_mae: 2.3031 - val_mse: 10.8789
Epoch 82/100
354/354 [==============================] - 0s 159us/sample - loss: 6.1278 - mae: 1.6799 - mse: 6.1278 - val_loss: 9.8114 - val_mae: 2.1048 - val_mse: 9.8114
Epoch 83/100
354/354 [==============================] - 0s 150us/sample - loss: 5.6372 - mae: 1.6280 - mse: 5.6372 - val_loss: 10.0971 - val_mae: 2.1464 - val_mse: 10.0971
Epoch 84/100
354/354 [==============================] - 0s 153us/sample - loss: 5.9587 - mae: 1.6421 - mse: 5.9587 - val_loss: 9.4731 - val_mae: 2.1915 - val_mse: 9.4731
Epoch 85/100
354/354 [==============================] - 0s 158us/sample - loss: 5.6189 - mae: 1.6223 - mse: 5.6189 - val_loss: 9.9788 - val_mae: 2.3332 - val_mse: 9.9788
Epoch 86/100
354/354 [==============================] - 0s 158us/sample - loss: 5.8193 - mae: 1.6930 - mse: 5.8193 - val_loss: 10.4070 - val_mae: 2.1490 - val_mse: 10.4070
Epoch 87/100
354/354 [==============================] - 0s 155us/sample - loss: 5.5919 - mae: 1.6152 - mse: 5.5919 - val_loss: 9.9985 - val_mae: 2.2546 - val_mse: 9.9985
Epoch 88/100
354/354 [==============================] - 0s 160us/sample - loss: 5.6652 - mae: 1.6246 - mse: 5.6652 - val_loss: 9.1506 - val_mae: 2.0642 - val_mse: 9.1506
Epoch 89/100
354/354 [==============================] - 0s 157us/sample - loss: 5.6349 - mae: 1.6108 - mse: 5.6349 - val_loss: 9.8522 - val_mae: 2.0813 - val_mse: 9.8522
Epoch 90/100
354/354 [==============================] - 0s 159us/sample - loss: 5.6165 - mae: 1.6449 - mse: 5.6165 - val_loss: 9.1553 - val_mae: 2.0421 - val_mse: 9.1553
Epoch 91/100
354/354 [==============================] - 0s 161us/sample - loss: 5.5416 - mae: 1.6153 - mse: 5.5416 - val_loss: 10.4231 - val_mae: 2.2880 - val_mse: 10.4231
Epoch 92/100
354/354 [==============================] - 0s 158us/sample - loss: 5.3909 - mae: 1.5863 - mse: 5.3909 - val_loss: 8.8087 - val_mae: 2.1022 - val_mse: 8.8087
Epoch 93/100
354/354 [==============================] - 0s 155us/sample - loss: 5.3540 - mae: 1.5986 - mse: 5.3540 - val_loss: 9.6963 - val_mae: 2.1931 - val_mse: 9.6963
Epoch 94/100
354/354 [==============================] - 0s 161us/sample - loss: 5.3198 - mae: 1.6074 - mse: 5.3198 - val_loss: 9.1875 - val_mae: 2.1917 - val_mse: 9.1875
Epoch 95/100
354/354 [==============================] - 0s 165us/sample - loss: 5.2299 - mae: 1.5638 - mse: 5.2299 - val_loss: 8.8746 - val_mae: 2.1273 - val_mse: 8.8746
Epoch 96/100
354/354 [==============================] - 0s 163us/sample - loss: 5.2789 - mae: 1.5651 - mse: 5.2789 - val_loss: 9.7351 - val_mae: 2.2359 - val_mse: 9.7351
Epoch 97/100
354/354 [==============================] - 0s 153us/sample - loss: 5.3399 - mae: 1.6002 - mse: 5.3399 - val_loss: 9.7185 - val_mae: 2.1080 - val_mse: 9.7185
Epoch 98/100
354/354 [==============================] - 0s 159us/sample - loss: 5.0072 - mae: 1.5055 - mse: 5.0072 - val_loss: 8.3621 - val_mae: 2.0586 - val_mse: 8.3621
Epoch 99/100
354/354 [==============================] - 0s 156us/sample - loss: 5.2596 - mae: 1.5557 - mse: 5.2596 - val_loss: 8.6406 - val_mae: 2.0527 - val_mse: 8.6406
Epoch 100/100
354/354 [==============================] - 0s 159us/sample - loss: 5.0983 - mae: 1.5543 - mse: 5.0983 - val_loss: 8.4836 - val_mae: 2.0234 - val_mse: 8.4836
%% Cell type:markdown id: tags:
## Step 6 - Evaluate
### 6.1 - Model evaluation
MAE = Mean Absolute Error (between the labels and predictions)
An MAE equal to 3 represents an average prediction error of $3k (prices are expressed in thousands of dollars).
%% Cell type:code id: tags:
``` python
score = model.evaluate(x_test, y_test, verbose=0)
print('x_test / loss : {:5.4f}'.format(score[0]))
print('x_test / mae : {:5.4f}'.format(score[1]))
print('x_test / mse : {:5.4f}'.format(score[2]))
```
%% Output
x_test / loss : 8.4836
x_test / mae : 2.0234
x_test / mse : 8.4836
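%% Cell type:markdown id: tags:
The reported MAE can also be reproduced directly from the predictions, which makes its meaning concrete. A quick cross-check, assuming `model`, `x_test` and `y_test` from the previous steps:
%% Cell type:code id: tags:
``` python
# Mean absolute error computed by hand from the predictions
y_pred     = model.predict(x_test).reshape(-1,)
mae_manual = np.mean( np.abs(y_pred - y_test) )
print('manual mae   : {:5.4f}'.format(mae_manual))
```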
%% Cell type:markdown id: tags:
### 6.2 - Training history
What was the best result during our training ?
%% Cell type:code id: tags:
``` python
df=pd.DataFrame(data=history.history)
df.describe()
```
%% Output
             loss         mae         mse    val_loss     val_mae     val_mse
count  100.000000  100.000000  100.000000  100.000000  100.000000  100.000000
mean    15.144930    2.312168   15.144930   17.019036    2.582618   17.019036
std     43.707091    1.906713   43.707090   26.587745    1.288267   26.587746
min      5.007155    1.505515    5.007155    8.362053    2.023406    8.362053
25%      6.285225    1.716563    6.285225   10.419040    2.192718   10.419040
50%      8.037316    1.922454    8.037317   12.488579    2.301342   12.488580
75%     10.482029    2.189933   10.482029   14.470699    2.503943   14.470701
max    414.560260   18.257650  414.560242  266.372801   13.991282  266.372803
%% Cell type:code id: tags:
``` python
print("min( val_mae ) : {:.4f}".format( min(history.history["val_mae"]) ) )
```
%% Output
min( val_mae ) : 2.0234
%% Cell type:code id: tags:
``` python
ooo.plot_history(history, plot={'MSE' :['mse', 'val_mse'],
                                'MAE' :['mae', 'val_mae'],
                                'LOSS':['loss','val_loss']})
```
%% Output
%% Cell type:markdown id: tags:
## Step 7 - Make a prediction
%% Cell type:code id: tags:
``` python
my_data = [ 1.26425925, -0.48522739,  1.0436489 , -0.23112788,  1.37120745,
           -2.14308942,  1.13489104, -1.06802005,  1.71189006,  1.57042287,
            0.77859951,  0.14769795,  2.7585581 ]
real_price = 10.4
my_data = np.array(my_data).reshape(1,13)
```
%% Cell type:code id: tags:
``` python
predictions = model.predict( my_data )
print("Prédiction : {:.2f} K$".format(predictions[0][0]))
print("Reality : {:.2f} K$".format(real_price))
```
%% Output
Prediction : 11.59 K$
Reality    : 10.40 K$
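%% Cell type:markdown id: tags:
Note that `my_data` above is already expressed in normalized units. A sample described by raw feature values must first be normalized with the `mean` and `std` computed on the training set in section 3.2. A hedged sketch; the raw values below are purely illustrative:
%% Cell type:code id: tags:
``` python
# Hypothetical, unnormalized sample, in the same column order as x_train
# (the csv columns minus 'medv')
raw_values = np.array([ 0.1,   0.0,  8.0,  0.0,   0.5,  6.0, 60.0,  4.0,
                        4.0, 300.0, 18.0, 390.0, 10.0])
my_norm = (raw_values - np.array(mean)) / np.array(std)   # train statistics only
print("Prediction : {:.2f} K$".format(model.predict(my_norm.reshape(1,13))[0][0]))
```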
%% Cell type:markdown id: tags:
---
![](../fidle/img/00-Fidle-logo-01_s.png)
%% Cell type:markdown id: tags:
Deep Neural Network (DNN) - BHPD dataset
========================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## A very simple example of **regression** (Premium edition):
The objective is to predict **housing prices** from a set of house features.
The **[Boston Housing Dataset](https://www.cs.toronto.edu/~delve/data/boston/bostonDetail.html)** consists of prices of houses in various places in Boston.
Alongside the price, the dataset also provides information such as the crime rate, the proportion of non-retail business acres in the town,
the age of people who own the house and many other attributes...
What we're going to do:
- (Retrieve data)
- (Preparing the data)
- (Build a model)
- Train and save the model
- Restore saved model
- Evaluate the model
- Make some predictions
%% Cell type:markdown id: tags:
## Step 1 - Import and init
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import os,sys
from IPython.display import display, Markdown
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
os.makedirs('./run/models', mode=0o750, exist_ok=True)
```
%% Output
FIDLE 2020 - Practical Work Module
Version : 0.2.8
Run time : Saturday 15 February 2020, 12:32:05
TensorFlow version : 2.0.0
Keras version : 2.2.4-tf
%% Cell type:markdown id: tags:
## Step 2 - Retrieve data
### 2.1 - Option 1 : From Keras
Boston housing is a famous historic dataset, so we can get it directly from [Keras datasets](https://www.tensorflow.org/api_docs/python/tf/keras/datasets)
%% Cell type:raw id: tags:
(x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data(test_split=0.2, seed=113)
%% Cell type:markdown id: tags:
### 2.2 - Option 2 : From a csv file
More fun !
%% Cell type:code id: tags:
``` python
data = pd.read_csv('./data/BostonHousing.csv', header=0)
display(data.head(5).style.format("{0:.2f}"))
print('Missing data : ',data.isna().sum().sum(), '  Shape is : ', data.shape)
```
%% Output
Missing data :  0   Shape is :  (506, 14)
%% Cell type:markdown id: tags:
## Step 3 - Preparing the data
### 3.1 - Split data
We will use 70% of the data for training and 30% for validation.
x will be the input data and y the expected output.
%% Cell type:code id: tags:
``` python
# ---- Split => train, test
#
data_train = data.sample(frac=0.7, axis=0)
data_test = data.drop(data_train.index)
# ---- Split => x,y (medv is price)
#
x_train = data_train.drop('medv', axis=1)
y_train = data_train['medv']
x_test = data_test.drop('medv', axis=1)
y_test = data_test['medv']
print('Original data shape was : ',data.shape)
print('x_train : ',x_train.shape, 'y_train : ',y_train.shape)
print('x_test : ',x_test.shape, 'y_test : ',y_test.shape)
```
%% Output
Original data shape was : (506, 14)
x_train : (354, 13) y_train : (354,)
x_test : (152, 13) y_test : (152,)
%% Cell type:markdown id: tags:
### 3.2 - Data normalization
**Note :**
- All input data must be normalized, train and test.
- To do this we will subtract the mean and divide by the standard deviation.
- But test data should not be used in any way, even for normalization.
- The mean and the standard deviation will therefore only be calculated with the train data.
%% Cell type:code id: tags:
``` python
display(x_train.describe().style.format("{0:.2f}").set_caption("Before normalization :"))
mean = x_train.mean()
std = x_train.std()
x_train = (x_train - mean) / std
x_test = (x_test - mean) / std
display(x_train.describe().style.format("{0:.2f}").set_caption("After normalization :"))
x_train, y_train = np.array(x_train), np.array(y_train)
x_test, y_test = np.array(x_test), np.array(y_test)
```
%% Output
%% Cell type:markdown id: tags:
## Step 4 - Build a model
More information about :
- [Optimizer](https://www.tensorflow.org/api_docs/python/tf/keras/optimizers)
- [Activation](https://www.tensorflow.org/api_docs/python/tf/keras/activations)
- [Loss](https://www.tensorflow.org/api_docs/python/tf/keras/losses)
- [Metrics](https://www.tensorflow.org/api_docs/python/tf/keras/metrics)
%% Cell type:code id: tags:
``` python
def get_model_v1(shape):
    model = keras.models.Sequential()
    model.add(keras.layers.Input(shape, name="InputLayer"))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n1'))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n2'))
    model.add(keras.layers.Dense(1, name='Output'))
    model.compile(optimizer = 'rmsprop',
                  loss      = 'mse',
                  metrics   = ['mae', 'mse'] )
    return model
```
%% Cell type:markdown id: tags:
## Step 5 - Train the model
### 5.1 - Get it
%% Cell type:code id: tags:
``` python
model=get_model_v1( (13,) )
model.summary()
keras.utils.plot_model( model, to_file='./run/model.png', show_shapes=True, show_layer_names=True, dpi=96)
```
%% Output
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
Dense_n1 (Dense) (None, 64) 896
_________________________________________________________________
Dense_n2 (Dense) (None, 64) 4160
_________________________________________________________________
Output (Dense) (None, 1) 65
=================================================================
Total params: 5,121
Trainable params: 5,121
Non-trainable params: 0
_________________________________________________________________
<IPython.core.display.Image object>
%% Cell type:markdown id: tags:
### 5.2 - Add callback
%% Cell type:code id: tags:
``` python
os.makedirs('./run/models', mode=0o750, exist_ok=True)
save_dir = "./run/models/best_model.h5"
savemodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath=save_dir, verbose=0, save_best_only=True)
```
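%% Cell type:markdown id: tags:
A note on `save_best_only` : by default `ModelCheckpoint` monitors `val_loss`, so "best" here means the epoch with the lowest validation loss. The same callback with that default made explicit (equivalent sketch):
%% Cell type:code id: tags:
``` python
# Same callback, with the monitored quantity spelled out
savemodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath       = save_dir,
                                                        monitor        = 'val_loss',
                                                        verbose        = 0,
                                                        save_best_only = True)
```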
%% Cell type:markdown id: tags:
### 5.3 - Train it
%% Cell type:code id: tags:
``` python
history = model.fit(x_train,
                    y_train,
                    epochs          = 100,
                    batch_size      = 10,
                    verbose         = 1,
                    validation_data = (x_test, y_test),
                    callbacks       = [savemodel_callback])
```
%% Output
Train on 354 samples, validate on 152 samples
Epoch 1/100
354/354 [==============================] - 1s 3ms/sample - loss: 446.5069 - mae: 19.1690 - mse: 446.5069 - val_loss: 328.7387 - val_mae: 16.4455 - val_mse: 328.7387
Epoch 2/100
354/354 [==============================] - 0s 301us/sample - loss: 206.7491 - mae: 12.2281 - mse: 206.7491 - val_loss: 102.8150 - val_mae: 8.6449 - val_mse: 102.8150
Epoch 3/100
354/354 [==============================] - 0s 302us/sample - loss: 65.8724 - mae: 6.2331 - mse: 65.8724 - val_loss: 33.7508 - val_mae: 4.5848 - val_mse: 33.7508
Epoch 4/100
354/354 [==============================] - 0s 318us/sample - loss: 33.4179 - mae: 4.2331 - mse: 33.4179 - val_loss: 27.0058 - val_mae: 3.9154 - val_mse: 27.0058
Epoch 5/100
354/354 [==============================] - 0s 312us/sample - loss: 24.9602 - mae: 3.5624 - mse: 24.9602 - val_loss: 23.2470 - val_mae: 3.5429 - val_mse: 23.2470
Epoch 6/100
354/354 [==============================] - 0s 316us/sample - loss: 21.4080 - mae: 3.2530 - mse: 21.4080 - val_loss: 22.1707 - val_mae: 3.4498 - val_mse: 22.1707
Epoch 7/100
354/354 [==============================] - 0s 262us/sample - loss: 18.3586 - mae: 3.0399 - mse: 18.3586 - val_loss: 24.4102 - val_mae: 3.4754 - val_mse: 24.4102
Epoch 8/100
354/354 [==============================] - 0s 307us/sample - loss: 16.9126 - mae: 2.8925 - mse: 16.9126 - val_loss: 20.1919 - val_mae: 3.2138 - val_mse: 20.1919
Epoch 9/100
354/354 [==============================] - 0s 312us/sample - loss: 15.5047 - mae: 2.7532 - mse: 15.5047 - val_loss: 19.0378 - val_mae: 3.0763 - val_mse: 19.0378
Epoch 10/100
354/354 [==============================] - 0s 273us/sample - loss: 14.5763 - mae: 2.6404 - mse: 14.5763 - val_loss: 19.9752 - val_mae: 3.0986 - val_mse: 19.9752
Epoch 11/100
354/354 [==============================] - 0s 310us/sample - loss: 13.5901 - mae: 2.5801 - mse: 13.5901 - val_loss: 18.9675 - val_mae: 3.0192 - val_mse: 18.9675
Epoch 12/100
354/354 [==============================] - 0s 270us/sample - loss: 12.9341 - mae: 2.5158 - mse: 12.9341 - val_loss: 20.6757 - val_mae: 3.1029 - val_mse: 20.6757
Epoch 13/100
354/354 [==============================] - 0s 311us/sample - loss: 12.4520 - mae: 2.5061 - mse: 12.4520 - val_loss: 17.6596 - val_mae: 2.8839 - val_mse: 17.6596
Epoch 14/100
354/354 [==============================] - 0s 311us/sample - loss: 11.9484 - mae: 2.4710 - mse: 11.9484 - val_loss: 16.7645 - val_mae: 2.8083 - val_mse: 16.7645
Epoch 15/100
354/354 [==============================] - 0s 269us/sample - loss: 11.6260 - mae: 2.3959 - mse: 11.6260 - val_loss: 17.5048 - val_mae: 2.8007 - val_mse: 17.5048
Epoch 16/100
354/354 [==============================] - 0s 267us/sample - loss: 11.2504 - mae: 2.3567 - mse: 11.2504 - val_loss: 18.6748 - val_mae: 2.8771 - val_mse: 18.6748
Epoch 17/100
354/354 [==============================] - 0s 269us/sample - loss: 10.8352 - mae: 2.3051 - mse: 10.8352 - val_loss: 19.4796 - val_mae: 3.0041 - val_mse: 19.4796
Epoch 18/100
354/354 [==============================] - 0s 267us/sample - loss: 10.6488 - mae: 2.3377 - mse: 10.6488 - val_loss: 17.0329 - val_mae: 2.7640 - val_mse: 17.0329
Epoch 19/100
354/354 [==============================] - 0s 273us/sample - loss: 10.2134 - mae: 2.2439 - mse: 10.2134 - val_loss: 18.0589 - val_mae: 2.8565 - val_mse: 18.0589
Epoch 20/100
354/354 [==============================] - 0s 315us/sample - loss: 10.1024 - mae: 2.2432 - mse: 10.1024 - val_loss: 16.5968 - val_mae: 2.7402 - val_mse: 16.5968
Epoch 21/100
354/354 [==============================] - 0s 277us/sample - loss: 10.0576 - mae: 2.2401 - mse: 10.0576 - val_loss: 18.4496 - val_mae: 2.8156 - val_mse: 18.4496
Epoch 22/100
354/354 [==============================] - 0s 269us/sample - loss: 9.6590 - mae: 2.1500 - mse: 9.6590 - val_loss: 18.7084 - val_mae: 2.8309 - val_mse: 18.7084
Epoch 23/100
354/354 [==============================] - 0s 277us/sample - loss: 9.4596 - mae: 2.1967 - mse: 9.4596 - val_loss: 18.0308 - val_mae: 2.7595 - val_mse: 18.0308
Epoch 24/100
354/354 [==============================] - 0s 272us/sample - loss: 9.2778 - mae: 2.1680 - mse: 9.2778 - val_loss: 18.9343 - val_mae: 2.9152 - val_mse: 18.9343
Epoch 25/100
354/354 [==============================] - 0s 267us/sample - loss: 9.1075 - mae: 2.1451 - mse: 9.1076 - val_loss: 18.0646 - val_mae: 2.8202 - val_mse: 18.0646
Epoch 26/100
354/354 [==============================] - 0s 273us/sample - loss: 9.2196 - mae: 2.1282 - mse: 9.2196 - val_loss: 18.7244 - val_mae: 2.8288 - val_mse: 18.7244
Epoch 27/100
354/354 [==============================] - 0s 267us/sample - loss: 8.5733 - mae: 2.0703 - mse: 8.5733 - val_loss: 16.9568 - val_mae: 2.8123 - val_mse: 16.9568
Epoch 28/100
354/354 [==============================] - 0s 309us/sample - loss: 8.6252 - mae: 2.0821 - mse: 8.6252 - val_loss: 16.4984 - val_mae: 2.7069 - val_mse: 16.4984
Epoch 29/100
354/354 [==============================] - 0s 307us/sample - loss: 8.6336 - mae: 2.0822 - mse: 8.6336 - val_loss: 16.0498 - val_mae: 2.6532 - val_mse: 16.0498
Epoch 30/100
354/354 [==============================] - 0s 321us/sample - loss: 8.5071 - mae: 2.0379 - mse: 8.5071 - val_loss: 15.1042 - val_mae: 2.6004 - val_mse: 15.1042
Epoch 31/100
354/354 [==============================] - 0s 273us/sample - loss: 8.2888 - mae: 2.0627 - mse: 8.2888 - val_loss: 16.2730 - val_mae: 2.7019 - val_mse: 16.2730
Epoch 32/100
354/354 [==============================] - 0s 271us/sample - loss: 8.2021 - mae: 2.0000 - mse: 8.2021 - val_loss: 17.2852 - val_mae: 2.7962 - val_mse: 17.2852
Epoch 33/100
354/354 [==============================] - 0s 272us/sample - loss: 8.2973 - mae: 2.0336 - mse: 8.2973 - val_loss: 16.8973 - val_mae: 2.7318 - val_mse: 16.8973
Epoch 34/100
354/354 [==============================] - 0s 257us/sample - loss: 8.1033 - mae: 2.0105 - mse: 8.1033 - val_loss: 16.6509 - val_mae: 2.8218 - val_mse: 16.6509
Epoch 35/100
354/354 [==============================] - 0s 272us/sample - loss: 8.0724 - mae: 2.0170 - mse: 8.0724 - val_loss: 16.0802 - val_mae: 2.6733 - val_mse: 16.0802
Epoch 36/100
354/354 [==============================] - 0s 257us/sample - loss: 7.7939 - mae: 1.9606 - mse: 7.7939 - val_loss: 17.1008 - val_mae: 2.7384 - val_mse: 17.1008
Epoch 37/100
354/354 [==============================] - 0s 269us/sample - loss: 7.7812 - mae: 1.9719 - mse: 7.7812 - val_loss: 16.3472 - val_mae: 2.6939 - val_mse: 16.3472
Epoch 38/100
354/354 [==============================] - 0s 276us/sample - loss: 7.4494 - mae: 1.9224 - mse: 7.4494 - val_loss: 19.3916 - val_mae: 2.9414 - val_mse: 19.3916
Epoch 39/100
354/354 [==============================] - 0s 271us/sample - loss: 7.8023 - mae: 1.9978 - mse: 7.8023 - val_loss: 16.3499 - val_mae: 2.7018 - val_mse: 16.3499
Epoch 40/100
354/354 [==============================] - 0s 270us/sample - loss: 7.3681 - mae: 1.9293 - mse: 7.3681 - val_loss: 16.0445 - val_mae: 2.6872 - val_mse: 16.0445
Epoch 41/100
354/354 [==============================] - 0s 267us/sample - loss: 7.3013 - mae: 1.8820 - mse: 7.3013 - val_loss: 16.5657 - val_mae: 2.7222 - val_mse: 16.5657
Epoch 42/100
354/354 [==============================] - 0s 274us/sample - loss: 7.3978 - mae: 1.9154 - mse: 7.3978 - val_loss: 15.9821 - val_mae: 2.6576 - val_mse: 15.9821
Epoch 43/100
354/354 [==============================] - 0s 319us/sample - loss: 6.9832 - mae: 1.9037 - mse: 6.9832 - val_loss: 14.4977 - val_mae: 2.5418 - val_mse: 14.4977
Epoch 44/100
354/354 [==============================] - 0s 269us/sample - loss: 7.2307 - mae: 1.8968 - mse: 7.2307 - val_loss: 15.0962 - val_mae: 2.6188 - val_mse: 15.0962
Epoch 45/100
354/354 [==============================] - 0s 256us/sample - loss: 7.0289 - mae: 1.8685 - mse: 7.0289 - val_loss: 17.0531 - val_mae: 2.8123 - val_mse: 17.0531
Epoch 46/100
354/354 [==============================] - 0s 270us/sample - loss: 6.9010 - mae: 1.8537 - mse: 6.9010 - val_loss: 16.7469 - val_mae: 2.7081 - val_mse: 16.7469
Epoch 47/100
354/354 [==============================] - 0s 268us/sample - loss: 6.9256 - mae: 1.8664 - mse: 6.9256 - val_loss: 16.1227 - val_mae: 2.7760 - val_mse: 16.1227
Epoch 48/100
354/354 [==============================] - 0s 273us/sample - loss: 6.8333 - mae: 1.8552 - mse: 6.8333 - val_loss: 14.9262 - val_mae: 2.6213 - val_mse: 14.9262
Epoch 49/100
354/354 [==============================] - 0s 313us/sample - loss: 6.7351 - mae: 1.8375 - mse: 6.7351 - val_loss: 14.2252 - val_mae: 2.5309 - val_mse: 14.2252
Epoch 50/100
354/354 [==============================] - 0s 276us/sample - loss: 6.6672 - mae: 1.7913 - mse: 6.6672 - val_loss: 16.5652 - val_mae: 2.7693 - val_mse: 16.5652
Epoch 51/100
354/354 [==============================] - 0s 271us/sample - loss: 6.6222 - mae: 1.8325 - mse: 6.6222 - val_loss: 14.8928 - val_mae: 2.5921 - val_mse: 14.8928
Epoch 52/100
354/354 [==============================] - 0s 271us/sample - loss: 6.5606 - mae: 1.8150 - mse: 6.5606 - val_loss: 14.7382 - val_mae: 2.6124 - val_mse: 14.7382
Epoch 53/100
354/354 [==============================] - 0s 273us/sample - loss: 6.5737 - mae: 1.7757 - mse: 6.5737 - val_loss: 14.8866 - val_mae: 2.6357 - val_mse: 14.8866
Epoch 54/100
354/354 [==============================] - 0s 264us/sample - loss: 6.3009 - mae: 1.7569 - mse: 6.3009 - val_loss: 14.6100 - val_mae: 2.6115 - val_mse: 14.6100
Epoch 55/100
354/354 [==============================] - 0s 272us/sample - loss: 6.2524 - mae: 1.7679 - mse: 6.2524 - val_loss: 17.4939 - val_mae: 2.8652 - val_mse: 17.4939
Epoch 56/100
354/354 [==============================] - 0s 319us/sample - loss: 6.2461 - mae: 1.7830 - mse: 6.2461 - val_loss: 14.0397 - val_mae: 2.5829 - val_mse: 14.0397
Epoch 57/100
354/354 [==============================] - 0s 267us/sample - loss: 6.3124 - mae: 1.7788 - mse: 6.3124 - val_loss: 15.4946 - val_mae: 2.7133 - val_mse: 15.4946
Epoch 58/100
354/354 [==============================] - 0s 269us/sample - loss: 6.1133 - mae: 1.7282 - mse: 6.1133 - val_loss: 14.5244 - val_mae: 2.5982 - val_mse: 14.5244
Epoch 59/100
354/354 [==============================] - 0s 259us/sample - loss: 6.2866 - mae: 1.7860 - mse: 6.2866 - val_loss: 15.8915 - val_mae: 2.7331 - val_mse: 15.8915
Epoch 60/100
354/354 [==============================] - 0s 311us/sample - loss: 5.9945 - mae: 1.7178 - mse: 5.9945 - val_loss: 13.2656 - val_mae: 2.5189 - val_mse: 13.2656
Epoch 61/100
354/354 [==============================] - 0s 263us/sample - loss: 6.0649 - mae: 1.7064 - mse: 6.0649 - val_loss: 15.4134 - val_mae: 2.7351 - val_mse: 15.4134
Epoch 62/100
354/354 [==============================] - 0s 268us/sample - loss: 5.9954 - mae: 1.6767 - mse: 5.9954 - val_loss: 13.8741 - val_mae: 2.5721 - val_mse: 13.8741
Epoch 63/100
354/354 [==============================] - 0s 254us/sample - loss: 5.9648 - mae: 1.7023 - mse: 5.9648 - val_loss: 15.1974 - val_mae: 2.6602 - val_mse: 15.1974
Epoch 64/100
354/354 [==============================] - 0s 272us/sample - loss: 5.7276 - mae: 1.7202 - mse: 5.7276 - val_loss: 14.5766 - val_mae: 2.6508 - val_mse: 14.5766
Epoch 65/100
354/354 [==============================] - 0s 266us/sample - loss: 5.8443 - mae: 1.6907 - mse: 5.8443 - val_loss: 15.5797 - val_mae: 2.6848 - val_mse: 15.5797
Epoch 66/100
354/354 [==============================] - 0s 273us/sample - loss: 5.8195 - mae: 1.7295 - mse: 5.8195 - val_loss: 14.5484 - val_mae: 2.6527 - val_mse: 14.5484
Epoch 67/100
354/354 [==============================] - 0s 266us/sample - loss: 5.8216 - mae: 1.6966 - mse: 5.8216 - val_loss: 14.3616 - val_mae: 2.5733 - val_mse: 14.3616
Epoch 68/100
354/354 [==============================] - 0s 271us/sample - loss: 5.6572 - mae: 1.6543 - mse: 5.6572 - val_loss: 16.1438 - val_mae: 2.8151 - val_mse: 16.1438
Epoch 69/100
354/354 [==============================] - 0s 259us/sample - loss: 5.5142 - mae: 1.6657 - mse: 5.5142 - val_loss: 14.2295 - val_mae: 2.5796 - val_mse: 14.2295
Epoch 70/100
354/354 [==============================] - 0s 273us/sample - loss: 5.4965 - mae: 1.6313 - mse: 5.4965 - val_loss: 15.2662 - val_mae: 2.6980 - val_mse: 15.2662
Epoch 71/100
354/354 [==============================] - 0s 270us/sample - loss: 5.4534 - mae: 1.6717 - mse: 5.4534 - val_loss: 14.5025 - val_mae: 2.6441 - val_mse: 14.5025
Epoch 72/100
354/354 [==============================] - 0s 253us/sample - loss: 5.5146 - mae: 1.6526 - mse: 5.5146 - val_loss: 13.7906 - val_mae: 2.5753 - val_mse: 13.7906
Epoch 73/100
354/354 [==============================] - 0s 272us/sample - loss: 5.4499 - mae: 1.6130 - mse: 5.4499 - val_loss: 15.1649 - val_mae: 2.7624 - val_mse: 15.1649
Epoch 74/100
354/354 [==============================] - 0s 309us/sample - loss: 5.3808 - mae: 1.6297 - mse: 5.3808 - val_loss: 12.9326 - val_mae: 2.5007 - val_mse: 12.9326
Epoch 75/100
354/354 [==============================] - 0s 258us/sample - loss: 5.3546 - mae: 1.6313 - mse: 5.3546 - val_loss: 13.6397 - val_mae: 2.5810 - val_mse: 13.6397
Epoch 76/100
354/354 [==============================] - 0s 265us/sample - loss: 5.1666 - mae: 1.5998 - mse: 5.1666 - val_loss: 15.6069 - val_mae: 2.7630 - val_mse: 15.6069
Epoch 77/100
354/354 [==============================] - 0s 272us/sample - loss: 5.2465 - mae: 1.6192 - mse: 5.2465 - val_loss: 14.8084 - val_mae: 2.6388 - val_mse: 14.8084
Epoch 78/100
354/354 [==============================] - 0s 265us/sample - loss: 5.1107 - mae: 1.5772 - mse: 5.1107 - val_loss: 13.6319 - val_mae: 2.5756 - val_mse: 13.6319
Epoch 79/100
354/354 [==============================] - 0s 272us/sample - loss: 5.2677 - mae: 1.5989 - mse: 5.2677 - val_loss: 15.0306 - val_mae: 2.7715 - val_mse: 15.0306
Epoch 80/100
354/354 [==============================] - 0s 274us/sample - loss: 5.0534 - mae: 1.5504 - mse: 5.0534 - val_loss: 13.3917 - val_mae: 2.5352 - val_mse: 13.3917
Epoch 81/100
354/354 [==============================] - 0s 272us/sample - loss: 5.1013 - mae: 1.5826 - mse: 5.1013 - val_loss: 14.6761 - val_mae: 2.7158 - val_mse: 14.6761
Epoch 82/100
354/354 [==============================] - 0s 258us/sample - loss: 5.1137 - mae: 1.5984 - mse: 5.1137 - val_loss: 14.7063 - val_mae: 2.6576 - val_mse: 14.7063
Epoch 83/100
354/354 [==============================] - 0s 269us/sample - loss: 4.9343 - mae: 1.5545 - mse: 4.9343 - val_loss: 13.6205 - val_mae: 2.5494 - val_mse: 13.6205
Epoch 84/100
354/354 [==============================] - 0s 277us/sample - loss: 4.9839 - mae: 1.5815 - mse: 4.9839 - val_loss: 13.3857 - val_mae: 2.6047 - val_mse: 13.3857
Epoch 85/100
354/354 [==============================] - 0s 277us/sample - loss: 4.9946 - mae: 1.5818 - mse: 4.9946 - val_loss: 14.1012 - val_mae: 2.6176 - val_mse: 14.1012
Epoch 86/100
354/354 [==============================] - 0s 273us/sample - loss: 4.7884 - mae: 1.5321 - mse: 4.7884 - val_loss: 14.5182 - val_mae: 2.6687 - val_mse: 14.5182
Epoch 87/100
354/354 [==============================] - 0s 311us/sample - loss: 4.8134 - mae: 1.5660 - mse: 4.8134 - val_loss: 12.7966 - val_mae: 2.5734 - val_mse: 12.7966
Epoch 88/100
354/354 [==============================] - 0s 273us/sample - loss: 4.7923 - mae: 1.5483 - mse: 4.7923 - val_loss: 14.4001 - val_mae: 2.6707 - val_mse: 14.4001
Epoch 89/100
354/354 [==============================] - 0s 274us/sample - loss: 4.6705 - mae: 1.5086 - mse: 4.6705 - val_loss: 15.3677 - val_mae: 2.7359 - val_mse: 15.3677
Epoch 90/100
354/354 [==============================] - 0s 280us/sample - loss: 4.8776 - mae: 1.5806 - mse: 4.8776 - val_loss: 14.4442 - val_mae: 2.6343 - val_mse: 14.4442
Epoch 91/100
354/354 [==============================] - 0s 260us/sample - loss: 4.6349 - mae: 1.5300 - mse: 4.6349 - val_loss: 14.2969 - val_mae: 2.7718 - val_mse: 14.2969
Epoch 92/100
354/354 [==============================] - 0s 273us/sample - loss: 4.7835 - mae: 1.5637 - mse: 4.7835 - val_loss: 13.1123 - val_mae: 2.5578 - val_mse: 13.1123
Epoch 93/100
354/354 [==============================] - 0s 277us/sample - loss: 4.6759 - mae: 1.5259 - mse: 4.6759 - val_loss: 14.3508 - val_mae: 2.6888 - val_mse: 14.3507
Epoch 94/100
354/354 [==============================] - 0s 273us/sample - loss: 4.7856 - mae: 1.5560 - mse: 4.7856 - val_loss: 14.5237 - val_mae: 2.6956 - val_mse: 14.5237
Epoch 95/100
354/354 [==============================] - 0s 313us/sample - loss: 4.7038 - mae: 1.5331 - mse: 4.7038 - val_loss: 12.7707 - val_mae: 2.5393 - val_mse: 12.7707
Epoch 96/100
354/354 [==============================] - 0s 277us/sample - loss: 4.6006 - mae: 1.5331 - mse: 4.6006 - val_loss: 13.8540 - val_mae: 2.6720 - val_mse: 13.8540
Epoch 97/100
354/354 [==============================] - 0s 269us/sample - loss: 4.4720 - mae: 1.4912 - mse: 4.4720 - val_loss: 13.1524 - val_mae: 2.6311 - val_mse: 13.1524
Epoch 98/100
354/354 [==============================] - 0s 309us/sample - loss: 4.4242 - mae: 1.4854 - mse: 4.4242 - val_loss: 11.7020 - val_mae: 2.4886 - val_mse: 11.7020
Epoch 99/100
354/354 [==============================] - 0s 280us/sample - loss: 4.5642 - mae: 1.4920 - mse: 4.5642 - val_loss: 12.6523 - val_mae: 2.5232 - val_mse: 12.6523
Epoch 100/100
354/354 [==============================] - 0s 274us/sample - loss: 4.1971 - mae: 1.4564 - mse: 4.1971 - val_loss: 18.7164 - val_mae: 3.0774 - val_mse: 18.7164
%% Cell type:markdown id: tags:
## Step 6 - Evaluate
### 6.1 - Model evaluation
MAE = Mean Absolute Error (between the labels and the predictions)
An MAE of 3 means the predictions are off by $3k on average (prices are expressed in k$).
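Formally : $MAE(X,h_\theta)=\frac1n\sum_{i=1}^n\left|h_\theta(X^{(i)})-Y^{(i)}\right|$ — the average of the absolute prediction errors.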
%% Cell type:code id: tags:
``` python
score = model.evaluate(x_test, y_test, verbose=0)
print('x_test / loss : {:5.4f}'.format(score[0]))
print('x_test / mae : {:5.4f}'.format(score[1]))
print('x_test / mse : {:5.4f}'.format(score[2]))
```
%% Output
x_test / loss : 18.7164
x_test / mae : 3.0774
x_test / mse : 18.7164
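%% Cell type:markdown id: tags:
The order of the values in `score` follows the metrics declared at compile time; a minimal check using Keras' `metrics_names` makes the mapping explicit:
%% Cell type:code id: tags:
``` python
# ---- Map each returned score to its metric name
for name, value in zip(model.metrics_names, score):
    print('x_test / {:4s} : {:5.4f}'.format(name, value))
```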
%% Cell type:markdown id: tags:
### 6.2 - Training history
What was the best result during our training ?
%% Cell type:code id: tags:
``` python
print("min( val_mae ) : {:.4f}".format( min(history.history["val_mae"]) ) )
```
%% Output
min( val_mae ) : 2.4886
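%% Cell type:markdown id: tags:
A minimal sketch to also retrieve *when* this best score occurred (assuming `history` is still in scope); `np.argmin` gives a 0-based index, hence the +1 to match the epoch numbering of the training logs:
%% Cell type:code id: tags:
``` python
import numpy as np

# ---- Epoch at which val_mae was lowest
best_epoch = np.argmin(history.history["val_mae"]) + 1
print("Best val_mae was reached at epoch {}".format(best_epoch))
```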
%% Cell type:code id: tags:
``` python
ooo.plot_history(history, plot={'MSE' :['mse', 'val_mse'],
'MAE' :['mae', 'val_mae'],
'LOSS':['loss','val_loss']})
```
%% Output
%% Cell type:markdown id: tags:
## Step 7 - Restore a model :
%% Cell type:markdown id: tags:
### 7.1 - Reload model
%% Cell type:code id: tags:
``` python
loaded_model = tf.keras.models.load_model('./run/models/best_model.h5')
loaded_model.summary()
print("Loaded.")
```
%% Output
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
Dense_n1 (Dense) (None, 64) 896
_________________________________________________________________
Dense_n2 (Dense) (None, 64) 4160
_________________________________________________________________
Output (Dense) (None, 1) 65
=================================================================
Total params: 5,121
Trainable params: 5,121
Non-trainable params: 0
_________________________________________________________________
Loaded.
%% Cell type:markdown id: tags:
### 7.2 - Evaluate it :
%% Cell type:code id: tags:
``` python
score = loaded_model.evaluate(x_test, y_test, verbose=0)
print('x_test / loss : {:5.4f}'.format(score[0]))
print('x_test / mae : {:5.4f}'.format(score[1]))
print('x_test / mse : {:5.4f}'.format(score[2]))
```
%% Output
x_test / loss : 11.7020
x_test / mae : 2.4886
x_test / mse : 11.7020
%% Cell type:markdown id: tags:
### 7.3 - Make a prediction
%% Cell type:code id: tags:
``` python
# ---- One sample to predict: 13 features, in the same (normalized) scale as the training data
mon_test=[-0.20113196, -0.48631663, 1.23572348, -0.26929877, 2.67879106,
          -0.89623587, 1.09961251, -1.05826704, -0.55823117, -0.06159088,
          -1.76085159, -1.97039608, 0.52775666]
mon_test=np.array(mon_test).reshape(1,13)
```
%% Cell type:code id: tags:
``` python
predictions = loaded_model.predict( mon_test )
print("Prediction : {:.2f} K$ Reality : {:.2f} K$".format(predictions[0][0], y_train[13]))
```
%% Output
Prediction : 16.20 K$ Reality : 21.70 K$
%% Cell type:markdown id: tags:
-----
That's all folks !
"crim","zn","indus","chas","nox","rm","age","dis","rad","tax","ptratio","b","lstat","medv"
0.00632,18,2.31,"0",0.538,6.575,65.2,4.09,1,296,15.3,396.9,4.98,24
0.02731,0,7.07,"0",0.469,6.421,78.9,4.9671,2,242,17.8,396.9,9.14,21.6
0.02729,0,7.07,"0",0.469,7.185,61.1,4.9671,2,242,17.8,392.83,4.03,34.7
0.03237,0,2.18,"0",0.458,6.998,45.8,6.0622,3,222,18.7,394.63,2.94,33.4
0.06905,0,2.18,"0",0.458,7.147,54.2,6.0622,3,222,18.7,396.9,5.33,36.2
0.02985,0,2.18,"0",0.458,6.43,58.7,6.0622,3,222,18.7,394.12,5.21,28.7
0.08829,12.5,7.87,"0",0.524,6.012,66.6,5.5605,5,311,15.2,395.6,12.43,22.9
0.14455,12.5,7.87,"0",0.524,6.172,96.1,5.9505,5,311,15.2,396.9,19.15,27.1
0.21124,12.5,7.87,"0",0.524,5.631,100,6.0821,5,311,15.2,386.63,29.93,16.5
0.17004,12.5,7.87,"0",0.524,6.004,85.9,6.5921,5,311,15.2,386.71,17.1,18.9
0.22489,12.5,7.87,"0",0.524,6.377,94.3,6.3467,5,311,15.2,392.52,20.45,15
0.11747,12.5,7.87,"0",0.524,6.009,82.9,6.2267,5,311,15.2,396.9,13.27,18.9
0.09378,12.5,7.87,"0",0.524,5.889,39,5.4509,5,311,15.2,390.5,15.71,21.7
0.62976,0,8.14,"0",0.538,5.949,61.8,4.7075,4,307,21,396.9,8.26,20.4
0.63796,0,8.14,"0",0.538,6.096,84.5,4.4619,4,307,21,380.02,10.26,18.2
0.62739,0,8.14,"0",0.538,5.834,56.5,4.4986,4,307,21,395.62,8.47,19.9
1.05393,0,8.14,"0",0.538,5.935,29.3,4.4986,4,307,21,386.85,6.58,23.1
0.7842,0,8.14,"0",0.538,5.99,81.7,4.2579,4,307,21,386.75,14.67,17.5
0.80271,0,8.14,"0",0.538,5.456,36.6,3.7965,4,307,21,288.99,11.69,20.2
0.7258,0,8.14,"0",0.538,5.727,69.5,3.7965,4,307,21,390.95,11.28,18.2
1.25179,0,8.14,"0",0.538,5.57,98.1,3.7979,4,307,21,376.57,21.02,13.6
0.85204,0,8.14,"0",0.538,5.965,89.2,4.0123,4,307,21,392.53,13.83,19.6
1.23247,0,8.14,"0",0.538,6.142,91.7,3.9769,4,307,21,396.9,18.72,15.2
0.98843,0,8.14,"0",0.538,5.813,100,4.0952,4,307,21,394.54,19.88,14.5
0.75026,0,8.14,"0",0.538,5.924,94.1,4.3996,4,307,21,394.33,16.3,15.6
0.84054,0,8.14,"0",0.538,5.599,85.7,4.4546,4,307,21,303.42,16.51,13.9
0.67191,0,8.14,"0",0.538,5.813,90.3,4.682,4,307,21,376.88,14.81,16.6
0.95577,0,8.14,"0",0.538,6.047,88.8,4.4534,4,307,21,306.38,17.28,14.8
0.77299,0,8.14,"0",0.538,6.495,94.4,4.4547,4,307,21,387.94,12.8,18.4
1.00245,0,8.14,"0",0.538,6.674,87.3,4.239,4,307,21,380.23,11.98,21
1.13081,0,8.14,"0",0.538,5.713,94.1,4.233,4,307,21,360.17,22.6,12.7
1.35472,0,8.14,"0",0.538,6.072,100,4.175,4,307,21,376.73,13.04,14.5
1.38799,0,8.14,"0",0.538,5.95,82,3.99,4,307,21,232.6,27.71,13.2
1.15172,0,8.14,"0",0.538,5.701,95,3.7872,4,307,21,358.77,18.35,13.1
1.61282,0,8.14,"0",0.538,6.096,96.9,3.7598,4,307,21,248.31,20.34,13.5
0.06417,0,5.96,"0",0.499,5.933,68.2,3.3603,5,279,19.2,396.9,9.68,18.9
0.09744,0,5.96,"0",0.499,5.841,61.4,3.3779,5,279,19.2,377.56,11.41,20
0.08014,0,5.96,"0",0.499,5.85,41.5,3.9342,5,279,19.2,396.9,8.77,21
0.17505,0,5.96,"0",0.499,5.966,30.2,3.8473,5,279,19.2,393.43,10.13,24.7
0.02763,75,2.95,"0",0.428,6.595,21.8,5.4011,3,252,18.3,395.63,4.32,30.8
0.03359,75,2.95,"0",0.428,7.024,15.8,5.4011,3,252,18.3,395.62,1.98,34.9
0.12744,0,6.91,"0",0.448,6.77,2.9,5.7209,3,233,17.9,385.41,4.84,26.6
0.1415,0,6.91,"0",0.448,6.169,6.6,5.7209,3,233,17.9,383.37,5.81,25.3
0.15936,0,6.91,"0",0.448,6.211,6.5,5.7209,3,233,17.9,394.46,7.44,24.7
0.12269,0,6.91,"0",0.448,6.069,40,5.7209,3,233,17.9,389.39,9.55,21.2
0.17142,0,6.91,"0",0.448,5.682,33.8,5.1004,3,233,17.9,396.9,10.21,19.3
0.18836,0,6.91,"0",0.448,5.786,33.3,5.1004,3,233,17.9,396.9,14.15,20
0.22927,0,6.91,"0",0.448,6.03,85.5,5.6894,3,233,17.9,392.74,18.8,16.6
0.25387,0,6.91,"0",0.448,5.399,95.3,5.87,3,233,17.9,396.9,30.81,14.4
0.21977,0,6.91,"0",0.448,5.602,62,6.0877,3,233,17.9,396.9,16.2,19.4
0.08873,21,5.64,"0",0.439,5.963,45.7,6.8147,4,243,16.8,395.56,13.45,19.7
0.04337,21,5.64,"0",0.439,6.115,63,6.8147,4,243,16.8,393.97,9.43,20.5
0.0536,21,5.64,"0",0.439,6.511,21.1,6.8147,4,243,16.8,396.9,5.28,25
0.04981,21,5.64,"0",0.439,5.998,21.4,6.8147,4,243,16.8,396.9,8.43,23.4
0.0136,75,4,"0",0.41,5.888,47.6,7.3197,3,469,21.1,396.9,14.8,18.9
0.01311,90,1.22,"0",0.403,7.249,21.9,8.6966,5,226,17.9,395.93,4.81,35.4
0.02055,85,0.74,"0",0.41,6.383,35.7,9.1876,2,313,17.3,396.9,5.77,24.7
0.01432,100,1.32,"0",0.411,6.816,40.5,8.3248,5,256,15.1,392.9,3.95,31.6
0.15445,25,5.13,"0",0.453,6.145,29.2,7.8148,8,284,19.7,390.68,6.86,23.3
0.10328,25,5.13,"0",0.453,5.927,47.2,6.932,8,284,19.7,396.9,9.22,19.6
0.14932,25,5.13,"0",0.453,5.741,66.2,7.2254,8,284,19.7,395.11,13.15,18.7
0.17171,25,5.13,"0",0.453,5.966,93.4,6.8185,8,284,19.7,378.08,14.44,16
0.11027,25,5.13,"0",0.453,6.456,67.8,7.2255,8,284,19.7,396.9,6.73,22.2
0.1265,25,5.13,"0",0.453,6.762,43.4,7.9809,8,284,19.7,395.58,9.5,25
0.01951,17.5,1.38,"0",0.4161,7.104,59.5,9.2229,3,216,18.6,393.24,8.05,33
0.03584,80,3.37,"0",0.398,6.29,17.8,6.6115,4,337,16.1,396.9,4.67,23.5
0.04379,80,3.37,"0",0.398,5.787,31.1,6.6115,4,337,16.1,396.9,10.24,19.4
0.05789,12.5,6.07,"0",0.409,5.878,21.4,6.498,4,345,18.9,396.21,8.1,22
0.13554,12.5,6.07,"0",0.409,5.594,36.8,6.498,4,345,18.9,396.9,13.09,17.4
0.12816,12.5,6.07,"0",0.409,5.885,33,6.498,4,345,18.9,396.9,8.79,20.9
0.08826,0,10.81,"0",0.413,6.417,6.6,5.2873,4,305,19.2,383.73,6.72,24.2
0.15876,0,10.81,"0",0.413,5.961,17.5,5.2873,4,305,19.2,376.94,9.88,21.7
0.09164,0,10.81,"0",0.413,6.065,7.8,5.2873,4,305,19.2,390.91,5.52,22.8
0.19539,0,10.81,"0",0.413,6.245,6.2,5.2873,4,305,19.2,377.17,7.54,23.4
0.07896,0,12.83,"0",0.437,6.273,6,4.2515,5,398,18.7,394.92,6.78,24.1
0.09512,0,12.83,"0",0.437,6.286,45,4.5026,5,398,18.7,383.23,8.94,21.4
0.10153,0,12.83,"0",0.437,6.279,74.5,4.0522,5,398,18.7,373.66,11.97,20
0.08707,0,12.83,"0",0.437,6.14,45.8,4.0905,5,398,18.7,386.96,10.27,20.8
0.05646,0,12.83,"0",0.437,6.232,53.7,5.0141,5,398,18.7,386.4,12.34,21.2
0.08387,0,12.83,"0",0.437,5.874,36.6,4.5026,5,398,18.7,396.06,9.1,20.3
0.04113,25,4.86,"0",0.426,6.727,33.5,5.4007,4,281,19,396.9,5.29,28
0.04462,25,4.86,"0",0.426,6.619,70.4,5.4007,4,281,19,395.63,7.22,23.9
0.03659,25,4.86,"0",0.426,6.302,32.2,5.4007,4,281,19,396.9,6.72,24.8
0.03551,25,4.86,"0",0.426,6.167,46.7,5.4007,4,281,19,390.64,7.51,22.9
0.05059,0,4.49,"0",0.449,6.389,48,4.7794,3,247,18.5,396.9,9.62,23.9
0.05735,0,4.49,"0",0.449,6.63,56.1,4.4377,3,247,18.5,392.3,6.53,26.6
0.05188,0,4.49,"0",0.449,6.015,45.1,4.4272,3,247,18.5,395.99,12.86,22.5
0.07151,0,4.49,"0",0.449,6.121,56.8,3.7476,3,247,18.5,395.15,8.44,22.2
0.0566,0,3.41,"0",0.489,7.007,86.3,3.4217,2,270,17.8,396.9,5.5,23.6
0.05302,0,3.41,"0",0.489,7.079,63.1,3.4145,2,270,17.8,396.06,5.7,28.7
0.04684,0,3.41,"0",0.489,6.417,66.1,3.0923,2,270,17.8,392.18,8.81,22.6
0.03932,0,3.41,"0",0.489,6.405,73.9,3.0921,2,270,17.8,393.55,8.2,22
0.04203,28,15.04,"0",0.464,6.442,53.6,3.6659,4,270,18.2,395.01,8.16,22.9
0.02875,28,15.04,"0",0.464,6.211,28.9,3.6659,4,270,18.2,396.33,6.21,25
0.04294,28,15.04,"0",0.464,6.249,77.3,3.615,4,270,18.2,396.9,10.59,20.6
0.12204,0,2.89,"0",0.445,6.625,57.8,3.4952,2,276,18,357.98,6.65,28.4
0.11504,0,2.89,"0",0.445,6.163,69.6,3.4952,2,276,18,391.83,11.34,21.4
0.12083,0,2.89,"0",0.445,8.069,76,3.4952,2,276,18,396.9,4.21,38.7
0.08187,0,2.89,"0",0.445,7.82,36.9,3.4952,2,276,18,393.53,3.57,43.8
0.0686,0,2.89,"0",0.445,7.416,62.5,3.4952,2,276,18,396.9,6.19,33.2
0.14866,0,8.56,"0",0.52,6.727,79.9,2.7778,5,384,20.9,394.76,9.42,27.5
0.11432,0,8.56,"0",0.52,6.781,71.3,2.8561,5,384,20.9,395.58,7.67,26.5
0.22876,0,8.56,"0",0.52,6.405,85.4,2.7147,5,384,20.9,70.8,10.63,18.6
0.21161,0,8.56,"0",0.52,6.137,87.4,2.7147,5,384,20.9,394.47,13.44,19.3
0.1396,0,8.56,"0",0.52,6.167,90,2.421,5,384,20.9,392.69,12.33,20.1
0.13262,0,8.56,"0",0.52,5.851,96.7,2.1069,5,384,20.9,394.05,16.47,19.5
0.1712,0,8.56,"0",0.52,5.836,91.9,2.211,5,384,20.9,395.67,18.66,19.5
0.13117,0,8.56,"0",0.52,6.127,85.2,2.1224,5,384,20.9,387.69,14.09,20.4
0.12802,0,8.56,"0",0.52,6.474,97.1,2.4329,5,384,20.9,395.24,12.27,19.8
0.26363,0,8.56,"0",0.52,6.229,91.2,2.5451,5,384,20.9,391.23,15.55,19.4
0.10793,0,8.56,"0",0.52,6.195,54.4,2.7778,5,384,20.9,393.49,13,21.7
0.10084,0,10.01,"0",0.547,6.715,81.6,2.6775,6,432,17.8,395.59,10.16,22.8
0.12329,0,10.01,"0",0.547,5.913,92.9,2.3534,6,432,17.8,394.95,16.21,18.8
0.22212,0,10.01,"0",0.547,6.092,95.4,2.548,6,432,17.8,396.9,17.09,18.7
0.14231,0,10.01,"0",0.547,6.254,84.2,2.2565,6,432,17.8,388.74,10.45,18.5
0.17134,0,10.01,"0",0.547,5.928,88.2,2.4631,6,432,17.8,344.91,15.76,18.3
0.13158,0,10.01,"0",0.547,6.176,72.5,2.7301,6,432,17.8,393.3,12.04,21.2
0.15098,0,10.01,"0",0.547,6.021,82.6,2.7474,6,432,17.8,394.51,10.3,19.2
0.13058,0,10.01,"0",0.547,5.872,73.1,2.4775,6,432,17.8,338.63,15.37,20.4
0.14476,0,10.01,"0",0.547,5.731,65.2,2.7592,6,432,17.8,391.5,13.61,19.3
0.06899,0,25.65,"0",0.581,5.87,69.7,2.2577,2,188,19.1,389.15,14.37,22
0.07165,0,25.65,"0",0.581,6.004,84.1,2.1974,2,188,19.1,377.67,14.27,20.3
0.09299,0,25.65,"0",0.581,5.961,92.9,2.0869,2,188,19.1,378.09,17.93,20.5
0.15038,0,25.65,"0",0.581,5.856,97,1.9444,2,188,19.1,370.31,25.41,17.3
0.09849,0,25.65,"0",0.581,5.879,95.8,2.0063,2,188,19.1,379.38,17.58,18.8
0.16902,0,25.65,"0",0.581,5.986,88.4,1.9929,2,188,19.1,385.02,14.81,21.4
0.38735,0,25.65,"0",0.581,5.613,95.6,1.7572,2,188,19.1,359.29,27.26,15.7
0.25915,0,21.89,"0",0.624,5.693,96,1.7883,4,437,21.2,392.11,17.19,16.2
0.32543,0,21.89,"0",0.624,6.431,98.8,1.8125,4,437,21.2,396.9,15.39,18
0.88125,0,21.89,"0",0.624,5.637,94.7,1.9799,4,437,21.2,396.9,18.34,14.3
0.34006,0,21.89,"0",0.624,6.458,98.9,2.1185,4,437,21.2,395.04,12.6,19.2
1.19294,0,21.89,"0",0.624,6.326,97.7,2.271,4,437,21.2,396.9,12.26,19.6
0.59005,0,21.89,"0",0.624,6.372,97.9,2.3274,4,437,21.2,385.76,11.12,23
0.32982,0,21.89,"0",0.624,5.822,95.4,2.4699,4,437,21.2,388.69,15.03,18.4
0.97617,0,21.89,"0",0.624,5.757,98.4,2.346,4,437,21.2,262.76,17.31,15.6
0.55778,0,21.89,"0",0.624,6.335,98.2,2.1107,4,437,21.2,394.67,16.96,18.1
0.32264,0,21.89,"0",0.624,5.942,93.5,1.9669,4,437,21.2,378.25,16.9,17.4
0.35233,0,21.89,"0",0.624,6.454,98.4,1.8498,4,437,21.2,394.08,14.59,17.1
0.2498,0,21.89,"0",0.624,5.857,98.2,1.6686,4,437,21.2,392.04,21.32,13.3
0.54452,0,21.89,"0",0.624,6.151,97.9,1.6687,4,437,21.2,396.9,18.46,17.8
0.2909,0,21.89,"0",0.624,6.174,93.6,1.6119,4,437,21.2,388.08,24.16,14
1.62864,0,21.89,"0",0.624,5.019,100,1.4394,4,437,21.2,396.9,34.41,14.4
3.32105,0,19.58,"1",0.871,5.403,100,1.3216,5,403,14.7,396.9,26.82,13.4
4.0974,0,19.58,"0",0.871,5.468,100,1.4118,5,403,14.7,396.9,26.42,15.6
2.77974,0,19.58,"0",0.871,4.903,97.8,1.3459,5,403,14.7,396.9,29.29,11.8
2.37934,0,19.58,"0",0.871,6.13,100,1.4191,5,403,14.7,172.91,27.8,13.8
2.15505,0,19.58,"0",0.871,5.628,100,1.5166,5,403,14.7,169.27,16.65,15.6
2.36862,0,19.58,"0",0.871,4.926,95.7,1.4608,5,403,14.7,391.71,29.53,14.6
2.33099,0,19.58,"0",0.871,5.186,93.8,1.5296,5,403,14.7,356.99,28.32,17.8
2.73397,0,19.58,"0",0.871,5.597,94.9,1.5257,5,403,14.7,351.85,21.45,15.4
1.6566,0,19.58,"0",0.871,6.122,97.3,1.618,5,403,14.7,372.8,14.1,21.5
1.49632,0,19.58,"0",0.871,5.404,100,1.5916,5,403,14.7,341.6,13.28,19.6
1.12658,0,19.58,"1",0.871,5.012,88,1.6102,5,403,14.7,343.28,12.12,15.3
2.14918,0,19.58,"0",0.871,5.709,98.5,1.6232,5,403,14.7,261.95,15.79,19.4
1.41385,0,19.58,"1",0.871,6.129,96,1.7494,5,403,14.7,321.02,15.12,17
3.53501,0,19.58,"1",0.871,6.152,82.6,1.7455,5,403,14.7,88.01,15.02,15.6
2.44668,0,19.58,"0",0.871,5.272,94,1.7364,5,403,14.7,88.63,16.14,13.1
1.22358,0,19.58,"0",0.605,6.943,97.4,1.8773,5,403,14.7,363.43,4.59,41.3
1.34284,0,19.58,"0",0.605,6.066,100,1.7573,5,403,14.7,353.89,6.43,24.3
1.42502,0,19.58,"0",0.871,6.51,100,1.7659,5,403,14.7,364.31,7.39,23.3
1.27346,0,19.58,"1",0.605,6.25,92.6,1.7984,5,403,14.7,338.92,5.5,27
1.46336,0,19.58,"0",0.605,7.489,90.8,1.9709,5,403,14.7,374.43,1.73,50
1.83377,0,19.58,"1",0.605,7.802,98.2,2.0407,5,403,14.7,389.61,1.92,50
1.51902,0,19.58,"1",0.605,8.375,93.9,2.162,5,403,14.7,388.45,3.32,50
2.24236,0,19.58,"0",0.605,5.854,91.8,2.422,5,403,14.7,395.11,11.64,22.7
2.924,0,19.58,"0",0.605,6.101,93,2.2834,5,403,14.7,240.16,9.81,25
2.01019,0,19.58,"0",0.605,7.929,96.2,2.0459,5,403,14.7,369.3,3.7,50
1.80028,0,19.58,"0",0.605,5.877,79.2,2.4259,5,403,14.7,227.61,12.14,23.8
2.3004,0,19.58,"0",0.605,6.319,96.1,2.1,5,403,14.7,297.09,11.1,23.8
2.44953,0,19.58,"0",0.605,6.402,95.2,2.2625,5,403,14.7,330.04,11.32,22.3
1.20742,0,19.58,"0",0.605,5.875,94.6,2.4259,5,403,14.7,292.29,14.43,17.4
2.3139,0,19.58,"0",0.605,5.88,97.3,2.3887,5,403,14.7,348.13,12.03,19.1
0.13914,0,4.05,"0",0.51,5.572,88.5,2.5961,5,296,16.6,396.9,14.69,23.1
0.09178,0,4.05,"0",0.51,6.416,84.1,2.6463,5,296,16.6,395.5,9.04,23.6
0.08447,0,4.05,"0",0.51,5.859,68.7,2.7019,5,296,16.6,393.23,9.64,22.6
0.06664,0,4.05,"0",0.51,6.546,33.1,3.1323,5,296,16.6,390.96,5.33,29.4
0.07022,0,4.05,"0",0.51,6.02,47.2,3.5549,5,296,16.6,393.23,10.11,23.2
0.05425,0,4.05,"0",0.51,6.315,73.4,3.3175,5,296,16.6,395.6,6.29,24.6
0.06642,0,4.05,"0",0.51,6.86,74.4,2.9153,5,296,16.6,391.27,6.92,29.9
0.0578,0,2.46,"0",0.488,6.98,58.4,2.829,3,193,17.8,396.9,5.04,37.2
0.06588,0,2.46,"0",0.488,7.765,83.3,2.741,3,193,17.8,395.56,7.56,39.8
0.06888,0,2.46,"0",0.488,6.144,62.2,2.5979,3,193,17.8,396.9,9.45,36.2
0.09103,0,2.46,"0",0.488,7.155,92.2,2.7006,3,193,17.8,394.12,4.82,37.9
0.10008,0,2.46,"0",0.488,6.563,95.6,2.847,3,193,17.8,396.9,5.68,32.5
0.08308,0,2.46,"0",0.488,5.604,89.8,2.9879,3,193,17.8,391,13.98,26.4
0.06047,0,2.46,"0",0.488,6.153,68.8,3.2797,3,193,17.8,387.11,13.15,29.6
0.05602,0,2.46,"0",0.488,7.831,53.6,3.1992,3,193,17.8,392.63,4.45,50
0.07875,45,3.44,"0",0.437,6.782,41.1,3.7886,5,398,15.2,393.87,6.68,32
0.12579,45,3.44,"0",0.437,6.556,29.1,4.5667,5,398,15.2,382.84,4.56,29.8
0.0837,45,3.44,"0",0.437,7.185,38.9,4.5667,5,398,15.2,396.9,5.39,34.9
0.09068,45,3.44,"0",0.437,6.951,21.5,6.4798,5,398,15.2,377.68,5.1,37
0.06911,45,3.44,"0",0.437,6.739,30.8,6.4798,5,398,15.2,389.71,4.69,30.5
0.08664,45,3.44,"0",0.437,7.178,26.3,6.4798,5,398,15.2,390.49,2.87,36.4
0.02187,60,2.93,"0",0.401,6.8,9.9,6.2196,1,265,15.6,393.37,5.03,31.1
0.01439,60,2.93,"0",0.401,6.604,18.8,6.2196,1,265,15.6,376.7,4.38,29.1
0.01381,80,0.46,"0",0.422,7.875,32,5.6484,4,255,14.4,394.23,2.97,50
0.04011,80,1.52,"0",0.404,7.287,34.1,7.309,2,329,12.6,396.9,4.08,33.3
0.04666,80,1.52,"0",0.404,7.107,36.6,7.309,2,329,12.6,354.31,8.61,30.3
0.03768,80,1.52,"0",0.404,7.274,38.3,7.309,2,329,12.6,392.2,6.62,34.6
0.0315,95,1.47,"0",0.403,6.975,15.3,7.6534,3,402,17,396.9,4.56,34.9
0.01778,95,1.47,"0",0.403,7.135,13.9,7.6534,3,402,17,384.3,4.45,32.9
0.03445,82.5,2.03,"0",0.415,6.162,38.4,6.27,2,348,14.7,393.77,7.43,24.1
0.02177,82.5,2.03,"0",0.415,7.61,15.7,6.27,2,348,14.7,395.38,3.11,42.3
0.0351,95,2.68,"0",0.4161,7.853,33.2,5.118,4,224,14.7,392.78,3.81,48.5
0.02009,95,2.68,"0",0.4161,8.034,31.9,5.118,4,224,14.7,390.55,2.88,50
0.13642,0,10.59,"0",0.489,5.891,22.3,3.9454,4,277,18.6,396.9,10.87,22.6
0.22969,0,10.59,"0",0.489,6.326,52.5,4.3549,4,277,18.6,394.87,10.97,24.4
0.25199,0,10.59,"0",0.489,5.783,72.7,4.3549,4,277,18.6,389.43,18.06,22.5
0.13587,0,10.59,"1",0.489,6.064,59.1,4.2392,4,277,18.6,381.32,14.66,24.4
0.43571,0,10.59,"1",0.489,5.344,100,3.875,4,277,18.6,396.9,23.09,20
0.17446,0,10.59,"1",0.489,5.96,92.1,3.8771,4,277,18.6,393.25,17.27,21.7
0.37578,0,10.59,"1",0.489,5.404,88.6,3.665,4,277,18.6,395.24,23.98,19.3
0.21719,0,10.59,"1",0.489,5.807,53.8,3.6526,4,277,18.6,390.94,16.03,22.4
0.14052,0,10.59,"0",0.489,6.375,32.3,3.9454,4,277,18.6,385.81,9.38,28.1
0.28955,0,10.59,"0",0.489,5.412,9.8,3.5875,4,277,18.6,348.93,29.55,23.7
0.19802,0,10.59,"0",0.489,6.182,42.4,3.9454,4,277,18.6,393.63,9.47,25
0.0456,0,13.89,"1",0.55,5.888,56,3.1121,5,276,16.4,392.8,13.51,23.3
0.07013,0,13.89,"0",0.55,6.642,85.1,3.4211,5,276,16.4,392.78,9.69,28.7
0.11069,0,13.89,"1",0.55,5.951,93.8,2.8893,5,276,16.4,396.9,17.92,21.5
0.11425,0,13.89,"1",0.55,6.373,92.4,3.3633,5,276,16.4,393.74,10.5,23
0.35809,0,6.2,"1",0.507,6.951,88.5,2.8617,8,307,17.4,391.7,9.71,26.7
0.40771,0,6.2,"1",0.507,6.164,91.3,3.048,8,307,17.4,395.24,21.46,21.7
0.62356,0,6.2,"1",0.507,6.879,77.7,3.2721,8,307,17.4,390.39,9.93,27.5
0.6147,0,6.2,"0",0.507,6.618,80.8,3.2721,8,307,17.4,396.9,7.6,30.1
0.31533,0,6.2,"0",0.504,8.266,78.3,2.8944,8,307,17.4,385.05,4.14,44.8
0.52693,0,6.2,"0",0.504,8.725,83,2.8944,8,307,17.4,382,4.63,50
0.38214,0,6.2,"0",0.504,8.04,86.5,3.2157,8,307,17.4,387.38,3.13,37.6
0.41238,0,6.2,"0",0.504,7.163,79.9,3.2157,8,307,17.4,372.08,6.36,31.6
0.29819,0,6.2,"0",0.504,7.686,17,3.3751,8,307,17.4,377.51,3.92,46.7
0.44178,0,6.2,"0",0.504,6.552,21.4,3.3751,8,307,17.4,380.34,3.76,31.5
0.537,0,6.2,"0",0.504,5.981,68.1,3.6715,8,307,17.4,378.35,11.65,24.3
0.46296,0,6.2,"0",0.504,7.412,76.9,3.6715,8,307,17.4,376.14,5.25,31.7
0.57529,0,6.2,"0",0.507,8.337,73.3,3.8384,8,307,17.4,385.91,2.47,41.7
0.33147,0,6.2,"0",0.507,8.247,70.4,3.6519,8,307,17.4,378.95,3.95,48.3
0.44791,0,6.2,"1",0.507,6.726,66.5,3.6519,8,307,17.4,360.2,8.05,29
0.33045,0,6.2,"0",0.507,6.086,61.5,3.6519,8,307,17.4,376.75,10.88,24
0.52058,0,6.2,"1",0.507,6.631,76.5,4.148,8,307,17.4,388.45,9.54,25.1
0.51183,0,6.2,"0",0.507,7.358,71.6,4.148,8,307,17.4,390.07,4.73,31.5
0.08244,30,4.93,"0",0.428,6.481,18.5,6.1899,6,300,16.6,379.41,6.36,23.7
0.09252,30,4.93,"0",0.428,6.606,42.2,6.1899,6,300,16.6,383.78,7.37,23.3
0.11329,30,4.93,"0",0.428,6.897,54.3,6.3361,6,300,16.6,391.25,11.38,22
0.10612,30,4.93,"0",0.428,6.095,65.1,6.3361,6,300,16.6,394.62,12.4,20.1
0.1029,30,4.93,"0",0.428,6.358,52.9,7.0355,6,300,16.6,372.75,11.22,22.2
0.12757,30,4.93,"0",0.428,6.393,7.8,7.0355,6,300,16.6,374.71,5.19,23.7
0.20608,22,5.86,"0",0.431,5.593,76.5,7.9549,7,330,19.1,372.49,12.5,17.6
0.19133,22,5.86,"0",0.431,5.605,70.2,7.9549,7,330,19.1,389.13,18.46,18.5
0.33983,22,5.86,"0",0.431,6.108,34.9,8.0555,7,330,19.1,390.18,9.16,24.3
0.19657,22,5.86,"0",0.431,6.226,79.2,8.0555,7,330,19.1,376.14,10.15,20.5
0.16439,22,5.86,"0",0.431,6.433,49.1,7.8265,7,330,19.1,374.71,9.52,24.5
0.19073,22,5.86,"0",0.431,6.718,17.5,7.8265,7,330,19.1,393.74,6.56,26.2
0.1403,22,5.86,"0",0.431,6.487,13,7.3967,7,330,19.1,396.28,5.9,24.4
0.21409,22,5.86,"0",0.431,6.438,8.9,7.3967,7,330,19.1,377.07,3.59,24.8
0.08221,22,5.86,"0",0.431,6.957,6.8,8.9067,7,330,19.1,386.09,3.53,29.6
0.36894,22,5.86,"0",0.431,8.259,8.4,8.9067,7,330,19.1,396.9,3.54,42.8
0.04819,80,3.64,"0",0.392,6.108,32,9.2203,1,315,16.4,392.89,6.57,21.9
0.03548,80,3.64,"0",0.392,5.876,19.1,9.2203,1,315,16.4,395.18,9.25,20.9
0.01538,90,3.75,"0",0.394,7.454,34.2,6.3361,3,244,15.9,386.34,3.11,44
0.61154,20,3.97,"0",0.647,8.704,86.9,1.801,5,264,13,389.7,5.12,50
0.66351,20,3.97,"0",0.647,7.333,100,1.8946,5,264,13,383.29,7.79,36
0.65665,20,3.97,"0",0.647,6.842,100,2.0107,5,264,13,391.93,6.9,30.1
0.54011,20,3.97,"0",0.647,7.203,81.8,2.1121,5,264,13,392.8,9.59,33.8
0.53412,20,3.97,"0",0.647,7.52,89.4,2.1398,5,264,13,388.37,7.26,43.1
0.52014,20,3.97,"0",0.647,8.398,91.5,2.2885,5,264,13,386.86,5.91,48.8
0.82526,20,3.97,"0",0.647,7.327,94.5,2.0788,5,264,13,393.42,11.25,31
0.55007,20,3.97,"0",0.647,7.206,91.6,1.9301,5,264,13,387.89,8.1,36.5
0.76162,20,3.97,"0",0.647,5.56,62.8,1.9865,5,264,13,392.4,10.45,22.8
0.7857,20,3.97,"0",0.647,7.014,84.6,2.1329,5,264,13,384.07,14.79,30.7
0.57834,20,3.97,"0",0.575,8.297,67,2.4216,5,264,13,384.54,7.44,50
0.5405,20,3.97,"0",0.575,7.47,52.6,2.872,5,264,13,390.3,3.16,43.5
0.09065,20,6.96,"1",0.464,5.92,61.5,3.9175,3,223,18.6,391.34,13.65,20.7
0.29916,20,6.96,"0",0.464,5.856,42.1,4.429,3,223,18.6,388.65,13,21.1
0.16211,20,6.96,"0",0.464,6.24,16.3,4.429,3,223,18.6,396.9,6.59,25.2
0.1146,20,6.96,"0",0.464,6.538,58.7,3.9175,3,223,18.6,394.96,7.73,24.4
0.22188,20,6.96,"1",0.464,7.691,51.8,4.3665,3,223,18.6,390.77,6.58,35.2
0.05644,40,6.41,"1",0.447,6.758,32.9,4.0776,4,254,17.6,396.9,3.53,32.4
0.09604,40,6.41,"0",0.447,6.854,42.8,4.2673,4,254,17.6,396.9,2.98,32
0.10469,40,6.41,"1",0.447,7.267,49,4.7872,4,254,17.6,389.25,6.05,33.2
0.06127,40,6.41,"1",0.447,6.826,27.6,4.8628,4,254,17.6,393.45,4.16,33.1
0.07978,40,6.41,"0",0.447,6.482,32.1,4.1403,4,254,17.6,396.9,7.19,29.1
0.21038,20,3.33,"0",0.4429,6.812,32.2,4.1007,5,216,14.9,396.9,4.85,35.1
0.03578,20,3.33,"0",0.4429,7.82,64.5,4.6947,5,216,14.9,387.31,3.76,45.4
0.03705,20,3.33,"0",0.4429,6.968,37.2,5.2447,5,216,14.9,392.23,4.59,35.4
0.06129,20,3.33,"1",0.4429,7.645,49.7,5.2119,5,216,14.9,377.07,3.01,46
0.01501,90,1.21,"1",0.401,7.923,24.8,5.885,1,198,13.6,395.52,3.16,50
0.00906,90,2.97,"0",0.4,7.088,20.8,7.3073,1,285,15.3,394.72,7.85,32.2
0.01096,55,2.25,"0",0.389,6.453,31.9,7.3073,1,300,15.3,394.72,8.23,22
0.01965,80,1.76,"0",0.385,6.23,31.5,9.0892,1,241,18.2,341.6,12.93,20.1
0.03871,52.5,5.32,"0",0.405,6.209,31.3,7.3172,6,293,16.6,396.9,7.14,23.2
0.0459,52.5,5.32,"0",0.405,6.315,45.6,7.3172,6,293,16.6,396.9,7.6,22.3
0.04297,52.5,5.32,"0",0.405,6.565,22.9,7.3172,6,293,16.6,371.72,9.51,24.8
0.03502,80,4.95,"0",0.411,6.861,27.9,5.1167,4,245,19.2,396.9,3.33,28.5
0.07886,80,4.95,"0",0.411,7.148,27.7,5.1167,4,245,19.2,396.9,3.56,37.3
0.03615,80,4.95,"0",0.411,6.63,23.4,5.1167,4,245,19.2,396.9,4.7,27.9
0.08265,0,13.92,"0",0.437,6.127,18.4,5.5027,4,289,16,396.9,8.58,23.9
0.08199,0,13.92,"0",0.437,6.009,42.3,5.5027,4,289,16,396.9,10.4,21.7
0.12932,0,13.92,"0",0.437,6.678,31.1,5.9604,4,289,16,396.9,6.27,28.6
0.05372,0,13.92,"0",0.437,6.549,51,5.9604,4,289,16,392.85,7.39,27.1
0.14103,0,13.92,"0",0.437,5.79,58,6.32,4,289,16,396.9,15.84,20.3
0.06466,70,2.24,"0",0.4,6.345,20.1,7.8278,5,358,14.8,368.24,4.97,22.5
0.05561,70,2.24,"0",0.4,7.041,10,7.8278,5,358,14.8,371.58,4.74,29
0.04417,70,2.24,"0",0.4,6.871,47.4,7.8278,5,358,14.8,390.86,6.07,24.8
0.03537,34,6.09,"0",0.433,6.59,40.4,5.4917,7,329,16.1,395.75,9.5,22
0.09266,34,6.09,"0",0.433,6.495,18.4,5.4917,7,329,16.1,383.61,8.67,26.4
0.1,34,6.09,"0",0.433,6.982,17.7,5.4917,7,329,16.1,390.43,4.86,33.1
0.05515,33,2.18,"0",0.472,7.236,41.1,4.022,7,222,18.4,393.68,6.93,36.1
0.05479,33,2.18,"0",0.472,6.616,58.1,3.37,7,222,18.4,393.36,8.93,28.4
0.07503,33,2.18,"0",0.472,7.42,71.9,3.0992,7,222,18.4,396.9,6.47,33.4
0.04932,33,2.18,"0",0.472,6.849,70.3,3.1827,7,222,18.4,396.9,7.53,28.2
0.49298,0,9.9,"0",0.544,6.635,82.5,3.3175,4,304,18.4,396.9,4.54,22.8
0.3494,0,9.9,"0",0.544,5.972,76.7,3.1025,4,304,18.4,396.24,9.97,20.3
2.63548,0,9.9,"0",0.544,4.973,37.8,2.5194,4,304,18.4,350.45,12.64,16.1
0.79041,0,9.9,"0",0.544,6.122,52.8,2.6403,4,304,18.4,396.9,5.98,22.1
0.26169,0,9.9,"0",0.544,6.023,90.4,2.834,4,304,18.4,396.3,11.72,19.4
0.26938,0,9.9,"0",0.544,6.266,82.8,3.2628,4,304,18.4,393.39,7.9,21.6
0.3692,0,9.9,"0",0.544,6.567,87.3,3.6023,4,304,18.4,395.69,9.28,23.8
0.25356,0,9.9,"0",0.544,5.705,77.7,3.945,4,304,18.4,396.42,11.5,16.2
0.31827,0,9.9,"0",0.544,5.914,83.2,3.9986,4,304,18.4,390.7,18.33,17.8
0.24522,0,9.9,"0",0.544,5.782,71.7,4.0317,4,304,18.4,396.9,15.94,19.8
0.40202,0,9.9,"0",0.544,6.382,67.2,3.5325,4,304,18.4,395.21,10.36,23.1
0.47547,0,9.9,"0",0.544,6.113,58.8,4.0019,4,304,18.4,396.23,12.73,21
0.1676,0,7.38,"0",0.493,6.426,52.3,4.5404,5,287,19.6,396.9,7.2,23.8
0.18159,0,7.38,"0",0.493,6.376,54.3,4.5404,5,287,19.6,396.9,6.87,23.1
0.35114,0,7.38,"0",0.493,6.041,49.9,4.7211,5,287,19.6,396.9,7.7,20.4
0.28392,0,7.38,"0",0.493,5.708,74.3,4.7211,5,287,19.6,391.13,11.74,18.5
0.34109,0,7.38,"0",0.493,6.415,40.1,4.7211,5,287,19.6,396.9,6.12,25
0.19186,0,7.38,"0",0.493,6.431,14.7,5.4159,5,287,19.6,393.68,5.08,24.6
0.30347,0,7.38,"0",0.493,6.312,28.9,5.4159,5,287,19.6,396.9,6.15,23
0.24103,0,7.38,"0",0.493,6.083,43.7,5.4159,5,287,19.6,396.9,12.79,22.2
0.06617,0,3.24,"0",0.46,5.868,25.8,5.2146,4,430,16.9,382.44,9.97,19.3
0.06724,0,3.24,"0",0.46,6.333,17.2,5.2146,4,430,16.9,375.21,7.34,22.6
0.04544,0,3.24,"0",0.46,6.144,32.2,5.8736,4,430,16.9,368.57,9.09,19.8
0.05023,35,6.06,"0",0.4379,5.706,28.4,6.6407,1,304,16.9,394.02,12.43,17.1
0.03466,35,6.06,"0",0.4379,6.031,23.3,6.6407,1,304,16.9,362.25,7.83,19.4
0.05083,0,5.19,"0",0.515,6.316,38.1,6.4584,5,224,20.2,389.71,5.68,22.2
0.03738,0,5.19,"0",0.515,6.31,38.5,6.4584,5,224,20.2,389.4,6.75,20.7
0.03961,0,5.19,"0",0.515,6.037,34.5,5.9853,5,224,20.2,396.9,8.01,21.1
0.03427,0,5.19,"0",0.515,5.869,46.3,5.2311,5,224,20.2,396.9,9.8,19.5
0.03041,0,5.19,"0",0.515,5.895,59.6,5.615,5,224,20.2,394.81,10.56,18.5
0.03306,0,5.19,"0",0.515,6.059,37.3,4.8122,5,224,20.2,396.14,8.51,20.6
0.05497,0,5.19,"0",0.515,5.985,45.4,4.8122,5,224,20.2,396.9,9.74,19
0.06151,0,5.19,"0",0.515,5.968,58.5,4.8122,5,224,20.2,396.9,9.29,18.7
0.01301,35,1.52,"0",0.442,7.241,49.3,7.0379,1,284,15.5,394.74,5.49,32.7
0.02498,0,1.89,"0",0.518,6.54,59.7,6.2669,1,422,15.9,389.96,8.65,16.5
0.02543,55,3.78,"0",0.484,6.696,56.4,5.7321,5,370,17.6,396.9,7.18,23.9
0.03049,55,3.78,"0",0.484,6.874,28.1,6.4654,5,370,17.6,387.97,4.61,31.2
0.03113,0,4.39,"0",0.442,6.014,48.5,8.0136,3,352,18.8,385.64,10.53,17.5
0.06162,0,4.39,"0",0.442,5.898,52.3,8.0136,3,352,18.8,364.61,12.67,17.2
0.0187,85,4.15,"0",0.429,6.516,27.7,8.5353,4,351,17.9,392.43,6.36,23.1
0.01501,80,2.01,"0",0.435,6.635,29.7,8.344,4,280,17,390.94,5.99,24.5
0.02899,40,1.25,"0",0.429,6.939,34.5,8.7921,1,335,19.7,389.85,5.89,26.6
0.06211,40,1.25,"0",0.429,6.49,44.4,8.7921,1,335,19.7,396.9,5.98,22.9
0.0795,60,1.69,"0",0.411,6.579,35.9,10.7103,4,411,18.3,370.78,5.49,24.1
0.07244,60,1.69,"0",0.411,5.884,18.5,10.7103,4,411,18.3,392.33,7.79,18.6
0.01709,90,2.02,"0",0.41,6.728,36.1,12.1265,5,187,17,384.46,4.5,30.1
0.04301,80,1.91,"0",0.413,5.663,21.9,10.5857,4,334,22,382.8,8.05,18.2
0.10659,80,1.91,"0",0.413,5.936,19.5,10.5857,4,334,22,376.04,5.57,20.6
8.98296,0,18.1,"1",0.77,6.212,97.4,2.1222,24,666,20.2,377.73,17.6,17.8
3.8497,0,18.1,"1",0.77,6.395,91,2.5052,24,666,20.2,391.34,13.27,21.7
5.20177,0,18.1,"1",0.77,6.127,83.4,2.7227,24,666,20.2,395.43,11.48,22.7
4.26131,0,18.1,"0",0.77,6.112,81.3,2.5091,24,666,20.2,390.74,12.67,22.6
4.54192,0,18.1,"0",0.77,6.398,88,2.5182,24,666,20.2,374.56,7.79,25
3.83684,0,18.1,"0",0.77,6.251,91.1,2.2955,24,666,20.2,350.65,14.19,19.9
3.67822,0,18.1,"0",0.77,5.362,96.2,2.1036,24,666,20.2,380.79,10.19,20.8
4.22239,0,18.1,"1",0.77,5.803,89,1.9047,24,666,20.2,353.04,14.64,16.8
3.47428,0,18.1,"1",0.718,8.78,82.9,1.9047,24,666,20.2,354.55,5.29,21.9
4.55587,0,18.1,"0",0.718,3.561,87.9,1.6132,24,666,20.2,354.7,7.12,27.5
3.69695,0,18.1,"0",0.718,4.963,91.4,1.7523,24,666,20.2,316.03,14,21.9
13.5222,0,18.1,"0",0.631,3.863,100,1.5106,24,666,20.2,131.42,13.33,23.1
4.89822,0,18.1,"0",0.631,4.97,100,1.3325,24,666,20.2,375.52,3.26,50
5.66998,0,18.1,"1",0.631,6.683,96.8,1.3567,24,666,20.2,375.33,3.73,50
6.53876,0,18.1,"1",0.631,7.016,97.5,1.2024,24,666,20.2,392.05,2.96,50
9.2323,0,18.1,"0",0.631,6.216,100,1.1691,24,666,20.2,366.15,9.53,50
8.26725,0,18.1,"1",0.668,5.875,89.6,1.1296,24,666,20.2,347.88,8.88,50
11.1081,0,18.1,"0",0.668,4.906,100,1.1742,24,666,20.2,396.9,34.77,13.8
18.4982,0,18.1,"0",0.668,4.138,100,1.137,24,666,20.2,396.9,37.97,13.8
19.6091,0,18.1,"0",0.671,7.313,97.9,1.3163,24,666,20.2,396.9,13.44,15
15.288,0,18.1,"0",0.671,6.649,93.3,1.3449,24,666,20.2,363.02,23.24,13.9
9.82349,0,18.1,"0",0.671,6.794,98.8,1.358,24,666,20.2,396.9,21.24,13.3
23.6482,0,18.1,"0",0.671,6.38,96.2,1.3861,24,666,20.2,396.9,23.69,13.1
17.8667,0,18.1,"0",0.671,6.223,100,1.3861,24,666,20.2,393.74,21.78,10.2
88.9762,0,18.1,"0",0.671,6.968,91.9,1.4165,24,666,20.2,396.9,17.21,10.4
15.8744,0,18.1,"0",0.671,6.545,99.1,1.5192,24,666,20.2,396.9,21.08,10.9
9.18702,0,18.1,"0",0.7,5.536,100,1.5804,24,666,20.2,396.9,23.6,11.3
7.99248,0,18.1,"0",0.7,5.52,100,1.5331,24,666,20.2,396.9,24.56,12.3
20.0849,0,18.1,"0",0.7,4.368,91.2,1.4395,24,666,20.2,285.83,30.63,8.8
16.8118,0,18.1,"0",0.7,5.277,98.1,1.4261,24,666,20.2,396.9,30.81,7.2
24.3938,0,18.1,"0",0.7,4.652,100,1.4672,24,666,20.2,396.9,28.28,10.5
22.5971,0,18.1,"0",0.7,5,89.5,1.5184,24,666,20.2,396.9,31.99,7.4
14.3337,0,18.1,"0",0.7,4.88,100,1.5895,24,666,20.2,372.92,30.62,10.2
8.15174,0,18.1,"0",0.7,5.39,98.9,1.7281,24,666,20.2,396.9,20.85,11.5
6.96215,0,18.1,"0",0.7,5.713,97,1.9265,24,666,20.2,394.43,17.11,15.1
5.29305,0,18.1,"0",0.7,6.051,82.5,2.1678,24,666,20.2,378.38,18.76,23.2
11.5779,0,18.1,"0",0.7,5.036,97,1.77,24,666,20.2,396.9,25.68,9.7
8.64476,0,18.1,"0",0.693,6.193,92.6,1.7912,24,666,20.2,396.9,15.17,13.8
13.3598,0,18.1,"0",0.693,5.887,94.7,1.7821,24,666,20.2,396.9,16.35,12.7
8.71675,0,18.1,"0",0.693,6.471,98.8,1.7257,24,666,20.2,391.98,17.12,13.1
5.87205,0,18.1,"0",0.693,6.405,96,1.6768,24,666,20.2,396.9,19.37,12.5
7.67202,0,18.1,"0",0.693,5.747,98.9,1.6334,24,666,20.2,393.1,19.92,8.5
38.3518,0,18.1,"0",0.693,5.453,100,1.4896,24,666,20.2,396.9,30.59,5
9.91655,0,18.1,"0",0.693,5.852,77.8,1.5004,24,666,20.2,338.16,29.97,6.3
25.0461,0,18.1,"0",0.693,5.987,100,1.5888,24,666,20.2,396.9,26.77,5.6
14.2362,0,18.1,"0",0.693,6.343,100,1.5741,24,666,20.2,396.9,20.32,7.2
9.59571,0,18.1,"0",0.693,6.404,100,1.639,24,666,20.2,376.11,20.31,12.1
24.8017,0,18.1,"0",0.693,5.349,96,1.7028,24,666,20.2,396.9,19.77,8.3
41.5292,0,18.1,"0",0.693,5.531,85.4,1.6074,24,666,20.2,329.46,27.38,8.5
67.9208,0,18.1,"0",0.693,5.683,100,1.4254,24,666,20.2,384.97,22.98,5
20.7162,0,18.1,"0",0.659,4.138,100,1.1781,24,666,20.2,370.22,23.34,11.9
11.9511,0,18.1,"0",0.659,5.608,100,1.2852,24,666,20.2,332.09,12.13,27.9
7.40389,0,18.1,"0",0.597,5.617,97.9,1.4547,24,666,20.2,314.64,26.4,17.2
14.4383,0,18.1,"0",0.597,6.852,100,1.4655,24,666,20.2,179.36,19.78,27.5
51.1358,0,18.1,"0",0.597,5.757,100,1.413,24,666,20.2,2.6,10.11,15
14.0507,0,18.1,"0",0.597,6.657,100,1.5275,24,666,20.2,35.05,21.22,17.2
18.811,0,18.1,"0",0.597,4.628,100,1.5539,24,666,20.2,28.79,34.37,17.9
28.6558,0,18.1,"0",0.597,5.155,100,1.5894,24,666,20.2,210.97,20.08,16.3
45.7461,0,18.1,"0",0.693,4.519,100,1.6582,24,666,20.2,88.27,36.98,7
18.0846,0,18.1,"0",0.679,6.434,100,1.8347,24,666,20.2,27.25,29.05,7.2
10.8342,0,18.1,"0",0.679,6.782,90.8,1.8195,24,666,20.2,21.57,25.79,7.5
25.9406,0,18.1,"0",0.679,5.304,89.1,1.6475,24,666,20.2,127.36,26.64,10.4
73.5341,0,18.1,"0",0.679,5.957,100,1.8026,24,666,20.2,16.45,20.62,8.8
11.8123,0,18.1,"0",0.718,6.824,76.5,1.794,24,666,20.2,48.45,22.74,8.4
11.0874,0,18.1,"0",0.718,6.411,100,1.8589,24,666,20.2,318.75,15.02,16.7
7.02259,0,18.1,"0",0.718,6.006,95.3,1.8746,24,666,20.2,319.98,15.7,14.2
12.0482,0,18.1,"0",0.614,5.648,87.6,1.9512,24,666,20.2,291.55,14.1,20.8
7.05042,0,18.1,"0",0.614,6.103,85.1,2.0218,24,666,20.2,2.52,23.29,13.4
8.79212,0,18.1,"0",0.584,5.565,70.6,2.0635,24,666,20.2,3.65,17.16,11.7
15.8603,0,18.1,"0",0.679,5.896,95.4,1.9096,24,666,20.2,7.68,24.39,8.3
12.2472,0,18.1,"0",0.584,5.837,59.7,1.9976,24,666,20.2,24.65,15.69,10.2
37.6619,0,18.1,"0",0.679,6.202,78.7,1.8629,24,666,20.2,18.82,14.52,10.9
7.36711,0,18.1,"0",0.679,6.193,78.1,1.9356,24,666,20.2,96.73,21.52,11
9.33889,0,18.1,"0",0.679,6.38,95.6,1.9682,24,666,20.2,60.72,24.08,9.5
8.49213,0,18.1,"0",0.584,6.348,86.1,2.0527,24,666,20.2,83.45,17.64,14.5
10.0623,0,18.1,"0",0.584,6.833,94.3,2.0882,24,666,20.2,81.33,19.69,14.1
6.44405,0,18.1,"0",0.584,6.425,74.8,2.2004,24,666,20.2,97.95,12.03,16.1
5.58107,0,18.1,"0",0.713,6.436,87.9,2.3158,24,666,20.2,100.19,16.22,14.3
13.9134,0,18.1,"0",0.713,6.208,95,2.2222,24,666,20.2,100.63,15.17,11.7
11.1604,0,18.1,"0",0.74,6.629,94.6,2.1247,24,666,20.2,109.85,23.27,13.4
14.4208,0,18.1,"0",0.74,6.461,93.3,2.0026,24,666,20.2,27.49,18.05,9.6
15.1772,0,18.1,"0",0.74,6.152,100,1.9142,24,666,20.2,9.32,26.45,8.7
13.6781,0,18.1,"0",0.74,5.935,87.9,1.8206,24,666,20.2,68.95,34.02,8.4
9.39063,0,18.1,"0",0.74,5.627,93.9,1.8172,24,666,20.2,396.9,22.88,12.8
22.0511,0,18.1,"0",0.74,5.818,92.4,1.8662,24,666,20.2,391.45,22.11,10.5
9.72418,0,18.1,"0",0.74,6.406,97.2,2.0651,24,666,20.2,385.96,19.52,17.1
5.66637,0,18.1,"0",0.74,6.219,100,2.0048,24,666,20.2,395.69,16.59,18.4
9.96654,0,18.1,"0",0.74,6.485,100,1.9784,24,666,20.2,386.73,18.85,15.4
12.8023,0,18.1,"0",0.74,5.854,96.6,1.8956,24,666,20.2,240.52,23.79,10.8
10.6718,0,18.1,"0",0.74,6.459,94.8,1.9879,24,666,20.2,43.06,23.98,11.8
6.28807,0,18.1,"0",0.74,6.341,96.4,2.072,24,666,20.2,318.01,17.79,14.9
9.92485,0,18.1,"0",0.74,6.251,96.6,2.198,24,666,20.2,388.52,16.44,12.6
9.32909,0,18.1,"0",0.713,6.185,98.7,2.2616,24,666,20.2,396.9,18.13,14.1
7.52601,0,18.1,"0",0.713,6.417,98.3,2.185,24,666,20.2,304.21,19.31,13
6.71772,0,18.1,"0",0.713,6.749,92.6,2.3236,24,666,20.2,0.32,17.44,13.4
5.44114,0,18.1,"0",0.713,6.655,98.2,2.3552,24,666,20.2,355.29,17.73,15.2
5.09017,0,18.1,"0",0.713,6.297,91.8,2.3682,24,666,20.2,385.09,17.27,16.1
8.24809,0,18.1,"0",0.713,7.393,99.3,2.4527,24,666,20.2,375.87,16.74,17.8
9.51363,0,18.1,"0",0.713,6.728,94.1,2.4961,24,666,20.2,6.68,18.71,14.9
4.75237,0,18.1,"0",0.713,6.525,86.5,2.4358,24,666,20.2,50.92,18.13,14.1
4.66883,0,18.1,"0",0.713,5.976,87.9,2.5806,24,666,20.2,10.48,19.01,12.7
8.20058,0,18.1,"0",0.713,5.936,80.3,2.7792,24,666,20.2,3.5,16.94,13.5
7.75223,0,18.1,"0",0.713,6.301,83.7,2.7831,24,666,20.2,272.21,16.23,14.9
6.80117,0,18.1,"0",0.713,6.081,84.4,2.7175,24,666,20.2,396.9,14.7,20
4.81213,0,18.1,"0",0.713,6.701,90,2.5975,24,666,20.2,255.23,16.42,16.4
3.69311,0,18.1,"0",0.713,6.376,88.4,2.5671,24,666,20.2,391.43,14.65,17.7
6.65492,0,18.1,"0",0.713,6.317,83,2.7344,24,666,20.2,396.9,13.99,19.5
5.82115,0,18.1,"0",0.713,6.513,89.9,2.8016,24,666,20.2,393.82,10.29,20.2
7.83932,0,18.1,"0",0.655,6.209,65.4,2.9634,24,666,20.2,396.9,13.22,21.4
3.1636,0,18.1,"0",0.655,5.759,48.2,3.0665,24,666,20.2,334.4,14.13,19.9
3.77498,0,18.1,"0",0.655,5.952,84.7,2.8715,24,666,20.2,22.01,17.15,19
4.42228,0,18.1,"0",0.584,6.003,94.5,2.5403,24,666,20.2,331.29,21.32,19.1
15.5757,0,18.1,"0",0.58,5.926,71,2.9084,24,666,20.2,368.74,18.13,19.1
13.0751,0,18.1,"0",0.58,5.713,56.7,2.8237,24,666,20.2,396.9,14.76,20.1
4.34879,0,18.1,"0",0.58,6.167,84,3.0334,24,666,20.2,396.9,16.29,19.9
4.03841,0,18.1,"0",0.532,6.229,90.7,3.0993,24,666,20.2,395.33,12.87,19.6
3.56868,0,18.1,"0",0.58,6.437,75,2.8965,24,666,20.2,393.37,14.36,23.2
4.64689,0,18.1,"0",0.614,6.98,67.6,2.5329,24,666,20.2,374.68,11.66,29.8
8.05579,0,18.1,"0",0.584,5.427,95.4,2.4298,24,666,20.2,352.58,18.14,13.8
6.39312,0,18.1,"0",0.584,6.162,97.4,2.206,24,666,20.2,302.76,24.1,13.3
4.87141,0,18.1,"0",0.614,6.484,93.6,2.3053,24,666,20.2,396.21,18.68,16.7
15.0234,0,18.1,"0",0.614,5.304,97.3,2.1007,24,666,20.2,349.48,24.91,12
10.233,0,18.1,"0",0.614,6.185,96.7,2.1705,24,666,20.2,379.7,18.03,14.6
14.3337,0,18.1,"0",0.614,6.229,88,1.9512,24,666,20.2,383.32,13.11,21.4
5.82401,0,18.1,"0",0.532,6.242,64.7,3.4242,24,666,20.2,396.9,10.74,23
5.70818,0,18.1,"0",0.532,6.75,74.9,3.3317,24,666,20.2,393.07,7.74,23.7
5.73116,0,18.1,"0",0.532,7.061,77,3.4106,24,666,20.2,395.28,7.01,25
2.81838,0,18.1,"0",0.532,5.762,40.3,4.0983,24,666,20.2,392.92,10.42,21.8
2.37857,0,18.1,"0",0.583,5.871,41.9,3.724,24,666,20.2,370.73,13.34,20.6
3.67367,0,18.1,"0",0.583,6.312,51.9,3.9917,24,666,20.2,388.62,10.58,21.2
5.69175,0,18.1,"0",0.583,6.114,79.8,3.5459,24,666,20.2,392.68,14.98,19.1
4.83567,0,18.1,"0",0.583,5.905,53.2,3.1523,24,666,20.2,388.22,11.45,20.6
0.15086,0,27.74,"0",0.609,5.454,92.7,1.8209,4,711,20.1,395.09,18.06,15.2
0.18337,0,27.74,"0",0.609,5.414,98.3,1.7554,4,711,20.1,344.05,23.97,7
0.20746,0,27.74,"0",0.609,5.093,98,1.8226,4,711,20.1,318.43,29.68,8.1
0.10574,0,27.74,"0",0.609,5.983,98.8,1.8681,4,711,20.1,390.11,18.07,13.6
0.11132,0,27.74,"0",0.609,5.983,83.5,2.1099,4,711,20.1,396.9,13.35,20.1
0.17331,0,9.69,"0",0.585,5.707,54,2.3817,6,391,19.2,396.9,12.01,21.8
0.27957,0,9.69,"0",0.585,5.926,42.6,2.3817,6,391,19.2,396.9,13.59,24.5
0.17899,0,9.69,"0",0.585,5.67,28.8,2.7986,6,391,19.2,393.29,17.6,23.1
0.2896,0,9.69,"0",0.585,5.39,72.9,2.7986,6,391,19.2,396.9,21.14,19.7
0.26838,0,9.69,"0",0.585,5.794,70.6,2.8927,6,391,19.2,396.9,14.1,18.3
0.23912,0,9.69,"0",0.585,6.019,65.3,2.4091,6,391,19.2,396.9,12.92,21.2
0.17783,0,9.69,"0",0.585,5.569,73.5,2.3999,6,391,19.2,395.77,15.1,17.5
0.22438,0,9.69,"0",0.585,6.027,79.7,2.4982,6,391,19.2,396.9,14.33,16.8
0.06263,0,11.93,"0",0.573,6.593,69.1,2.4786,1,273,21,391.99,9.67,22.4
0.04527,0,11.93,"0",0.573,6.12,76.7,2.2875,1,273,21,396.9,9.08,20.6
0.06076,0,11.93,"0",0.573,6.976,91,2.1675,1,273,21,396.9,5.64,23.9
0.10959,0,11.93,"0",0.573,6.794,89.3,2.3889,1,273,21,393.45,6.48,22
0.04741,0,11.93,"0",0.573,6.03,80.8,2.505,1,273,21,396.9,7.88,11.9
%% Cell type:markdown id: tags:
German Traffic Sign Recognition Benchmark (GTSRB)
=================================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## Episode 2 : First Convolutions
Our main steps:
- Read H5 dataset
- Build a model
- Train the model
- Evaluate the model
## 1/ Import and init
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.callbacks import TensorBoard
import numpy as np
import matplotlib.pyplot as plt
import h5py
import os,time,sys
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
```
%% Cell type:markdown id: tags:
## 2/ Load dataset
We're going to retrieve a previously recorded dataset.
For example: set-24x24-L
%% Cell type:code id: tags:
``` python
%%time
def read_dataset(name):
    '''Reads h5 dataset from ./data
    Arguments: dataset name, without .h5
    Returns: x_train,y_train,x_test,y_test data'''
    # ---- Read dataset
    filename='./data/'+name+'.h5'
    with h5py.File(filename,'r') as f:
        x_train = f['x_train'][:]
        y_train = f['y_train'][:]
        x_test = f['x_test'][:]
        y_test = f['y_test'][:]
    # ---- done
    print('Dataset "{}" is loaded. ({:.1f} MB)\n'.format(name,os.path.getsize(filename)/(1024*1024)))
    return x_train,y_train,x_test,y_test
x_train,y_train,x_test,y_test = read_dataset('set-24x24-L')
```
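%% Cell type:markdown id: tags:
To check which variants are actually available locally, a minimal sketch (assuming the .h5 files were generated beforehand in `./data`):
%% Cell type:code id: tags:
``` python
import glob

# ---- List the prepared datasets
for f in sorted(glob.glob('./data/*.h5')):
    print(f)
```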
%% Cell type:markdown id: tags:
## 3/ Have a look at the dataset
We take a quick look as we go by...
%% Cell type:code id: tags:
``` python
print("x_train : ", x_train.shape)
print("y_train : ", y_train.shape)
print("x_test : ", x_test.shape)
print("y_test : ", y_test.shape)
ooo.plot_images(x_train, y_train, range(12), columns=6, x_size=2, y_size=2)
ooo.plot_images(x_train, y_train, range(36), columns=12, x_size=1, y_size=1)
```
%% Cell type:markdown id: tags:
## 4/ Create model
We will now build a model and train it...
Some models :
%% Cell type:code id: tags:
``` python
# A basic model
#
def get_model_v1(lx,ly,lz):
    model = keras.models.Sequential()
    model.add( keras.layers.Conv2D(96, (3,3), activation='relu', input_shape=(lx,ly,lz)))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Conv2D(192, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Flatten())
    model.add( keras.layers.Dense(1500, activation='relu'))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Dense(43, activation='softmax'))
    return model

# A more sophisticated model
#
def get_model_v2(lx,ly,lz):
    model = keras.models.Sequential()
    model.add( keras.layers.Conv2D(64, (3, 3), padding='same', input_shape=(lx,ly,lz), activation='relu'))
    model.add( keras.layers.Conv2D(64, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D(pool_size=(2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Conv2D(128, (3, 3), padding='same', activation='relu'))
    model.add( keras.layers.Conv2D(128, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D(pool_size=(2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Conv2D(256, (3, 3), padding='same',activation='relu'))
    model.add( keras.layers.Conv2D(256, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D(pool_size=(2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Flatten())
    model.add( keras.layers.Dense(512, activation='relu'))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Dense(43, activation='softmax'))
    return model

# My sophisticated model, but small and fast
#
def get_model_v3(lx,ly,lz):
    model = keras.models.Sequential()
    model.add( keras.layers.Conv2D(32, (3,3), activation='relu', input_shape=(lx,ly,lz)))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Conv2D(64, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Conv2D(128, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Conv2D(256, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Flatten())
    model.add( keras.layers.Dense(1152, activation='relu'))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Dense(43, activation='softmax'))
    return model
```
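%% Cell type:markdown id: tags:
A quick way to compare these three candidates is their size. A minimal sketch counting trainable parameters (48x48 grayscale inputs are assumed here, because get_model_v3 stacks four poolings and would not fit the 24x24 images):
%% Cell type:code id: tags:
``` python
# ---- Compare the number of parameters of each model
for get_model in [get_model_v1, get_model_v2, get_model_v3]:
    m = get_model(48, 48, 1)
    print('{:12s} : {:>12,d} parameters'.format(get_model.__name__, m.count_params()))
```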
%% Cell type:markdown id: tags:
## 5/ Train the model
**Get the shape of my data :**
%% Cell type:code id: tags:
``` python
(n,lx,ly,lz) = x_train.shape
print("Images of the dataset have this folowing shape : ",(lx,ly,lz))
```
%% Cell type:markdown id: tags:
**Get and compile a model, with the data shape :**
%% Cell type:code id: tags:
``` python
model = get_model_v1(lx,ly,lz)
model.summary()
model.compile(optimizer = 'adam',
loss = 'sparse_categorical_crossentropy',
metrics = ['accuracy'])
```
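%% Cell type:markdown id: tags:
`sparse_categorical_crossentropy` is used because the labels are plain integer class ids (0 to 42), not one-hot vectors; with `categorical_crossentropy` they would first have to be encoded, as this small illustration shows:
%% Cell type:code id: tags:
``` python
# ---- Integer labels (as in y_train) vs their one-hot equivalent
y_int  = np.array([3, 41])                                     # two class ids
y_1hot = tf.keras.utils.to_categorical(y_int, num_classes=43)  # shape (2, 43)
print(y_1hot.shape)
```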
%% Cell type:markdown id: tags:
**Train it :**
%% Cell type:code id: tags:
``` python
%%time
batch_size = 64
epochs = 5
# ---- Shuffle train data
x_train,y_train=ooo.shuffle_np_dataset(x_train,y_train)
# ---- Train
history = model.fit( x_train, y_train,
batch_size = batch_size,
epochs = epochs,
verbose = 1,
validation_data = (x_test, y_test))
```
%% Cell type:markdown id: tags:
**Evaluate it :**
%% Cell type:code id: tags:
``` python
max_val_accuracy = max(history.history["val_accuracy"])
print("Max validation accuracy is : {:.4f}".format(max_val_accuracy))
```
%% Cell type:code id: tags:
``` python
score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss : {:5.4f}'.format(score[0]))
print('Test accuracy : {:5.4f}'.format(score[1]))
```
%% Cell type:markdown id: tags:
German Traffic Sign Recognition Benchmark (GTSRB)
=================================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## Episode 3 : Tracking, visualizing and saving models
Our main steps:
- Monitoring and understanding our model training
- Add recovery points
- Analyze the results
- Restore and run a recovery point
## 1/ Import and init
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.callbacks import TensorBoard
import numpy as np
import h5py
from sklearn.metrics import confusion_matrix
import matplotlib.pyplot as plt
import seaborn as sn
import os, sys, time, random
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
```
%% Cell type:markdown id: tags:
## 2/ Load dataset
The dataset is one of the previously saved ones: RGB25, RGB35, L25, L35, etc.
First of all, we're going to use a small dataset : **set-24x24-L**
(with a GPU, training only takes about 35 s, compared to more than 5 min with a CPU !)
%% Cell type:code id: tags:
``` python
%%time
def read_dataset(name):
    '''Reads h5 dataset from ./data
    Arguments: dataset name, without .h5
    Returns: x_train,y_train,x_test,y_test,x_meta,y_meta data'''
    # ---- Read dataset
    filename='./data/'+name+'.h5'
    with h5py.File(filename,'r') as f:
        x_train = f['x_train'][:]
        y_train = f['y_train'][:]
        x_test = f['x_test'][:]
        y_test = f['y_test'][:]
        x_meta = f['x_meta'][:]
        y_meta = f['y_meta'][:]
    # ---- done
    print('Dataset "{}" is loaded. ({:.1f} MB)\n'.format(name,os.path.getsize(filename)/(1024*1024)))
    return x_train,y_train,x_test,y_test,x_meta,y_meta
x_train,y_train,x_test,y_test,x_meta,y_meta = read_dataset('set-24x24-L')
```
%% Cell type:markdown id: tags:
## 3/ Have a look at the dataset
Note: Data must be reshaped for matplotlib
%% Cell type:code id: tags:
``` python
print("x_train : ", x_train.shape)
print("y_train : ", y_train.shape)
print("x_test : ", x_test.shape)
print("y_test : ", y_test.shape)
ooo.plot_images(x_train, y_train, range(12), columns=6, x_size=2, y_size=2)
ooo.plot_images(x_train, y_train, range(36), columns=12, x_size=1, y_size=1)
```
%% Cell type:markdown id: tags:
## 4/ Create model
We will now build a model and train it...
Some models...
%% Cell type:code id: tags:
``` python
# A basic model
#
def get_model_v1(lx,ly,lz):
    model = keras.models.Sequential()
    model.add( keras.layers.Conv2D(96, (3,3), activation='relu', input_shape=(lx,ly,lz)))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Conv2D(192, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Flatten())
    model.add( keras.layers.Dense(1500, activation='relu'))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Dense(43, activation='softmax'))
    return model
```
%% Cell type:markdown id: tags:
## 5/ Prepare callbacks
We will add 2 callbacks :
- **TensorBoard**
Training logs, which can be visualised with Tensorboard.
`#tensorboard --logdir ./run/logs`
IMPORTANT : Restart Tensorboard for each run
- **Model backup**
It is possible to save the model every n epochs, or only when it improves.
The model can be saved completely or partially (weights only).
For the full format, we can use the HDF5 format.
%% Cell type:raw id: tags:
%%bash
# To clean old logs and saved model, run this cell
#
/bin/rm -r ./run/logs 2>/dev/null
/bin/rm -r ./run/models 2>/dev/null
/bin/mkdir -p -m 755 ./run/logs
/bin/mkdir -p -m 755 ./run/models
echo -e "Reset directories : ./run/logs and ./run/models ."
%% Cell type:code id: tags:
``` python
ooo.mkdir('./run/models')
ooo.mkdir('./run/logs')
# ---- Callback tensorboard
log_dir = "./run/logs/tb_" + ooo.tag_now()
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
# ---- Callback ModelCheckpoint - Save best model
save_dir = "./run/models/best-model.h5"
bestmodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath=save_dir, verbose=0, monitor='accuracy', save_best_only=True)
# ---- Callback ModelCheckpoint - Save model periodically
save_dir = "./run/models/model-{epoch:04d}.h5"
savemodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath=save_dir, verbose=0, save_freq=2000*5)
```
%% Cell type:markdown id: tags:
## 6/ Train the model
**Get the shape of my data :**
%% Cell type:code id: tags:
``` python
(n,lx,ly,lz) = x_train.shape
print("Images of the dataset have this folowing shape : ",(lx,ly,lz))
```
%% Cell type:markdown id: tags:
**Get and compile a model, with the data shape :**
%% Cell type:code id: tags:
``` python
model = get_model_v1(lx,ly,lz)
# model.summary()
model.compile(optimizer='adam',
loss='sparse_categorical_crossentropy',
metrics=['accuracy'])
```
%% Cell type:markdown id: tags:
**Train it :**
Note: The training curve is visible in real time with Tensorboard :
`#tensorboard --logdir ./run/logs`
%% Cell type:code id: tags:
``` python
%%time
batch_size = 64
epochs = 30
# ---- Shuffle train data
x_train,y_train=ooo.shuffle_np_dataset(x_train,y_train)
# ---- Train
# Note: To be faster in our example, we can take only 2000 values
#
history = model.fit( x_train, y_train,
batch_size=batch_size,
epochs=epochs,
verbose=1,
validation_data=(x_test, y_test),
callbacks=[tensorboard_callback, bestmodel_callback, savemodel_callback] )
model.save('./run/models/last-model.h5')
```
%% Cell type:markdown id: tags:
**Evaluate it :**
%% Cell type:code id: tags:
``` python
max_val_accuracy = max(history.history["val_accuracy"])
print("Max validation accuracy is : {:.4f}".format(max_val_accuracy))
```
%% Cell type:code id: tags:
``` python
score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss : {:5.4f}'.format(score[0]))
print('Test accuracy : {:5.4f}'.format(score[1]))
```
%% Cell type:markdown id: tags:
## 7/ History
model.fit() returns the training history
%% Cell type:code id: tags:
``` python
ooo.plot_history(history)
```
%% Cell type:markdown id: tags:
## 8/ Evaluation and confusion
%% Cell type:code id: tags:
``` python
y_pred = model.predict_classes(x_test)
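# Note: predict_classes() is specific to Sequential models and was removed in later
# TensorFlow versions; np.argmax(model.predict(x_test), axis=-1) is the portable equivalent.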
conf_mat = confusion_matrix(y_test,y_pred, normalize="true", labels=range(43))
ooo.plot_confusion_matrix(conf_mat)
```
%% Cell type:markdown id: tags:
## 9/ Restore and evaluate
### 9.1/ List saved models :
%% Cell type:code id: tags:
``` python
!find ./run/models/
```
%% Cell type:markdown id: tags:
### 9.2/ Restore a model :
%% Cell type:code id: tags:
``` python
loaded_model = tf.keras.models.load_model('./run/models/best-model.h5')
# loaded_model.summary()
print("Loaded.")
```
%% Cell type:markdown id: tags:
### 9.3/ Evaluate it :
%% Cell type:code id: tags:
``` python
score = loaded_model.evaluate(x_test, y_test, verbose=0)
print('Test loss : {:5.4f}'.format(score[0]))
print('Test accuracy : {:5.4f}'.format(score[1]))
```
%% Cell type:markdown id: tags:
### 9.4/ Make a prediction :
%% Cell type:code id: tags:
``` python
# ---- Get a random image
#
i = random.randint(0,len(x_test)-1)
x,y = x_test[i], y_test[i]
# ---- Do prediction
#
predictions = loaded_model.predict( np.array([x]) )
# ---- A prediction is just the output layer
#
print("\nOutput layer from model is (x100) :\n")
with np.printoptions(precision=2, suppress=True, linewidth=95):
print(predictions*100)
# ---- Graphic visualisation
#
print("\nGraphically :\n")
plt.figure(figsize=(12,2))
plt.bar(range(43), predictions[0], align='center', alpha=0.5)
plt.ylabel('Probability')
plt.ylim((0,1))
plt.xlabel('Class')
plt.title('Traffic sign prediction')
plt.show()
# ---- Predict class
#
p = np.argmax(predictions)
# ---- Show result
#
print("\nPrediction on the left, real stuff on the right :\n")
ooo.plot_images([x,x_meta[y]], [p,y], range(2), columns=3, x_size=3, y_size=2)
if p==y:
    print("YEEES ! that's right!")
else:
    print("oops, that's wrong ;-(")
```
%% Cell type:markdown id: tags:
---
That's all folks !
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
German Traffic Sign Recognition Benchmark (GTSRB)
=================================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## Episode 4 : Data augmentation
Our main steps:
- Increase and improve the learning dataset
## 1/ Import and init
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.callbacks import TensorBoard
import numpy as np
import h5py
from sklearn.metrics import confusion_matrix
import matplotlib.pyplot as plt
import seaborn as sn
import os, sys, time, random
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
```
%% Cell type:markdown id: tags:
## 2/ Dataset loader
The dataset is one of the previously saved ones: RGB25, RGB35, L25, L35, etc.
First of all, we're going to use a small dataset : **set-24x24-L**
(with a GPU, training only takes about 35 s, compared to more than 5 min with a CPU !)
%% Cell type:code id: tags:
``` python
%%time
def read_dataset(name):
    '''Reads h5 dataset from ./data
    Arguments: dataset name, without .h5
    Returns: x_train,y_train,x_test,y_test data'''
    # ---- Read dataset
    filename='./data/'+name+'.h5'
    with h5py.File(filename,'r') as f:
        x_train = f['x_train'][:]
        y_train = f['y_train'][:]
        x_test = f['x_test'][:]
        y_test = f['y_test'][:]
    # ---- done
    print('Dataset "{}" is loaded. ({:.1f} MB)\n'.format(name,os.path.getsize(filename)/(1024*1024)))
    return x_train,y_train,x_test,y_test
```
%% Cell type:markdown id: tags:
## 3/ Models
We will now build a model and train it...
This is my model ;-)
%% Cell type:code id: tags:
``` python
# A basic model
#
def get_model_v1(lx,ly,lz):
    model = keras.models.Sequential()
    model.add( keras.layers.Conv2D(96, (3,3), activation='relu', input_shape=(lx,ly,lz)))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Conv2D(192, (3, 3), activation='relu'))
    model.add( keras.layers.MaxPooling2D((2, 2)))
    model.add( keras.layers.Dropout(0.2))
    model.add( keras.layers.Flatten())
    model.add( keras.layers.Dense(1500, activation='relu'))
    model.add( keras.layers.Dropout(0.5))
    model.add( keras.layers.Dense(43, activation='softmax'))
    return model
```
%% Cell type:markdown id: tags:
## 4/ Callbacks
We prepare 2 kinds of callbacks : TensorBoard and model backup
%% Cell type:code id: tags:
``` python
%%bash
# To clean old logs and saved model, run this cell
#
/bin/rm -r ./run/logs 2>/dev/null
/bin/rm -r ./run/models 2>/dev/null
/bin/mkdir -p -m 755 ./run/logs
/bin/mkdir -p -m 755 ./run/models
echo -e "Reset directories : ./run/logs and ./run/models ."
```
%% Cell type:code id: tags:
``` python
# ---- Callback tensorboard
log_dir = "./run/logs/tb_" + ooo.tag_now()
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
# ---- Callback ModelCheckpoint - Save best model
save_dir = "./run/models/best-model.h5"
bestmodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath=save_dir, verbose=0, monitor='accuracy', save_best_only=True)
# ---- Callback ModelCheckpoint - Save model periodically
save_dir = "./run/models/model-{epoch:04d}.h5"
savemodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath=save_dir, verbose=0, save_freq=2000*5)
```
%% Cell type:markdown id: tags:
## 5/ Load and prepare dataset
### 5.1/ Load
%% Cell type:code id: tags:
``` python
x_train,y_train,x_test,y_test = read_dataset('set-48x48-L-LHE')
```
%% Cell type:markdown id: tags:
### 5.2/ Data augmentation
%% Cell type:code id: tags:
``` python
datagen = keras.preprocessing.image.ImageDataGenerator(featurewise_center=False,
featurewise_std_normalization=False,
width_shift_range=0.1,
height_shift_range=0.1,
zoom_range=0.2,
shear_range=0.1,
rotation_range=10.)
datagen.fit(x_train)
```
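%% Cell type:markdown id: tags:
To get an idea of what the generator produces, here is a minimal preview sketch (an illustration only, using the `datagen`, `x_train` and `y_train` objects defined above) :
%% Cell type:code id: tags:
``` python
# Preview a few augmented images produced by the generator
for batch_x, batch_y in datagen.flow(x_train, y_train, batch_size=8):
    fig, axs = plt.subplots(1, 8, figsize=(16, 2))
    for img, ax in zip(batch_x, axs):
        ax.imshow(img.squeeze(), cmap='binary')   # squeeze() handles (lx,ly,1) grayscale images
        ax.axis('off')
    plt.show()
    break   # the generator is infinite : one batch is enough for a preview
```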
%% Cell type:markdown id: tags:
## 6/ Train the model
**Get the shape of my data :**
%% Cell type:code id: tags:
``` python
(n,lx,ly,lz) = x_train.shape
print("Images of the dataset have this folowing shape : ",(lx,ly,lz))
```
%% Cell type:markdown id: tags:
**Get and compile a model, with the data shape :**
%% Cell type:code id: tags:
``` python
model = get_model_v1(lx,ly,lz)      # get_model_v1 is the model defined above (get_model_v3 appears in episode 5)
# model.summary()
model.compile(optimizer='adam',
loss='sparse_categorical_crossentropy',
metrics=['accuracy'])
```
%% Cell type:markdown id: tags:
**Train it :**
Note : the learning curve can be watched in real time with Tensorboard :
`#tensorboard --logdir ./run/logs`
%% Cell type:code id: tags:
``` python
%%time
batch_size = 64
epochs = 30
# ---- Shuffle train data
#x_train,y_train=ooo.shuffle_np_dataset(x_train,y_train)
# ---- Train
#
history = model.fit( datagen.flow(x_train, y_train, batch_size=batch_size),
steps_per_epoch = int(x_train.shape[0]/batch_size),
epochs=epochs,
verbose=1,
validation_data=(x_test, y_test),
callbacks=[tensorboard_callback, bestmodel_callback, savemodel_callback] )
model.save('./run/models/last-model.h5')
```
%% Cell type:markdown id: tags:
**Evaluate it :**
%% Cell type:code id: tags:
``` python
max_val_accuracy = max(history.history["val_accuracy"])
print("Max validation accuracy is : {:.4f}".format(max_val_accuracy))
```
%% Cell type:code id: tags:
``` python
score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss : {:5.4f}'.format(score[0]))
print('Test accuracy : {:5.4f}'.format(score[1]))
```
%% Cell type:markdown id: tags:
## 7/ History
model.fit() returns the training history
%% Cell type:code id: tags:
``` python
ooo.plot_history(history)
```
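%% Cell type:markdown id: tags:
`ooo.plot_history` is a helper from the course's `fidle.pwk` module ; if you prefer plain matplotlib, a minimal equivalent sketch (using the `history` object from the training above) could be :
%% Cell type:code id: tags:
``` python
# Plain-matplotlib sketch of the training history (minimal equivalent of ooo.plot_history)
plt.figure(figsize=(8,4))
plt.plot(history.history['accuracy'],     label='train accuracy')
plt.plot(history.history['val_accuracy'], label='val accuracy')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()
```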
%% Cell type:markdown id: tags:
## 8/ Evaluate best model
%% Cell type:markdown id: tags:
### 8.1/ Restore best model :
%% Cell type:code id: tags:
``` python
loaded_model = tf.keras.models.load_model('./run/models/best-model.h5')
# loaded_model.summary()
print("Loaded.")
```
%% Cell type:markdown id: tags:
### 8.2/ Evaluate it :
%% Cell type:code id: tags:
``` python
score = loaded_model.evaluate(x_test, y_test, verbose=0)
print('Test loss : {:5.4f}'.format(score[0]))
print('Test accuracy : {:5.4f}'.format(score[1]))
```
%% Cell type:markdown id: tags:
**Plot confusion matrix**
%% Cell type:code id: tags:
``` python
y_pred = loaded_model.predict_classes(x_test)
conf_mat = confusion_matrix(y_test,y_pred, normalize="true", labels=range(43))
ooo.plot_confusion_matrix(conf_mat)
```
%% Cell type:markdown id: tags:
---
That's all folks !
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
German Traffic Sign Recognition Benchmark (GTSRB)
=================================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## Episode 5 : Full Convolutions
Our main steps:
- Try n models with n datasets
- Save a Pandas/h5 report
- Write to be run in batch mode
## 1/ Import
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
import numpy as np
import h5py
import os,time,json
import random
from IPython.display import display
VERSION='1.6'
```
%% Cell type:markdown id: tags:
## 2/ Init and start
%% Cell type:code id: tags:
``` python
# ---- Where am I ?
now = time.strftime("%A %d %B %Y - %Hh%Mm%Ss")
here = os.getcwd()
random.seed(time.time())
tag_id = '{:06}'.format(random.randint(0,99999))
# ---- Who am I ?
if 'OAR_JOB_ID' in os.environ:
oar_id=os.environ['OAR_JOB_ID']
else:
oar_id='???'
print('\nFull Convolutions Notebook')
print(' Version : {}'.format(VERSION))
print(' Now is : {}'.format(now))
print(' OAR id : {}'.format(oar_id))
print(' Tag id : {}'.format(tag_id))
print(' Working directory : {}'.format(here))
print(' TensorFlow version :',tf.__version__)
print(' Keras version :',tf.keras.__version__)
print(' for tensorboard : --logdir {}/run/logs_{}'.format(here,tag_id))
```
%% Cell type:markdown id: tags:
## 3/ Dataset loading
%% Cell type:code id: tags:
``` python
def read_dataset(name):
'''Reads h5 dataset from ./data
Arguments: dataset name, without .h5
Returns: x_train,y_train,x_test,y_test data'''
# ---- Read dataset
filename='./data/'+name+'.h5'
with h5py.File(filename,'r') as f:
x_train = f['x_train'][:]
y_train = f['y_train'][:]
x_test = f['x_test'][:]
y_test = f['y_test'][:]
return x_train,y_train,x_test,y_test
```
%% Cell type:markdown id: tags:
## 4/ Models collection
%% Cell type:code id: tags:
``` python
# A basic model
#
def get_model_v1(lx,ly,lz):
model = keras.models.Sequential()
model.add( keras.layers.Conv2D(96, (3,3), activation='relu', input_shape=(lx,ly,lz)))
model.add( keras.layers.MaxPooling2D((2, 2)))
model.add( keras.layers.Dropout(0.2))
model.add( keras.layers.Conv2D(192, (3, 3), activation='relu'))
model.add( keras.layers.MaxPooling2D((2, 2)))
model.add( keras.layers.Dropout(0.2))
model.add( keras.layers.Flatten())
model.add( keras.layers.Dense(1500, activation='relu'))
model.add( keras.layers.Dropout(0.5))
model.add( keras.layers.Dense(43, activation='softmax'))
return model
# A more sophisticated model
#
def get_model_v2(lx,ly,lz):
model = keras.models.Sequential()
model.add( keras.layers.Conv2D(64, (3, 3), padding='same', input_shape=(lx,ly,lz), activation='relu'))
model.add( keras.layers.Conv2D(64, (3, 3), activation='relu'))
model.add( keras.layers.MaxPooling2D(pool_size=(2, 2)))
model.add( keras.layers.Dropout(0.2))
model.add( keras.layers.Conv2D(128, (3, 3), padding='same', activation='relu'))
model.add( keras.layers.Conv2D(128, (3, 3), activation='relu'))
model.add( keras.layers.MaxPooling2D(pool_size=(2, 2)))
model.add( keras.layers.Dropout(0.2))
model.add( keras.layers.Conv2D(256, (3, 3), padding='same',activation='relu'))
model.add( keras.layers.Conv2D(256, (3, 3), activation='relu'))
model.add( keras.layers.MaxPooling2D(pool_size=(2, 2)))
model.add( keras.layers.Dropout(0.2))
model.add( keras.layers.Flatten())
model.add( keras.layers.Dense(512, activation='relu'))
model.add( keras.layers.Dropout(0.5))
model.add( keras.layers.Dense(43, activation='softmax'))
return model
def get_model_v3(lx,ly,lz):
model = keras.models.Sequential()
model.add(tf.keras.layers.Conv2D(32, (5, 5), padding='same', activation='relu', input_shape=(lx,ly,lz)))
model.add(tf.keras.layers.BatchNormalization(axis=-1))
model.add(tf.keras.layers.MaxPooling2D(pool_size=(2, 2)))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.Conv2D(64, (5, 5), padding='same', activation='relu'))
model.add(tf.keras.layers.BatchNormalization(axis=-1))
model.add(tf.keras.layers.Conv2D(128, (5, 5), padding='same', activation='relu'))
model.add(tf.keras.layers.BatchNormalization(axis=-1))
model.add(tf.keras.layers.MaxPooling2D(pool_size=(2, 2)))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(512, activation='relu'))
model.add(tf.keras.layers.BatchNormalization())
model.add(tf.keras.layers.Dropout(0.4))
model.add(tf.keras.layers.Dense(43, activation='softmax'))
return model
```
%% Cell type:markdown id: tags:
## 5/ Multiple datasets, multiple models ;-)
%% Cell type:code id: tags:
``` python
def multi_run(datasets, models, datagen=None,
train_size=1, test_size=1, batch_size=64, epochs=16,
verbose=0, extension_dir='last'):
# ---- Logs and models dir
#
os.makedirs('./run/logs_{}'.format(extension_dir), mode=0o750, exist_ok=True)
os.makedirs('./run/models_{}'.format(extension_dir), mode=0o750, exist_ok=True)
# ---- Columns of output
#
output={}
output['Dataset']=[]
output['Size'] =[]
for m in models:
output[m+'_Accuracy'] = []
output[m+'_Duration'] = []
# ---- Let's go
#
for d_name in datasets:
print("\nDataset : ",d_name)
# ---- Read dataset
x_train,y_train,x_test,y_test = read_dataset(d_name)
d_size=os.path.getsize('./data/'+d_name+'.h5')/(1024*1024)
output['Dataset'].append(d_name)
output['Size'].append(d_size)
# ---- Get the shape
(n,lx,ly,lz) = x_train.shape
n_train = int(x_train.shape[0]*train_size)
n_test = int(x_test.shape[0]*test_size)
# ---- For each model
for m_name,m_function in models.items():
print(" Run model {} : ".format(m_name), end='')
# ---- get model
try:
model=m_function(lx,ly,lz)
# ---- Compile it
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
# ---- Callbacks tensorboard
log_dir = "./run/logs_{}/tb_{}_{}".format(extension_dir, d_name, m_name)
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
# ---- Callbacks bestmodel
save_dir = "./run/models_{}/model_{}_{}.h5".format(extension_dir, d_name, m_name)
bestmodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath=save_dir, verbose=0, monitor='accuracy', save_best_only=True)
# ---- Train
start_time = time.time()
                if datagen is None:
# ---- No data augmentation (datagen=None) --------------------------------------
history = model.fit(x_train[:n_train], y_train[:n_train],
batch_size = batch_size,
epochs = epochs,
verbose = verbose,
validation_data = (x_test[:n_test], y_test[:n_test]),
callbacks = [tensorboard_callback, bestmodel_callback])
else:
# ---- Data augmentation (datagen given) ----------------------------------------
datagen.fit(x_train)
history = model.fit(datagen.flow(x_train, y_train, batch_size=batch_size),
steps_per_epoch = int(n_train/batch_size),
epochs = epochs,
verbose = verbose,
validation_data = (x_test[:n_test], y_test[:n_test]),
callbacks = [tensorboard_callback, bestmodel_callback])
# ---- Result
end_time = time.time()
duration = end_time-start_time
accuracy = max(history.history["val_accuracy"])*100
#
output[m_name+'_Accuracy'].append(accuracy)
output[m_name+'_Duration'].append(duration)
print("Accuracy={:.2f} and Duration={:.2f})".format(accuracy,duration))
            except Exception:
                output[m_name+'_Accuracy'].append('0')
                output[m_name+'_Duration'].append('999')
                print('-')
return output
```
%% Cell type:markdown id: tags:
## 6/ Run !
%% Cell type:code id: tags:
``` python
start_time = time.time()
print('\n---- Run','-'*50)
# --------- Datasets, models, and more.. -----------------------------------
#
# ---- For tests
# datasets = ['set-24x24-L', 'set-24x24-RGB']
# models      = {'v1':get_model_v1, 'v2':get_model_v2}
# batch_size = 64
# epochs = 2
# train_size = 0.1
# test_size = 0.1
# with_datagen = False
# verbose = 0
#
# ---- All possibilities -> Run A
# datasets = ['set-24x24-L', 'set-24x24-RGB', 'set-48x48-L', 'set-48x48-RGB', 'set-24x24-L-LHE', 'set-24x24-RGB-HE', 'set-48x48-L-LHE', 'set-48x48-RGB-HE']
# models = {'v1':get_model_v1, 'v2':get_model_v2, 'v3':get_model_v3}
# batch_size = 64
# epochs = 16
# train_size = 1
# test_size = 1
# with_datagen = False
# verbose = 0
#
# ---- Data augmentation -> Run B
datasets = ['set-48x48-RGB']
models = {'v2':get_model_v2}
batch_size = 64
epochs = 20
train_size = 1
test_size = 1
with_datagen = True
verbose = 0
#
# ---------------------------------------------------------------------------
# ---- Data augmentation
#
if with_datagen :
datagen = keras.preprocessing.image.ImageDataGenerator(featurewise_center=False,
featurewise_std_normalization=False,
width_shift_range=0.1,
height_shift_range=0.1,
zoom_range=0.2,
shear_range=0.1,
rotation_range=10.)
else:
datagen=None
# ---- Run
#
output = multi_run(datasets, models,
datagen=datagen,
train_size=train_size, test_size=test_size,
batch_size=batch_size, epochs=epochs,
verbose=verbose,
extension_dir=tag_id)
# ---- Save report
#
report={}
report['output']=output
report['description']='train_size={} test_size={} batch_size={} epochs={} data_aug={}'.format(train_size,test_size,batch_size,epochs,with_datagen)
report_name='./run/report_{}.json'.format(tag_id)
with open(report_name, 'w') as file:
json.dump(report, file)
print('\nReport saved as ',report_name)
end_time = time.time()
duration = end_time-start_time
print(f'Duration : {duration:.2f} s')
print('-'*59)
```
%% Cell type:markdown id: tags:
## 7/ That's all folks !
%% Cell type:code id: tags:
``` python
print('\n{}'.format(time.strftime("%A %-d %B %Y, %H:%M:%S")))
print("The work is done.\n")
```
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
German Traffic Sign Recognition Benchmark (GTSRB)
=================================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## Episode 5.1 : Full Convolutions / run
Our main steps:
- Run 05-Full-convolutions.ipynb as a batch :
- Notebook mode
- Script mode
- Tensorboard follow up
## 1/ Run a notebook as a batch
To run a notebook :
```jupyter nbconvert --to notebook --execute <notebook>```
%% Cell type:raw id: tags:
%%bash
# ---- This will execute and save a notebook
#
jupyter nbconvert --ExecutePreprocessor.timeout=-1 --to notebook --output='./run/full_convolutions' --execute '05-Full-convolutions.ipynb'
%% Cell type:markdown id: tags:
## 2/ Export as a script (better choice)
To export a notebook as a script :
```jupyter nbconvert --to script <notebook>```
To run the script :
```ipython <script>```
%% Cell type:code id: tags:
``` python
%%bash
# ---- This will convert a notebook to a notebook.py script
#
jupyter nbconvert --to script --output='./run/full_convolutions_B' '05-Full-convolutions.ipynb'
```
%% Output
[NbConvertApp] Converting notebook 05-Full-convolutions.ipynb to script
[NbConvertApp] Writing 11305 bytes to ./run/full_convolutions_B.py
%% Cell type:code id: tags:
``` python
!ls -l ./run/*.py
```
%% Output
-rw-r--r-- 1 pjluc pjluc 11305 Jan 21 00:13 ./run/full_convolutions_B.py
%% Cell type:markdown id: tags:
## 3/ Batch submission
Create batch script :
%% Cell type:code id: tags:
``` python
%%writefile "./run/batch_full_convolutions_B.sh"
#!/bin/bash
#OAR -n Full convolutions
#OAR -t gpu
#OAR -l /nodes=1/gpudevice=1,walltime=01:00:00
#OAR --stdout _batch/full_convolutions_%jobid%.out
#OAR --stderr _batch/full_convolutions_%jobid%.err
#OAR --project deeplearningshs
#---- For cpu
# use :
# OAR -l /nodes=1/core=32,walltime=01:00:00
# and add a 2>/dev/null to ipython xxx
# ----------------------------------
# _ _ _
# | |__ __ _| |_ ___| |__
# | '_ \ / _` | __/ __| '_ \
# | |_) | (_| | || (__| | | |
# |_.__/ \__,_|\__\___|_| |_|
# Full convolutions
# ----------------------------------
#
CONDA_ENV=deeplearning2
RUN_DIR=~/fidle/GTSRB
RUN_SCRIPT=./run/full_convolutions_B.py
# ---- Cuda Conda initialization
#
echo '------------------------------------------------------------'
echo "Start : $0"
echo '------------------------------------------------------------'
#
source /applis/environments/cuda_env.sh dahu 10.0
source /applis/environments/conda.sh
#
conda activate "$CONDA_ENV"
# ---- Run it...
#
cd $RUN_DIR
ipython $RUN_SCRIPT
```
%% Output
Writing ./run/batch_full_convolutions_B.sh
%% Cell type:code id: tags:
``` python
%%bash
chmod 755 ./run/*.sh
chmod 755 ./run/*.py
ls -l ./run/*full_convolutions*
```
%% Output
-rwxr-xr-x 1 pjluc pjluc 1045 Jan 21 00:15 ./run/batch_full_convolutions_B.sh
-rwxr-xr-x 1 pjluc pjluc 611 Jan 19 15:53 ./run/batch_full_convolutions.sh
-rwxr-xr-x 1 pjluc pjluc 11305 Jan 21 00:13 ./run/full_convolutions_B.py
%% Cell type:raw id: tags:
%%bash
./run/batch_full_convolutions.sh
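%% Cell type:markdown id: tags:
Note : running the script directly, as above, executes it in the current session. On an OAR-based cluster, the script would typically be submitted to the scheduler instead, e.g. with `oarsub -S ./run/batch_full_convolutions_B.sh`, so that the `#OAR` directives are honored (adapt the command to your site's scheduler).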
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
German Traffic Sign Recognition Benchmark (GTSRB)
=================================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## Episode 5.2 : Full Convolutions Reports
Our main steps :
- Show reports
## 1/ Import
%% Cell type:code id: tags:
``` python
import pandas as pd
import os,glob,json
from pathlib import Path
from IPython.display import display, Markdown
```
%% Cell type:markdown id: tags:
## 2/ A nice function
%% Cell type:code id: tags:
``` python
def highlight_max(s):
    '''Return a style highlighting the max of a pandas Series (for use with Styler.apply).
       Note : show_report below uses pandas' built-in Styler.highlight_max instead.'''
    is_max = (s == s.max())
    return ['background-color: yellow' if v else '' for v in is_max]
def show_report(file):
# ---- Read json file
with open(file) as infile:
dict_report = json.load( infile )
output = dict_report['output']
description = dict_report['description']
# ---- about
print("\n\n\nReport : ",Path(file).stem)
print( "Desc. : ",description,'\n')
    # ---- Create a pandas DataFrame
    report = pd.DataFrame(output)
col_accuracy = [ c for c in output.keys() if c.endswith('Accuracy')]
col_duration = [ c for c in output.keys() if c.endswith('Duration')]
# ---- Build formats
lambda_acc = lambda x : '{:.2f} %'.format(x) if (isinstance(x, float)) else '{:}'.format(x)
lambda_dur = lambda x : '{:.1f} s'.format(x) if (isinstance(x, float)) else '{:}'.format(x)
formats = {'Size':'{:.2f} Mo'}
for c in col_accuracy:
formats[c]=lambda_acc
for c in col_duration:
formats[c]=lambda_dur
t=report.style.highlight_max(subset=col_accuracy).format(formats).hide_index()
display(t)
```
%% Cell type:markdown id: tags:
## 3/ Reports display
%% Cell type:code id: tags:
``` python
for file in glob.glob("./run/*.json"):
show_report(file)
```
%% Output
Report : report_009557
Desc. : train_size=1 test_size=1 batch_size=64 epochs=16 data_aug=False
Report : report_020341
Desc. : train_size=1 test_size=1 batch_size=64 epochs=16 data_aug=False
Report : report_041040
Desc. : train_size=1 test_size=1 batch_size=64 epochs=20 data_aug=True
Report : report_088809
Desc. : train_size=1 test_size=1 batch_size=64 epochs=20 data_aug=True
Report : report_093384
Desc. : train_size=1 test_size=1 batch_size=64 epochs=16 data_aug=False
Report : report_094801
Desc. : train_size=1 test_size=1 batch_size=64 epochs=20 data_aug=True
Report : report_2020_01_20_17h22m23s
Desc. : train_size=1 test_size=1 batch_size=64 epochs=16 data_aug=False
%% Cell type:code id: tags:
``` python
```
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
Running Tensorboard from Jupyter lab
====================================
---
Introduction au Deep Learning (IDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
Version : 1.0
%% Cell type:markdown id: tags:
## 1/ Method 1 : Shell command
%% Cell type:code id: tags:
``` python
%%bash
tensorboard_start --logdir ./run/logs
```
%% Cell type:code id: tags:
``` python
%%bash
tensorboard_status
```
%% Cell type:code id: tags:
``` python
%%bash
tensorboard_stop
```
%% Cell type:markdown id: tags:
## 2/ Method 2 : Magic command
**Start**
%% Cell type:code id: tags:
``` python
%load_ext tensorboard
```
%% Cell type:code id: tags:
``` python
%tensorboard --port 21277 --host 0.0.0.0 --logdir ./run/logs
```
%% Cell type:markdown id: tags:
**Stop**
No way... use the bash method (method 1)
## 3/ Method 3 : Tensorboard module
**Start**
%% Cell type:code id: tags:
``` python
import tensorboard.notebook as tsb
```
%% Cell type:code id: tags:
``` python
tsb.start('--port 21277 --host 0.0.0.0 --logdir ./run/logs')
```
%% Cell type:markdown id: tags:
**Check**
%% Cell type:code id: tags:
``` python
a=tsb.list()
```
%% Output
No known TensorBoard instances running.
%% Cell type:markdown id: tags:
**Stop**
No way... use the bash method (method 1)
%% Cell type:code id: tags:
``` python
!kill 214798      # kill the TensorBoard process by its PID (adapt the PID to your own)
```
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
German Traffic Sign Recognition Benchmark (GTSRB)
=================================================
---
Introduction au Deep Learning (IDLE)
S. Arias, E. Maldonado, JL. Parouty
CNRS/SARI/DEVLOG - 2020
Objectives of this practical work
---------------------------------
Traffic sign classification with **CNN**, using Tensorflow and **Keras**
About the dataset
-----------------
Name : [German Traffic Sign Recognition Benchmark (GTSRB)](http://benchmark.ini.rub.de/?section=gtsrb)
Available [here](https://sid.erda.dk/public/archives/daaeac0d7ce1152aea9b61d9f1e19370/published-archive.html)
or on **[kaggle](https://www.kaggle.com/meowmeowmeowmeowmeow/gtsrb-german-traffic-sign)**
A nice example from : [Alex Staravoitau](https://navoshta.com/traffic-signs-classification/)
In a few words :
- Images : Variable dimensions, rgb
- Train set : 39209 images
- Test set : 12630 images
- Classes : 0 to 42
Episodes
--------
**[01 - Preparation of data](01-Preparation-of-data.ipynb)**
- Understanding the dataset
- Preparing and formatting data
- Organize and backup data
**[02 - First convolutions](02-First-convolutions.ipynb)**
- Read dataset
- Build a model
- Train the model
- Model evaluation
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
Text Embedding - IMDB dataset
=============================
---
Formation Introduction au Deep Learning (FIDLE) - S. Arias, E. Maldonado, JL. Parouty - CNRS/SARI/DEVLOG - 2020
## Text classification using **Text embedding** :
The objective is to guess whether film reviews are **positive or negative** based on the analysis of the text.
The original dataset can be found **[there](http://ai.stanford.edu/~amaas/data/sentiment/)**
Note that [IMDb.com](https://imdb.com) offers several easy-to-use [datasets](https://www.imdb.com/interfaces/)
For simplicity's sake, we'll use the dataset directly [embedded in Keras](https://www.tensorflow.org/api_docs/python/tf/keras/datasets)
What we're going to do:
- Retrieve data
- Prepare the data
- Build a model
- Train the model
- Evaluate the result
%% Cell type:markdown id: tags:
## Step 1 - Init python stuff
%% Cell type:code id: tags:
``` python
import numpy as np
import tensorflow as tf
import tensorflow.keras as keras
import tensorflow.keras.datasets.imdb as imdb
import matplotlib.pyplot as plt
import matplotlib
import seaborn as sns
import os,sys,h5py,json
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
```
%% Output
FIDLE 2020 - Practical Work Module
Version : 0.2.8
Run time : Friday 14 February 2020, 17:09:29
TensorFlow version : 2.0.0
Keras version : 2.2.4-tf
%% Cell type:markdown id: tags:
## Step 2 - Retrieve data
**From Keras :**
This IMDb dataset can be retrieved directly from [Keras datasets](https://www.tensorflow.org/api_docs/python/tf/keras/datasets)
Due to their nature, textual data can be somewhat complex.
### 2.1 - Data structure :
The dataset is composed of 2 parts: **reviews** and **opinions** (positive/negative), with a **dictionary**
- dataset = (reviews, opinions)
- reviews = \[ review_0, review_1, ...\]
- review_i = [ int1, int2, ...] where int_i is the index of the word in the dictionary.
- opinions = \[ int0, int1, ...\] where int_j == 0 if the opinion is negative, 1 if it is positive.
- dictionary = \[ word1:int1, word2:int2, ... \]
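%% Cell type:markdown id: tags:
To fix ideas, here is a toy sketch of this structure (made-up values, not the real data) :
%% Cell type:code id: tags:
``` python
# Toy illustration of the dataset structure (illustrative values only):
reviews    = [ [1, 14, 22, 9], [1, 7, 2, 120] ]          # two reviews, words encoded as indices
opinions   = [ 1, 0 ]                                    # first review positive, second negative
dictionary = { '<start>':1, '<unknown>':2, 'film':22 }   # excerpt of a word:index mapping
```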
%% Cell type:markdown id: tags:
### 2.2 - Get dataset
For simplicity, we will use a pre-formatted dataset.
See : https://www.tensorflow.org/api_docs/python/tf/keras/datasets/imdb/load_data
However, Keras offers some useful tools for formatting textual data.
See : https://www.tensorflow.org/api_docs/python/tf/keras/preprocessing/text
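%% Cell type:markdown id: tags:
For instance, a tiny illustration of these tools (a side note, not used in the rest of this notebook) :
%% Cell type:code id: tags:
``` python
# Tiny illustration of the Keras text tools: build a word index from raw text
from tensorflow.keras.preprocessing.text import Tokenizer

tok = Tokenizer(num_words=100)
tok.fit_on_texts(["this film is great", "this film is bad"])
print(tok.texts_to_sequences(["this film is great"]))   # -> [[1, 2, 3, 4]] : words as indices
```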
%% Cell type:code id: tags:
``` python
vocab_size = 10000
# ----- Retrieve x,y
#
(x_train, y_train), (x_test, y_test) = imdb.load_data( num_words = vocab_size,
skip_top = 0,
maxlen = None,
seed = 42,
start_char = 1,
oov_char = 2,
index_from = 3, )
```
%% Cell type:code id: tags:
``` python
print(" Max(x_train,x_test) : ", ooo.rmax([x_train,x_test]) )
print(" x_train : {} y_train : {}".format(x_train.shape, y_train.shape))
print(" x_test : {} y_test : {}".format(x_test.shape, y_test.shape))
print('\nReview example (x_train[12]) :\n\n',x_train[12])
```
%% Output
Max(x_train,x_test) : 9999
x_train : (25000,) y_train : (25000,)
x_test : (25000,) y_test : (25000,)
Review example (x_train[12]) :
[1, 14, 22, 1367, 53, 206, 159, 4, 636, 898, 74, 26, 11, 436, 363, 108, 7, 14, 432, 14, 22, 9, 1055, 34, 8599, 2, 5, 381, 3705, 4509, 14, 768, 47, 839, 25, 111, 1517, 2579, 1991, 438, 2663, 587, 4, 280, 725, 6, 58, 11, 2714, 201, 4, 206, 16, 702, 5, 5176, 19, 480, 5920, 157, 13, 64, 219, 4, 2, 11, 107, 665, 1212, 39, 4, 206, 4, 65, 410, 16, 565, 5, 24, 43, 343, 17, 5602, 8, 169, 101, 85, 206, 108, 8, 3008, 14, 25, 215, 168, 18, 6, 2579, 1991, 438, 2, 11, 129, 1609, 36, 26, 66, 290, 3303, 46, 5, 633, 115, 4363]
%% Cell type:markdown id: tags:
### 2.3 - Have a look for humans (optional)
When we loaded the dataset, we asked to use \<start\> as 1 and \<unknown word\> as 2.
So, the word indices were shifted by 3, via the parameter index_from=3.
%% Cell type:code id: tags:
``` python
# ---- Retrieve dictionary {word:index}, and encode it in ascii
word_index = imdb.get_word_index()
# ---- Shift the dictionary from +3
word_index = {w:(i+3) for w,i in word_index.items()}
# ---- Add <pad>, <start> and unknown tags
word_index.update( {'<pad>':0, '<start>':1, '<unknown>':2} )
# ---- Create a reverse dictionary : {index:word}
index_word = {index:word for word,index in word_index.items()}
# ---- Add a nice function to transpose :
#
def dataset2text(review):
return ' '.join([index_word.get(i, '?') for i in review])
```
%% Cell type:code id: tags:
``` python
print('\nDictionary size : ', len(word_index))
print('\nReview example (x_train[12]) :\n\n',x_train[12])
print('\nIn real words :\n\n', dataset2text(x_train[12]))
```
%% Output
Dictionary size : 88587
Review example (x_train[12]) :
[1, 14, 22, 1367, 53, 206, 159, 4, 636, 898, 74, 26, 11, 436, 363, 108, 7, 14, 432, 14, 22, 9, 1055, 34, 8599, 2, 5, 381, 3705, 4509, 14, 768, 47, 839, 25, 111, 1517, 2579, 1991, 438, 2663, 587, 4, 280, 725, 6, 58, 11, 2714, 201, 4, 206, 16, 702, 5, 5176, 19, 480, 5920, 157, 13, 64, 219, 4, 2, 11, 107, 665, 1212, 39, 4, 206, 4, 65, 410, 16, 565, 5, 24, 43, 343, 17, 5602, 8, 169, 101, 85, 206, 108, 8, 3008, 14, 25, 215, 168, 18, 6, 2579, 1991, 438, 2, 11, 129, 1609, 36, 26, 66, 290, 3303, 46, 5, 633, 115, 4363]
In real words :
<start> this film contains more action before the opening credits than are in entire hollywood films of this sort this film is produced by tsui <unknown> and stars jet li this team has brought you many worthy hong kong cinema productions including the once upon a time in china series the action was fast and furious with amazing wire work i only saw the <unknown> in two shots aside from the action the story itself was strong and not just used as filler to find any other action films to rival this you must look for a hong kong cinema <unknown> in your area they are really worth checking out and usually never disappoint
%% Cell type:markdown id: tags:
### 2.4 - Have a look for neurons
%% Cell type:code id: tags:
``` python
plt.figure(figsize=(12, 6))
ax=sns.distplot([len(i) for i in x_train],bins=60)
ax.set_title('Distribution of reviews by size')
plt.xlabel("Review's sizes")
plt.ylabel('Density')
ax.set_xlim(0, 1500)
plt.show()
```
%% Output
%% Cell type:markdown id: tags:
## Step 3 - Preprocess the data
In order to be processed by an NN, all entries must have the same length.
We chose a review length of **review_len**
We will therefore complete them with a padding (of \<pad\>)
%% Cell type:code id: tags:
``` python
review_len = 256
x_train = keras.preprocessing.sequence.pad_sequences(x_train,
value = 0,
padding = 'post',
maxlen = review_len)
x_test = keras.preprocessing.sequence.pad_sequences(x_test,
value = 0 ,
padding = 'post',
maxlen = review_len)
print('\nReview example (x_train[12]) :\n\n',x_train[12])
print('\nIn real words :\n\n', dataset2text(x_train[12]))
```
%% Output
Review example (x_train[12]) :
[ 1 14 22 1367 53 206 159 4 636 898 74 26 11 436
363 108 7 14 432 14 22 9 1055 34 8599 2 5 381
3705 4509 14 768 47 839 25 111 1517 2579 1991 438 2663 587
4 280 725 6 58 11 2714 201 4 206 16 702 5 5176
19 480 5920 157 13 64 219 4 2 11 107 665 1212 39
4 206 4 65 410 16 565 5 24 43 343 17 5602 8
169 101 85 206 108 8 3008 14 25 215 168 18 6 2579
1991 438 2 11 129 1609 36 26 66 290 3303 46 5 633
115 4363 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0]
In real words :
<start> this film contains more action before the opening credits than are in entire hollywood films of this sort this film is produced by tsui <unknown> and stars jet li this team has brought you many worthy hong kong cinema productions including the once upon a time in china series the action was fast and furious with amazing wire work i only saw the <unknown> in two shots aside from the action the story itself was strong and not just used as filler to find any other action films to rival this you must look for a hong kong cinema <unknown> in your area they are really worth checking out and usually never disappoint <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad> <pad>
%% Cell type:markdown id: tags:
### Save dataset and dictionary (can be useful)
%% Cell type:code id: tags:
``` python
os.makedirs('./data', mode=0o750, exist_ok=True)
with h5py.File('./data/dataset_imdb.h5', 'w') as f:
f.create_dataset("x_train", data=x_train)
f.create_dataset("y_train", data=y_train)
f.create_dataset("x_test", data=x_test)
f.create_dataset("y_test", data=y_test)
with open('./data/word_index.json', 'w') as fp:
json.dump(word_index, fp)
with open('./data/index_word.json', 'w') as fp:
json.dump(index_word, fp)
print('Saved.')
```
%% Output
Saved.
%% Cell type:markdown id: tags:
## Step 4 - Build the model
A few remarks :
1. We'll choose a dense vector size for the embedding output with **dense_vector_size**
2. **GlobalAveragePooling1D** performs an average pooling over the sequence dimension : (None, lx, ly) -> (None, ly)
In other words : we average the set of vectors/words of a sentence
3. The Keras embedding layer is trained in a supervised way. It is a layer going from *vocab_size* inputs to *n_neurons* outputs, whose role is to maintain a table of vectors (the weights are the vectors). This layer does not compute an output the way ordinary layers do : it returns the values of the vectors. n words => n vectors (which are then stacked by the pooling layer)
See : https://stats.stackexchange.com/questions/324992/how-the-embedding-layer-is-trained-in-keras-embedding-layer
Further reading : https://www.liip.ch/en/blog/sentiment-detection-with-keras-word-embeddings-and-lstm-deep-learning-networks
To make remark 2 concrete, here is a minimal shape-check sketch (toy sizes, an illustration only, not part of the training pipeline) :
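%% Cell type:code id: tags:
``` python
# Minimal shape-check sketch (toy sizes, illustration only):
# an Embedding turns n word indices into n dense vectors, and
# GlobalAveragePooling1D averages them into a single sentence vector.
import numpy as np
import tensorflow as tf

emb  = tf.keras.layers.Embedding(input_dim=10, output_dim=4)
pool = tf.keras.layers.GlobalAveragePooling1D()

x = np.array([[1, 2, 3, 0, 0]])   # one "review" of 5 word indices
v = emb(x)                        # -> shape (1, 5, 4) : one vector per word
p = pool(v)                       # -> shape (1, 4)    : average over the 5 words
print(v.shape, p.shape)
```
%% Cell type:markdown id: tags: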
### 4.1 - Build
More documentation about :
- [Embedding](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Embedding)
- [GlobalAveragePooling1D](https://www.tensorflow.org/api_docs/python/tf/keras/layers/GlobalAveragePooling1D)
%% Cell type:code id: tags:
``` python
def get_model(dense_vector_size=32):
model = keras.Sequential()
model.add(keras.layers.Embedding(input_dim = vocab_size,
output_dim = dense_vector_size,
input_length = review_len))
model.add(keras.layers.GlobalAveragePooling1D())
model.add(keras.layers.Dense(dense_vector_size, activation='relu'))
model.add(keras.layers.Dense(1, activation='sigmoid'))
model.compile(optimizer = 'adam',
loss = 'binary_crossentropy',
metrics = ['accuracy'])
return model
```
%% Cell type:markdown id: tags:
## Step 5 - Train the model
### 5.1 - Get it
%% Cell type:code id: tags:
``` python
model = get_model(32)
model.summary()
```
%% Output
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
embedding (Embedding) (None, 256, 32) 320000
_________________________________________________________________
global_average_pooling1d (Gl (None, 32) 0
_________________________________________________________________
dense (Dense) (None, 32) 1056
_________________________________________________________________
dense_1 (Dense) (None, 1) 33
=================================================================
Total params: 321,089
Trainable params: 321,089
Non-trainable params: 0
_________________________________________________________________
%% Cell type:markdown id: tags:
### 5.2 - Add callback
%% Cell type:code id: tags:
``` python
os.makedirs('./run/models', mode=0o750, exist_ok=True)
save_dir = "./run/models/best_model.h5"
savemodel_callback = tf.keras.callbacks.ModelCheckpoint(filepath=save_dir, verbose=0, save_best_only=True)
```
%% Cell type:markdown id: tags:
### 5.3 - Train it
%% Cell type:code id: tags:
``` python
%%time
n_epochs = 30
batch_size = 512
history = model.fit(x_train,
y_train,
epochs = n_epochs,
batch_size = batch_size,
validation_data = (x_test, y_test),
verbose = 1,
callbacks = [savemodel_callback])
```
%% Output
Train on 25000 samples, validate on 25000 samples
Epoch 1/30
25000/25000 [==============================] - 2s 78us/sample - loss: 0.6901 - accuracy: 0.5984 - val_loss: 0.6838 - val_accuracy: 0.7333
Epoch 2/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.6646 - accuracy: 0.7538 - val_loss: 0.6378 - val_accuracy: 0.7683
Epoch 3/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.5878 - accuracy: 0.7973 - val_loss: 0.5433 - val_accuracy: 0.7966
Epoch 4/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.4778 - accuracy: 0.8386 - val_loss: 0.4481 - val_accuracy: 0.8300
Epoch 5/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.3860 - accuracy: 0.8691 - val_loss: 0.3815 - val_accuracy: 0.8550
Epoch 6/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.3264 - accuracy: 0.8856 - val_loss: 0.3439 - val_accuracy: 0.8643
Epoch 7/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.2883 - accuracy: 0.8956 - val_loss: 0.3223 - val_accuracy: 0.8695
Epoch 8/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.2602 - accuracy: 0.9052 - val_loss: 0.3083 - val_accuracy: 0.8748
Epoch 9/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.2388 - accuracy: 0.9122 - val_loss: 0.2981 - val_accuracy: 0.8770
Epoch 10/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.2201 - accuracy: 0.9197 - val_loss: 0.2928 - val_accuracy: 0.8791
Epoch 11/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.2050 - accuracy: 0.9261 - val_loss: 0.2898 - val_accuracy: 0.8809
Epoch 12/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.1917 - accuracy: 0.9308 - val_loss: 0.2872 - val_accuracy: 0.8823
Epoch 13/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1799 - accuracy: 0.9356 - val_loss: 0.2903 - val_accuracy: 0.8799
Epoch 14/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1699 - accuracy: 0.9405 - val_loss: 0.2890 - val_accuracy: 0.8824
Epoch 15/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.1599 - accuracy: 0.9455 - val_loss: 0.2925 - val_accuracy: 0.8820
Epoch 16/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1517 - accuracy: 0.9484 - val_loss: 0.2953 - val_accuracy: 0.8816
Epoch 17/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1436 - accuracy: 0.9517 - val_loss: 0.2996 - val_accuracy: 0.8817
Epoch 18/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1366 - accuracy: 0.9546 - val_loss: 0.3053 - val_accuracy: 0.8790
Epoch 19/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1299 - accuracy: 0.9575 - val_loss: 0.3104 - val_accuracy: 0.8798
Epoch 20/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1246 - accuracy: 0.9589 - val_loss: 0.3181 - val_accuracy: 0.8756
Epoch 21/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1180 - accuracy: 0.9617 - val_loss: 0.3233 - val_accuracy: 0.8760
Epoch 22/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1131 - accuracy: 0.9644 - val_loss: 0.3305 - val_accuracy: 0.8750
Epoch 23/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1075 - accuracy: 0.9668 - val_loss: 0.3379 - val_accuracy: 0.8745
Epoch 24/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.1029 - accuracy: 0.9684 - val_loss: 0.3462 - val_accuracy: 0.8723
Epoch 25/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.0976 - accuracy: 0.9711 - val_loss: 0.3564 - val_accuracy: 0.8726
Epoch 26/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.0940 - accuracy: 0.9728 - val_loss: 0.3627 - val_accuracy: 0.8716
Epoch 27/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.0893 - accuracy: 0.9750 - val_loss: 0.3719 - val_accuracy: 0.8695
Epoch 28/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.0852 - accuracy: 0.9763 - val_loss: 0.3814 - val_accuracy: 0.8692
Epoch 29/30
25000/25000 [==============================] - 1s 35us/sample - loss: 0.0816 - accuracy: 0.9778 - val_loss: 0.3910 - val_accuracy: 0.8682
Epoch 30/30
25000/25000 [==============================] - 1s 34us/sample - loss: 0.0783 - accuracy: 0.9795 - val_loss: 0.4004 - val_accuracy: 0.8673
CPU times: user 45.5 s, sys: 2.38 s, total: 47.8 s
Wall time: 27.2 s
%% Cell type:markdown id: tags:
## Step 6 - Evaluate
### 6.1 - Training history
%% Cell type:code id: tags:
``` python
ooo.plot_history(history)
```
%% Output
%% Cell type:markdown id: tags:
### 6.2 - Reload and evaluate best model
%% Cell type:code id: tags:
``` python
model = keras.models.load_model('./run/models/best_model.h5')
# ---- Evaluate
reload(ooo)
score = model.evaluate(x_test, y_test, verbose=0)
print('x_test / loss : {:5.4f}'.format(score[0]))
print('x_test / accuracy : {:5.4f}'.format(score[1]))
values=[score[1], 1-score[1]]
ooo.plot_donut(values,["Accuracy","Errors"], title="#### Accuracy donut is :")
# ---- Confusion matrix
y_pred = model.predict_classes(x_test)
# ooo.display_confusion_matrix(y_test,y_pred,labels=range(2),color='orange',font_size='20pt')
ooo.display_confusion_matrix(y_test,y_pred,labels=range(2))
```
%% Output
x_test / loss : 0.2872
x_test / accuracy : 0.8823
#### Accuracy donut is :
#### Confusion matrix is :
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-20-362e872d02fc> in <module>
16
17 # ooo.display_confusion_matrix(y_test,y_pred,labels=range(2),color='orange',font_size='20pt')
---> 18 ooo.display_confusion_matrix(y_test,y_pred,labels=range(2))
/gpfsdswork/projects/rech/mlh/uja62cb/fidle/fidle/pwk.py in display_confusion_matrix(y_true, y_pred, labels, color, font_size, title)
323 if title != None : display(Markdown(title))
324
--> 325 cm = confusion_matrix( y_true,y_pred, normalize="true", labels=labels)
326 df=pd.DataFrame(cm)
327
TypeError: confusion_matrix() got an unexpected keyword argument 'normalize'
%% Cell type:code id: tags:
``` python
```
%% Cell type:code id: tags:
``` python
```