Commit ddd1efa8 authored by Jean-Luc Parouty

Update README.md

Former-commit-id: 7e8be825
parent 31744771
%% Cell type:markdown id: tags:
[<img width="800px" src="../fidle/img/00-Fidle-header-01.svg"></img>](#)
# <!-- TITLE --> Regression with a Dense Network (DNN)
<!-- DESC --> A simple regression with a Dense Neural Network (DNN) - BHPD dataset
<!-- AUTHOR : Jean-Luc Parouty (CNRS/SIMaP) -->
## Objectives :
- Predict **housing prices** from a set of house features.
- Understand the **principle** and the **architecture** of a regression with a **dense neural network**

The **[Boston Housing Dataset](https://www.cs.toronto.edu/~delve/data/boston/bostonDetail.html)** consists of the prices of houses in various places in Boston.
Alongside the price, the dataset also provides information such as crime rate, the area of non-retail business in the town, the age of people who own the house, and many other attributes...
## What we're going to do :
- Retrieve data
- Prepare the data
- Build a model
- Train the model
- Evaluate the result
%% Cell type:markdown id: tags:
## Step 1 - Import and init
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import os,sys
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
```
%% Output
FIDLE 2020 - Practical Work Module
Version : 0.2.9
Run time : Wednesday 19 February 2020, 09:49:10
TensorFlow version : 2.0.0
Keras version : 2.2.4-tf
%% Cell type:markdown id: tags:
## Step 2 - Retrieve data
### 2.1 - Option 1 : From Keras
Boston housing is a famous historic dataset, so we can get it directly from [Keras datasets](https://www.tensorflow.org/api_docs/python/tf/keras/datasets)
%% Cell type:raw id: tags:
(x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data(test_split=0.2, seed=113)
%% Cell type:markdown id: tags:
### 2.2 - Option 2 : From a csv file
More fun !
%% Cell type:code id: tags:
``` python
data = pd.read_csv('./data/BostonHousing.csv', header=0)
display(data.head(5).style.format("{0:.2f}"))
print('Missing data : ', data.isna().sum().sum(), ' Shape is : ', data.shape)
```
%% Output
Missing data : 0 Shape is : (506, 14)
%% Cell type:markdown id: tags:
## Step 3 - Preparing the data
### 3.1 - Split data
We will use 70% of the data for training and 30% for validation.
x will be the input data and y the expected output.
%% Cell type:code id: tags:
``` python
# ---- Split => train, test
#
data_train = data.sample(frac=0.7, axis=0)
data_test  = data.drop(data_train.index)

# ---- Split => x,y (medv is price)
#
x_train = data_train.drop('medv', axis=1)
y_train = data_train['medv']
x_test  = data_test.drop('medv', axis=1)
y_test  = data_test['medv']

print('Original data shape was : ', data.shape)
print('x_train : ', x_train.shape, 'y_train : ', y_train.shape)
print('x_test  : ', x_test.shape, 'y_test  : ', y_test.shape)
```
%% Output
Original data shape was : (506, 14)
x_train : (354, 13) y_train : (354,)
x_test : (152, 13) y_test : (152,)
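%% Cell type:markdown id: tags:
Note : `data.sample()` draws a different split at every run. A minimal variant, assuming we want reproducible experiments, is to fix the random seed (the value 42 is arbitrary) :
%% Cell type:code id: tags:
``` python
# Hypothetical variant : fixing random_state makes the train/test
# split identical from one run to the next
data_train = data.sample(frac=0.7, axis=0, random_state=42)
data_test  = data.drop(data_train.index)
```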
%% Cell type:markdown id: tags:
### 3.2 - Data normalization
**Note :**
- All input data must be normalized, train and test.
- To do this we will **subtract the mean** and **divide by the standard deviation**.
- But test data must not be used in any way, even for normalization.
- The mean and the standard deviation will therefore only be calculated with the train data.
%% Cell type:code id: tags:
``` python
display(x_train.describe().style.format("{0:.2f}").set_caption("Before normalization :"))

mean = x_train.mean()
std  = x_train.std()
x_train = (x_train - mean) / std
x_test  = (x_test  - mean) / std

display(x_train.describe().style.format("{0:.2f}").set_caption("After normalization :"))

x_train, y_train = np.array(x_train), np.array(y_train)
x_test,  y_test  = np.array(x_test),  np.array(y_test)
```
%% Output
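%% Cell type:markdown id: tags:
As an aside, scikit-learn (listed in the installation requirements) provides the same train-only normalization through `StandardScaler` ; a minimal sketch, equivalent up to the ddof convention of the standard deviation :
%% Cell type:code id: tags:
``` python
from sklearn.preprocessing import StandardScaler

scaler  = StandardScaler()
x_train = scaler.fit_transform(x_train)   # statistics computed on train data only
x_test  = scaler.transform(x_test)        # the same statistics are reused for test
```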
%% Cell type:markdown id: tags:
## Step 4 - Build a model
More information about :
- [Optimizer](https://www.tensorflow.org/api_docs/python/tf/keras/optimizers)
- [Activation](https://www.tensorflow.org/api_docs/python/tf/keras/activations)
- [Loss](https://www.tensorflow.org/api_docs/python/tf/keras/losses)
- [Metrics](https://www.tensorflow.org/api_docs/python/tf/keras/metrics)
%% Cell type:code id: tags:
``` python
def get_model_v1(shape):
    model = keras.models.Sequential()
    model.add(keras.layers.Input(shape, name="InputLayer"))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n1'))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n2'))
    model.add(keras.layers.Dense(1, name='Output'))
    model.compile(optimizer = 'rmsprop',
                  loss      = 'mse',
                  metrics   = ['mae', 'mse'] )
    return model
```
%% Cell type:markdown id: tags:
## Step 5 - Train the model
### 5.1 - Get it
%% Cell type:code id: tags:
``` python
model=get_model_v1( (13,) )
model.summary()
keras.utils.plot_model( model, to_file='./run/model.png', show_shapes=True, show_layer_names=True, dpi=96)
```
%% Output
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
Dense_n1 (Dense)             (None, 64)                896
_________________________________________________________________
Dense_n2 (Dense)             (None, 64)                4160
_________________________________________________________________
Output (Dense)               (None, 1)                 65
=================================================================
Total params: 5,121
Trainable params: 5,121
Non-trainable params: 0
_________________________________________________________________
<IPython.core.display.Image object>
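%% Cell type:markdown id: tags:
The parameter counts above can be checked by hand : each Dense layer has `inputs × units` weights plus `units` biases.
%% Cell type:code id: tags:
``` python
# Dense layer parameters = inputs*units + units (weights + biases)
print(13*64 + 64)   # Dense_n1 : 896
print(64*64 + 64)   # Dense_n2 : 4160
print(64*1  + 1)    # Output   : 65
```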
%% Cell type:markdown id: tags:
### 5.2 - Train it
%% Cell type:code id: tags:
``` python
history = model.fit(x_train,
                    y_train,
                    epochs          = 100,
                    batch_size      = 10,
                    verbose         = 1,
                    validation_data = (x_test, y_test))
```
%% Output
Train on 354 samples, validate on 152 samples
Epoch 1/100
354/354 [==============================] - 1s 2ms/sample - loss: 536.0845 - mae: 21.3335 - mse: 536.0846 - val_loss: 439.6562 - val_mae: 19.3198 - val_mse: 439.6562
Epoch 2/100
354/354 [==============================] - 0s 216us/sample - loss: 354.0647 - mae: 16.8618 - mse: 354.0648 - val_loss: 231.3198 - val_mae: 13.5154 - val_mse: 231.3199
Epoch 3/100
354/354 [==============================] - 0s 194us/sample - loss: 155.7450 - mae: 9.9432 - mse: 155.7450 - val_loss: 69.8093 - val_mae: 6.2267 - val_mse: 69.8093
Epoch 4/100
354/354 [==============================] - 0s 170us/sample - loss: 55.4497 - mae: 5.2375 - mse: 55.4497 - val_loss: 28.5090 - val_mae: 4.0794 - val_mse: 28.5090
Epoch 5/100
354/354 [==============================] - 0s 172us/sample - loss: 31.6844 - mae: 4.0017 - mse: 31.6844 - val_loss: 21.9792 - val_mae: 3.3949 - val_mse: 21.9792
Epoch 6/100
354/354 [==============================] - 0s 175us/sample - loss: 24.5126 - mae: 3.4343 - mse: 24.5126 - val_loss: 18.8066 - val_mae: 3.1393 - val_mse: 18.8066
Epoch 7/100
354/354 [==============================] - 0s 176us/sample - loss: 21.5744 - mae: 3.2008 - mse: 21.5744 - val_loss: 16.6019 - val_mae: 3.0136 - val_mse: 16.6019
Epoch 8/100
354/354 [==============================] - 0s 174us/sample - loss: 19.6449 - mae: 3.0134 - mse: 19.6449 - val_loss: 15.8376 - val_mae: 2.9888 - val_mse: 15.8376
Epoch 9/100
354/354 [==============================] - 0s 170us/sample - loss: 18.6252 - mae: 2.9144 - mse: 18.6252 - val_loss: 15.3001 - val_mae: 2.9692 - val_mse: 15.3001
Epoch 10/100
354/354 [==============================] - 0s 173us/sample - loss: 17.0981 - mae: 2.7810 - mse: 17.0981 - val_loss: 14.8818 - val_mae: 2.9166 - val_mse: 14.8818
Epoch 11/100
354/354 [==============================] - 0s 169us/sample - loss: 16.0782 - mae: 2.6914 - mse: 16.0782 - val_loss: 14.3696 - val_mae: 2.8419 - val_mse: 14.3696
Epoch 12/100
354/354 [==============================] - 0s 174us/sample - loss: 15.5677 - mae: 2.6683 - mse: 15.5677 - val_loss: 13.9912 - val_mae: 2.8576 - val_mse: 13.9912
Epoch 13/100
354/354 [==============================] - 0s 185us/sample - loss: 14.8428 - mae: 2.5991 - mse: 14.8428 - val_loss: 14.3104 - val_mae: 2.8784 - val_mse: 14.3104
Epoch 14/100
354/354 [==============================] - 0s 174us/sample - loss: 14.3035 - mae: 2.5320 - mse: 14.3035 - val_loss: 13.7014 - val_mae: 2.7929 - val_mse: 13.7014
Epoch 15/100
354/354 [==============================] - 0s 174us/sample - loss: 13.6874 - mae: 2.4875 - mse: 13.6874 - val_loss: 13.2517 - val_mae: 2.7346 - val_mse: 13.2517
Epoch 16/100
354/354 [==============================] - 0s 169us/sample - loss: 13.3831 - mae: 2.4476 - mse: 13.3831 - val_loss: 13.0551 - val_mae: 2.7135 - val_mse: 13.0551
Epoch 17/100
354/354 [==============================] - 0s 173us/sample - loss: 13.1403 - mae: 2.4844 - mse: 13.1403 - val_loss: 13.0990 - val_mae: 2.6770 - val_mse: 13.0990
Epoch 18/100
354/354 [==============================] - 0s 167us/sample - loss: 12.7370 - mae: 2.3913 - mse: 12.7370 - val_loss: 12.6409 - val_mae: 2.6264 - val_mse: 12.6409
Epoch 19/100
354/354 [==============================] - 0s 175us/sample - loss: 12.3546 - mae: 2.3600 - mse: 12.3546 - val_loss: 12.5174 - val_mae: 2.7141 - val_mse: 12.5174
Epoch 20/100
354/354 [==============================] - 0s 166us/sample - loss: 12.1547 - mae: 2.3828 - mse: 12.1547 - val_loss: 12.1408 - val_mae: 2.6063 - val_mse: 12.1408
Epoch 21/100
354/354 [==============================] - 0s 179us/sample - loss: 11.8888 - mae: 2.3270 - mse: 11.8888 - val_loss: 11.9719 - val_mae: 2.5967 - val_mse: 11.9719
Epoch 22/100
354/354 [==============================] - 0s 189us/sample - loss: 11.6794 - mae: 2.3303 - mse: 11.6794 - val_loss: 11.8047 - val_mae: 2.5511 - val_mse: 11.8047
Epoch 23/100
354/354 [==============================] - 0s 170us/sample - loss: 11.3378 - mae: 2.3021 - mse: 11.3378 - val_loss: 12.4017 - val_mae: 2.6941 - val_mse: 12.4017
Epoch 24/100
354/354 [==============================] - 0s 186us/sample - loss: 10.9016 - mae: 2.3034 - mse: 10.9016 - val_loss: 12.3386 - val_mae: 2.5292 - val_mse: 12.3386
Epoch 25/100
354/354 [==============================] - 0s 202us/sample - loss: 10.7163 - mae: 2.3021 - mse: 10.7163 - val_loss: 12.2563 - val_mae: 2.5674 - val_mse: 12.2563
Epoch 26/100
354/354 [==============================] - 0s 192us/sample - loss: 10.8481 - mae: 2.2104 - mse: 10.8481 - val_loss: 11.2348 - val_mae: 2.4873 - val_mse: 11.2348
Epoch 27/100
354/354 [==============================] - 0s 192us/sample - loss: 10.7446 - mae: 2.2232 - mse: 10.7446 - val_loss: 11.4269 - val_mae: 2.5686 - val_mse: 11.4269
Epoch 28/100
354/354 [==============================] - 0s 187us/sample - loss: 10.1381 - mae: 2.1918 - mse: 10.1381 - val_loss: 13.4143 - val_mae: 2.6246 - val_mse: 13.4143
Epoch 29/100
354/354 [==============================] - 0s 176us/sample - loss: 10.5442 - mae: 2.1971 - mse: 10.5442 - val_loss: 11.4616 - val_mae: 2.4741 - val_mse: 11.4616
Epoch 30/100
354/354 [==============================] - 0s 218us/sample - loss: 10.2099 - mae: 2.1867 - mse: 10.2099 - val_loss: 11.4631 - val_mae: 2.4684 - val_mse: 11.4631
Epoch 31/100
354/354 [==============================] - 0s 202us/sample - loss: 9.5920 - mae: 2.1342 - mse: 9.5920 - val_loss: 12.5109 - val_mae: 2.6033 - val_mse: 12.5109
Epoch 32/100
354/354 [==============================] - 0s 179us/sample - loss: 9.9940 - mae: 2.1424 - mse: 9.9940 - val_loss: 11.1528 - val_mae: 2.4392 - val_mse: 11.1528
Epoch 33/100
354/354 [==============================] - 0s 197us/sample - loss: 9.5950 - mae: 2.1156 - mse: 9.5950 - val_loss: 12.0327 - val_mae: 2.6225 - val_mse: 12.0327
Epoch 34/100
354/354 [==============================] - 0s 228us/sample - loss: 9.6256 - mae: 2.0962 - mse: 9.6256 - val_loss: 10.8296 - val_mae: 2.4168 - val_mse: 10.8296
Epoch 35/100
354/354 [==============================] - 0s 179us/sample - loss: 9.3365 - mae: 2.1271 - mse: 9.3365 - val_loss: 10.7088 - val_mae: 2.5094 - val_mse: 10.7088
Epoch 36/100
354/354 [==============================] - 0s 184us/sample - loss: 9.2796 - mae: 2.0914 - mse: 9.2796 - val_loss: 10.7439 - val_mae: 2.4282 - val_mse: 10.7439
Epoch 37/100
354/354 [==============================] - 0s 186us/sample - loss: 8.7178 - mae: 2.0390 - mse: 8.7178 - val_loss: 13.1923 - val_mae: 2.5942 - val_mse: 13.1923
Epoch 38/100
354/354 [==============================] - 0s 202us/sample - loss: 8.8195 - mae: 2.0927 - mse: 8.8195 - val_loss: 10.9034 - val_mae: 2.5152 - val_mse: 10.9034
Epoch 39/100
354/354 [==============================] - 0s 190us/sample - loss: 8.9152 - mae: 2.0784 - mse: 8.9152 - val_loss: 11.3023 - val_mae: 2.4404 - val_mse: 11.3023
Epoch 40/100
354/354 [==============================] - 0s 196us/sample - loss: 8.8418 - mae: 2.0187 - mse: 8.8418 - val_loss: 10.7721 - val_mae: 2.5067 - val_mse: 10.7721
Epoch 41/100
354/354 [==============================] - 0s 181us/sample - loss: 8.6890 - mae: 2.0260 - mse: 8.6890 - val_loss: 11.0856 - val_mae: 2.5693 - val_mse: 11.0856
Epoch 42/100
354/354 [==============================] - 0s 174us/sample - loss: 8.4768 - mae: 2.0517 - mse: 8.4768 - val_loss: 11.3269 - val_mae: 2.4414 - val_mse: 11.3269
Epoch 43/100
354/354 [==============================] - 0s 171us/sample - loss: 8.5229 - mae: 1.9943 - mse: 8.5229 - val_loss: 10.4669 - val_mae: 2.4794 - val_mse: 10.4669
Epoch 44/100
354/354 [==============================] - 0s 172us/sample - loss: 8.0707 - mae: 1.9900 - mse: 8.0707 - val_loss: 11.6943 - val_mae: 2.5034 - val_mse: 11.6943
Epoch 45/100
354/354 [==============================] - 0s 172us/sample - loss: 8.1752 - mae: 1.9715 - mse: 8.1752 - val_loss: 10.6043 - val_mae: 2.3636 - val_mse: 10.6043
Epoch 46/100
354/354 [==============================] - 0s 174us/sample - loss: 8.2037 - mae: 1.9739 - mse: 8.2037 - val_loss: 10.5447 - val_mae: 2.3784 - val_mse: 10.5447
Epoch 47/100
354/354 [==============================] - 0s 173us/sample - loss: 7.9866 - mae: 1.9744 - mse: 7.9866 - val_loss: 10.6746 - val_mae: 2.4501 - val_mse: 10.6746
Epoch 48/100
354/354 [==============================] - 0s 165us/sample - loss: 7.7703 - mae: 1.9705 - mse: 7.7703 - val_loss: 10.4041 - val_mae: 2.4620 - val_mse: 10.4041
Epoch 49/100
354/354 [==============================] - 0s 182us/sample - loss: 7.8774 - mae: 1.9809 - mse: 7.8774 - val_loss: 10.6823 - val_mae: 2.4969 - val_mse: 10.6823
Epoch 50/100
354/354 [==============================] - 0s 167us/sample - loss: 7.8654 - mae: 1.9666 - mse: 7.8654 - val_loss: 10.6351 - val_mae: 2.4191 - val_mse: 10.6351
Epoch 51/100
354/354 [==============================] - 0s 180us/sample - loss: 7.6560 - mae: 1.9236 - mse: 7.6560 - val_loss: 10.3918 - val_mae: 2.3943 - val_mse: 10.3918
Epoch 52/100
354/354 [==============================] - 0s 170us/sample - loss: 7.3560 - mae: 1.8763 - mse: 7.3560 - val_loss: 10.3560 - val_mae: 2.5009 - val_mse: 10.3560
Epoch 53/100
354/354 [==============================] - 0s 163us/sample - loss: 7.5076 - mae: 1.8973 - mse: 7.5076 - val_loss: 10.5798 - val_mae: 2.4698 - val_mse: 10.5798
Epoch 54/100
354/354 [==============================] - 0s 164us/sample - loss: 7.4315 - mae: 1.8962 - mse: 7.4315 - val_loss: 10.0018 - val_mae: 2.3756 - val_mse: 10.0018
Epoch 55/100
354/354 [==============================] - 0s 170us/sample - loss: 7.2476 - mae: 1.9127 - mse: 7.2476 - val_loss: 10.0664 - val_mae: 2.4074 - val_mse: 10.0664
Epoch 56/100
354/354 [==============================] - 0s 168us/sample - loss: 7.1336 - mae: 1.8297 - mse: 7.1336 - val_loss: 10.5519 - val_mae: 2.4670 - val_mse: 10.5519
Epoch 57/100
354/354 [==============================] - 0s 177us/sample - loss: 7.0707 - mae: 1.8462 - mse: 7.0707 - val_loss: 11.4684 - val_mae: 2.7035 - val_mse: 11.4684
Epoch 58/100
354/354 [==============================] - 0s 173us/sample - loss: 6.9632 - mae: 1.8780 - mse: 6.9632 - val_loss: 10.6361 - val_mae: 2.4145 - val_mse: 10.6361
Epoch 59/100
354/354 [==============================] - 0s 208us/sample - loss: 7.1218 - mae: 1.8522 - mse: 7.1218 - val_loss: 10.3080 - val_mae: 2.3628 - val_mse: 10.3080
Epoch 60/100
354/354 [==============================] - 0s 261us/sample - loss: 6.7623 - mae: 1.7823 - mse: 6.7623 - val_loss: 10.3923 - val_mae: 2.3174 - val_mse: 10.3923
Epoch 61/100
354/354 [==============================] - 0s 166us/sample - loss: 6.9012 - mae: 1.8504 - mse: 6.9012 - val_loss: 10.1488 - val_mae: 2.3802 - val_mse: 10.1488
Epoch 62/100
354/354 [==============================] - 0s 171us/sample - loss: 6.6419 - mae: 1.8210 - mse: 6.6419 - val_loss: 10.7578 - val_mae: 2.5222 - val_mse: 10.7578
Epoch 63/100
354/354 [==============================] - 0s 181us/sample - loss: 6.5397 - mae: 1.8096 - mse: 6.5397 - val_loss: 10.5892 - val_mae: 2.5217 - val_mse: 10.5892
Epoch 64/100
354/354 [==============================] - 0s 171us/sample - loss: 6.4273 - mae: 1.7990 - mse: 6.4273 - val_loss: 10.7066 - val_mae: 2.4491 - val_mse: 10.7066
Epoch 65/100
354/354 [==============================] - 0s 164us/sample - loss: 6.2635 - mae: 1.7888 - mse: 6.2635 - val_loss: 10.2444 - val_mae: 2.4960 - val_mse: 10.2444
Epoch 66/100
354/354 [==============================] - 0s 173us/sample - loss: 6.3313 - mae: 1.7769 - mse: 6.3313 - val_loss: 10.1284 - val_mae: 2.3855 - val_mse: 10.1284
Epoch 67/100
354/354 [==============================] - 0s 169us/sample - loss: 6.2141 - mae: 1.7620 - mse: 6.2141 - val_loss: 10.3170 - val_mae: 2.4570 - val_mse: 10.3170
Epoch 68/100
354/354 [==============================] - 0s 183us/sample - loss: 6.1732 - mae: 1.7589 - mse: 6.1732 - val_loss: 9.7494 - val_mae: 2.3912 - val_mse: 9.7494
Epoch 69/100
354/354 [==============================] - 0s 173us/sample - loss: 6.1812 - mae: 1.7704 - mse: 6.1812 - val_loss: 10.7702 - val_mae: 2.3626 - val_mse: 10.7702
Epoch 70/100
354/354 [==============================] - 0s 171us/sample - loss: 6.1634 - mae: 1.8019 - mse: 6.1634 - val_loss: 9.6836 - val_mae: 2.3618 - val_mse: 9.6836
Epoch 71/100
354/354 [==============================] - 0s 169us/sample - loss: 6.0410 - mae: 1.7080 - mse: 6.0410 - val_loss: 9.8525 - val_mae: 2.3718 - val_mse: 9.8525
Epoch 72/100
354/354 [==============================] - 0s 166us/sample - loss: 5.7556 - mae: 1.7068 - mse: 5.7556 - val_loss: 11.4228 - val_mae: 2.4962 - val_mse: 11.4228
Epoch 73/100
354/354 [==============================] - 0s 176us/sample - loss: 5.8854 - mae: 1.7138 - mse: 5.8854 - val_loss: 9.8943 - val_mae: 2.4214 - val_mse: 9.8943
Epoch 74/100
354/354 [==============================] - 0s 177us/sample - loss: 5.6033 - mae: 1.6994 - mse: 5.6033 - val_loss: 10.2695 - val_mae: 2.3981 - val_mse: 10.2695
Epoch 75/100
354/354 [==============================] - 0s 173us/sample - loss: 5.7909 - mae: 1.6973 - mse: 5.7909 - val_loss: 10.0138 - val_mae: 2.3440 - val_mse: 10.0138
Epoch 76/100
354/354 [==============================] - 0s 171us/sample - loss: 5.4470 - mae: 1.6519 - mse: 5.4470 - val_loss: 9.7148 - val_mae: 2.4004 - val_mse: 9.7148
Epoch 77/100
354/354 [==============================] - 0s 176us/sample - loss: 5.6775 - mae: 1.6463 - mse: 5.6775 - val_loss: 10.6783 - val_mae: 2.3670 - val_mse: 10.6783
Epoch 78/100
354/354 [==============================] - 0s 172us/sample - loss: 5.4289 - mae: 1.7021 - mse: 5.4289 - val_loss: 10.2150 - val_mae: 2.3861 - val_mse: 10.2150
Epoch 79/100
354/354 [==============================] - 0s 166us/sample - loss: 5.4991 - mae: 1.6477 - mse: 5.4991 - val_loss: 9.6550 - val_mae: 2.3681 - val_mse: 9.6550
Epoch 80/100
354/354 [==============================] - 0s 176us/sample - loss: 5.3646 - mae: 1.6555 - mse: 5.3646 - val_loss: 11.0607 - val_mae: 2.4424 - val_mse: 11.0607
Epoch 81/100
354/354 [==============================] - 0s 174us/sample - loss: 5.3874 - mae: 1.6344 - mse: 5.3874 - val_loss: 11.2996 - val_mae: 2.6303 - val_mse: 11.2996
Epoch 82/100
354/354 [==============================] - 0s 167us/sample - loss: 5.3116 - mae: 1.6345 - mse: 5.3116 - val_loss: 10.2543 - val_mae: 2.3943 - val_mse: 10.2543
Epoch 83/100
354/354 [==============================] - 0s 166us/sample - loss: 5.1442 - mae: 1.6227 - mse: 5.1442 - val_loss: 10.5314 - val_mae: 2.3998 - val_mse: 10.5314
Epoch 84/100
354/354 [==============================] - 0s 171us/sample - loss: 5.2872 - mae: 1.6288 - mse: 5.2872 - val_loss: 9.8682 - val_mae: 2.3268 - val_mse: 9.8682
Epoch 85/100
354/354 [==============================] - 0s 170us/sample - loss: 5.1584 - mae: 1.6282 - mse: 5.1584 - val_loss: 10.2676 - val_mae: 2.4443 - val_mse: 10.2676
Epoch 86/100
354/354 [==============================] - 0s 173us/sample - loss: 5.0609 - mae: 1.6078 - mse: 5.0609 - val_loss: 10.0901 - val_mae: 2.4020 - val_mse: 10.0901
Epoch 87/100
354/354 [==============================] - 0s 163us/sample - loss: 5.1753 - mae: 1.6148 - mse: 5.1753 - val_loss: 10.7763 - val_mae: 2.3816 - val_mse: 10.7763
Epoch 88/100
354/354 [==============================] - 0s 169us/sample - loss: 5.0408 - mae: 1.6055 - mse: 5.0408 - val_loss: 10.1056 - val_mae: 2.3234 - val_mse: 10.1056
Epoch 89/100
354/354 [==============================] - 0s 173us/sample - loss: 5.0175 - mae: 1.6009 - mse: 5.0175 - val_loss: 9.6620 - val_mae: 2.3334 - val_mse: 9.6620
Epoch 90/100
354/354 [==============================] - 0s 173us/sample - loss: 4.7522 - mae: 1.5615 - mse: 4.7522 - val_loss: 9.8084 - val_mae: 2.3036 - val_mse: 9.8084
Epoch 91/100
354/354 [==============================] - 0s 169us/sample - loss: 4.8323 - mae: 1.5873 - mse: 4.8323 - val_loss: 10.7285 - val_mae: 2.4886 - val_mse: 10.7285
Epoch 92/100
354/354 [==============================] - 0s 165us/sample - loss: 4.8179 - mae: 1.5678 - mse: 4.8179 - val_loss: 10.1033 - val_mae: 2.3372 - val_mse: 10.1033
Epoch 93/100
354/354 [==============================] - 0s 168us/sample - loss: 4.7970 - mae: 1.5422 - mse: 4.7970 - val_loss: 9.8511 - val_mae: 2.3521 - val_mse: 9.8511
Epoch 94/100
354/354 [==============================] - 0s 180us/sample - loss: 4.7676 - mae: 1.5674 - mse: 4.7676 - val_loss: 10.1749 - val_mae: 2.4087 - val_mse: 10.1749
Epoch 95/100
354/354 [==============================] - 0s 170us/sample - loss: 4.7223 - mae: 1.5431 - mse: 4.7222 - val_loss: 10.2481 - val_mae: 2.3268 - val_mse: 10.2481
Epoch 96/100
354/354 [==============================] - 0s 164us/sample - loss: 4.6685 - mae: 1.5333 - mse: 4.6685 - val_loss: 10.7347 - val_mae: 2.5154 - val_mse: 10.7347
Epoch 97/100
354/354 [==============================] - 0s 177us/sample - loss: 4.5642 - mae: 1.5675 - mse: 4.5642 - val_loss: 11.3132 - val_mae: 2.4601 - val_mse: 11.3132
Epoch 98/100
354/354 [==============================] - 0s 177us/sample - loss: 4.3886 - mae: 1.4906 - mse: 4.3886 - val_loss: 12.2466 - val_mae: 2.7436 - val_mse: 12.2466
Epoch 99/100
354/354 [==============================] - 0s 177us/sample - loss: 4.4689 - mae: 1.5368 - mse: 4.4689 - val_loss: 10.4188 - val_mae: 2.3596 - val_mse: 10.4188
Epoch 100/100
354/354 [==============================] - 0s 168us/sample - loss: 4.6496 - mae: 1.5348 - mse: 4.6496 - val_loss: 10.0829 - val_mae: 2.3822 - val_mse: 10.0829
%% Cell type:markdown id: tags:
## Step 6 - Evaluate
### 6.1 - Model evaluation
MAE = Mean Absolute Error (between the labels and predictions)
A MAE equal to 3 represents an average prediction error of $3k.
%% Cell type:code id: tags:
``` python
score = model.evaluate(x_test, y_test, verbose=0)

print('x_test / loss : {:5.4f}'.format(score[0]))
print('x_test / mae : {:5.4f}'.format(score[1]))
print('x_test / mse : {:5.4f}'.format(score[2]))
```
%% Output
x_test / loss : 10.0829
x_test / mae : 2.3822
x_test / mse : 10.0829
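%% Cell type:markdown id: tags:
As a sanity check, the MAE reported by `evaluate()` can be recomputed by hand from the predictions ; a minimal sketch :
%% Cell type:code id: tags:
``` python
# MAE = mean of the absolute differences between labels and predictions
y_pred = model.predict(x_test).reshape(-1)
print('Recomputed MAE : {:5.4f}'.format(np.mean(np.abs(y_test - y_pred))))
```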
%% Cell type:markdown id: tags:
### 6.2 - Training history
What was the best result during our training ?
%% Cell type:code id: tags:
``` python
df=pd.DataFrame(data=history.history)
df.describe()
```
%% Output
             loss         mae         mse    val_loss     val_mae     val_mse
count  100.000000  100.000000  100.000000  100.000000  100.000000  100.000000
mean    19.466892    2.462477   19.466893   18.670107    2.852570   18.670107
std     64.483863    2.592690   64.483872   48.257937    2.039701   48.257935
min      4.388600    1.490624    4.388600    9.655048    2.303586    9.655047
25%      5.658976    1.698877    5.658976   10.269067    2.393491   10.269066
50%      7.713175    1.945081    7.713175   10.750849    2.469115   10.750849
75%     10.770471    2.242925   10.770470   12.249026    2.610316   12.249027
max    536.084498   21.333506  536.084595  439.656211   19.319771  439.656189
%% Cell type:code id: tags:
``` python
print("min( val_mae ) : {:.4f}".format( min(history.history["val_mae"]) ) )
```
%% Output
min( val_mae ) : 2.3036
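%% Cell type:markdown id: tags:
The history only tells us when the best score occurred ; it does not keep the corresponding weights. A sketch, assuming we retrain with Keras callbacks to keep the best model (the `./run/best_model.h5` path is illustrative) :
%% Cell type:code id: tags:
``` python
# Hypothetical retraining that saves / restores the weights of the best epoch
callbacks = [ keras.callbacks.ModelCheckpoint('./run/best_model.h5',
                                              monitor='val_mae', save_best_only=True),
              keras.callbacks.EarlyStopping(monitor='val_mae', patience=20,
                                            restore_best_weights=True) ]
model   = get_model_v1( (13,) )
history = model.fit(x_train, y_train,
                    epochs=100, batch_size=10, verbose=0,
                    validation_data=(x_test, y_test),
                    callbacks=callbacks)
```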
%% Cell type:code id: tags:
``` python
ooo.plot_history(history, plot={'MSE' :['mse', 'val_mse'],
                                'MAE' :['mae', 'val_mae'],
                                'LOSS':['loss','val_loss']})
```
%% Output
%% Cell type:markdown id: tags:
## Step 7 - Make a prediction
%% Cell type:code id: tags:
``` python
my_data = [ 1.26425925, -0.48522739,  1.0436489 , -0.23112788,  1.37120745,
           -2.14308942,  1.13489104, -1.06802005,  1.71189006,  1.57042287,
            0.77859951,  0.14769795,  2.7585581 ]
real_price = 10.4

my_data=np.array(my_data).reshape(1,13)
```
%% Cell type:code id: tags:
``` python
predictions = model.predict( my_data )
print("Prediction : {:.2f} K$".format(predictions[0][0]))
print("Reality    : {:.2f} K$".format(real_price))
```
%% Output
Prediction : 9.70 K$
Reality    : 10.40 K$
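%% Cell type:markdown id: tags:
Note that `my_data` above is already normalized. For a raw house description, we would first apply the training statistics of step 3.2 ; a sketch, with illustrative feature values :
%% Cell type:code id: tags:
``` python
# Hypothetical raw (un-normalized) sample, in the same column order as x_train
raw   = [0.2, 0.0, 8.1, 0.0, 0.55, 6.2, 60.0, 4.0, 4.0, 300.0, 18.0, 390.0, 12.0]
x_new = (np.array(raw) - np.array(mean)) / np.array(std)   # train mean/std from step 3.2
print(model.predict( x_new.reshape(1,13) ))
```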
%% Cell type:markdown id: tags:
---
<img width="80px" src="../fidle/img/00-Fidle-logo-01.svg"></img>
<!-- ![](fidle/img/00-Fidle-titre-01_m.png) -->
[<img width="600px" src="fidle/img/00-Fidle-titre-01.svg"></img>](#) [<img width="600px" src="fidle/img/00-Fidle-titre-01.svg"></img>](#)
## A propos ## A propos
- Understand the **academic computing environments** Tier-2 (meso) and/or Tier-1 (national)

## Course materials
**[<img width="50px" src="fidle/img/00-Fidle-pdf.svg"></img> Get the course slides](https://cloud.univ-grenoble-alpes.fr/index.php/s/z7XZA36xKkMcaTS)**
<!-- ![pdf](fidle/img/00-Fidle-pdf.png) -->

Useful information is also available in the [wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/home)

## Jupyter notebooks
[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/git/https%3A%2F%2Fgricad-gitlab.univ-grenoble-alpes.fr%2Ftalks%2Fdeeplearning.git/master?urlpath=lab/tree/index.ipynb)
## Installation

To run these examples, you need a Python environment with the following packages :
- Python >3.5
- Numpy
- Tensorflow 2.0
- Scikit-image
- Scikit-learn
- Matplotlib
- Seaborn
- pyplot
For this, you can use the [Anaconda](https://www.anaconda.com) distribution :

1. Install Anaconda : https://www.anaconda.com/distribution
2. Install the Fidle conda environment : `# conda env create -f environment.yml`
To manage conda environments, see [here](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#)

To start Jupyter Lab on your machine or on a GRICAD cluster, see the Wiki.
## Licence
\[en\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
\[Fr\] Attribution - Pas d'Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode).
See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#).
\ No newline at end of file
----
[<img width="80px" src="fidle/img/00-Fidle-logo-01.svg"></img>](#)
\ No newline at end of file