Commit 98fccb2d authored by Jean-Luc Parouty

Update notebooks for continuous integration

parent 92bf2558
%% Cell type:code id: tags:

``` python
from IPython.display import display,Markdown
display(Markdown(open('README.md', 'r').read()))
#
# This README is visible under Jupyter Lab! :-)
```
%% Output
<a name="top"></a>
[<img width="600px" src="fidle/img/00-Fidle-titre-01.svg"></img>](#top)
<!-- --------------------------------------------------- -->
<!-- To correctly view this README under Jupyter Lab -->
<!-- Open the notebook: README.ipynb! -->
<!-- --------------------------------------------------- -->
## About
This repository contains all the documents and links of the **Fidle Training**.
Fidle (for Formation Introduction au Deep Learning) is a 2-day training session
co-organized by the Formation Permanente CNRS and the SARI and DEVLOG networks.
The objectives of this training are:
- Understand the **basics of Deep Learning** neural networks
- Develop a **first experience** through simple and representative examples
- Understand **Tensorflow/Keras** and **Jupyter Lab** technologies
- Become familiar with the **academic computing environments** (Tier-2 or Tier-1) with powerful GPUs
For more information, you can contact us at:
[<img width="200px" style="vertical-align:middle" src="fidle/img/00-Mail_contact.svg"></img>](#top)
Current Version: <!-- VERSION_BEGIN -->
0.6.1 DEV
<!-- VERSION_END -->
## Course materials
| | | |
|:--:|:--:|:--:|
| **[<img width="50px" src="fidle/img/00-Fidle-pdf.svg"></img><br>Course slides](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>The course in pdf format<br>(12 Mo)| **[<img width="50px" src="fidle/img/00-Notebooks.svg"></img><br>Notebooks](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/archive/master/fidle-master.zip)**<br> &nbsp;&nbsp;&nbsp;&nbsp;Get a Zip or clone this repository &nbsp;&nbsp;&nbsp;&nbsp;<br>(10 Mo)| **[<img width="50px" src="fidle/img/00-Datasets-tar.svg"></img><br>Datasets](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>All the needed datasets<br>(1.2 Go)|
See **[How to get and install](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)** to get and set up these notebooks and datasets.
## Jupyter notebooks
<!-- INDEX_BEGIN -->
| | |
|--|--|
|LINR1| [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb)<br>Direct determination of linear regression |
|GRAD1| [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb)<br>An example of gradient descent in the simple case of a linear regression.|
|POLR1| [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb)<br>Illustration of the problem of complexity with the polynomial regression|
|LOGR1| [Logistic regression, in pure Tensorflow](LinearReg/04-Logistic-Regression.ipynb)<br>Logistic Regression with Mini-Batch Gradient Descent using pure TensorFlow.|
|PER57| [Perceptron Model 1957](IRIS/01-Simple-Perceptron.ipynb)<br>A simple perceptron, with the IRIS dataset.|
|BHP1| [Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb)<br>A simple regression with a Dense Neural Network (DNN) - BHPD dataset|
|BHP2| [Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb)<br>More advanced example of DNN network code - BHPD dataset|
|MNIST1| [Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb)<br>Example of classification with a fully connected neural network|
|GTS1| [CNN with GTSRB dataset - Data analysis and preparation](GTSRB/01-Preparation-of-data.ipynb)<br>Episode 1: Data analysis and creation of a usable dataset|
|GTS2| [CNN with GTSRB dataset - First convolutions](GTSRB/02-First-convolutions.ipynb)<br>Episode 2: First convolutions and first results|
|GTS3| [CNN with GTSRB dataset - Monitoring](GTSRB/03-Tracking-and-visualizing.ipynb)<br>Episode 3: Monitoring and analysing training, managing checkpoints|
|GTS4| [CNN with GTSRB dataset - Data augmentation](GTSRB/04-Data-augmentation.ipynb)<br>Episode 4: Improving the results with data augmentation|
|GTS5| [CNN with GTSRB dataset - Full convolutions](GTSRB/05-Full-convolutions.ipynb)<br>Episode 5: A lot of models, a lot of datasets and a lot of results.|
|GTS6| [CNN with GTSRB dataset - Full convolutions as a batch](GTSRB/06-Notebook-as-a-batch.ipynb)<br>Episode 6: Run the full convolutions notebook as a batch|
|GTS7| [CNN with GTSRB dataset - Show reports](GTSRB/07-Show-report.ipynb)<br>Episode 7: Displaying the reports of the different jobs|
|TSB1| [Tensorboard with/from Jupyter](GTSRB/99-Scripts-Tensorboard.ipynb)<br>4 ways to use Tensorboard from the Jupyter environment|
|IMDB1| [Text embedding with IMDB](IMDB/01-Embedding-Keras.ipynb)<br>A classic example of word embedding for text classification (sentiment analysis)|
|IMDB2| [Text embedding with IMDB - Reloaded](IMDB/02-Prediction.ipynb)<br>Example of reusing a previously saved model|
|IMDB3| [Text embedding/LSTM model with IMDB](IMDB/03-LSTM-Keras.ipynb)<br>Still the same problem, but with a network combining embedding and LSTM|
|SYNOP1| [Time series with RNN - Preparation of data](SYNOP/01-Preparation-of-data.ipynb)<br>Episode 1: Data analysis and creation of a usable dataset|
|SYNOP2| [Time series with RNN - Try a prediction](SYNOP/02-First-predictions.ipynb)<br>Episode 2: Training session and first predictions|
|SYNOP3| [Time series with RNN - 12h predictions](SYNOP/03-12h-predictions.ipynb)<br>Episode 3: Attempt to predict in the longer term|
|VAE1| [Variational AutoEncoder (VAE) with MNIST](VAE/01-VAE-with-MNIST.nbconvert.ipynb)<br>Episode 1: Model construction and training|
|VAE2| [Variational AutoEncoder (VAE) with MNIST - Analysis](VAE/02-VAE-with-MNIST-post.ipynb)<br>Episode 2: Exploring our latent space|
|VAE3| [About the CelebA dataset](VAE/03-About-CelebA.ipynb)<br>Episode 3: About the CelebA dataset, a more fun dataset ;-)|
|VAE4| [Preparation of the CelebA dataset](VAE/04-Prepare-CelebA-datasets.ipynb)<br>Episode 4: Preparation of a clustered dataset, batchable|
|VAE5| [Checking the clustered CelebA dataset](VAE/05-Check-CelebA.ipynb)<br>Episode 5: Checking the clustered dataset|
|VAE6| [Variational AutoEncoder (VAE) with CelebA (small)](VAE/06-VAE-with-CelebA-s.nbconvert.ipynb)<br>Episode 6: Variational AutoEncoder (VAE) with CelebA (small res.)|
|VAE7| [Variational AutoEncoder (VAE) with CelebA (medium)](VAE/07-VAE-with-CelebA-m.nbconvert.ipynb)<br>Episode 7: Variational AutoEncoder (VAE) with CelebA (medium res.)|
|VAE8| [Variational AutoEncoder (VAE) with CelebA - Analysis](VAE/08-VAE-withCelebA-post.ipynb)<br>Episode 8: Exploring the latent space of our trained models|
|ACTF1| [Activation functions](Misc/Activation-Functions.ipynb)<br>Some activation functions, with their derivatives.|
|NP1| [A short introduction to Numpy](Misc/Numpy.ipynb)<br>Numpy is an essential tool for scientific Python.|
<!-- INDEX_END -->
## Installation
A procedure for **configuring** and **starting Jupyter** is available in the **[Wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)**.
## Licence
[<img width="100px" src="fidle/img/00-fidle-CC BY-NC-SA.svg"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/)
\[en\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
\[Fr\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode).
See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#).
----
[<img width="80px" src="fidle/img/00-Fidle-logo-01.svg"></img>](#top)
......
...@@ -36,52 +36,80 @@
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<style type=\"text/css\" >\n", "<style type=\"text/css\" >\n",
" #T_51149260_3ee0_11eb_ab9c_bf552d03deff td {\n", " #T_5accec2e_3f19_11eb_8e56_19607a97f796 td {\n",
" font-size: 110%;\n", " font-size: 110%;\n",
" text-align: left;\n", " text-align: left;\n",
" } #T_51149260_3ee0_11eb_ab9c_bf552d03deff th {\n", " } #T_5accec2e_3f19_11eb_8e56_19607a97f796 th {\n",
" font-size: 110%;\n", " font-size: 110%;\n",
" text-align: left;\n", " text-align: left;\n",
" }</style><table id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deff\" ><thead> <tr> <th class=\"col_heading level0 col0\" >id</th> <th class=\"col_heading level0 col1\" >name</th> <th class=\"col_heading level0 col2\" >start</th> <th class=\"col_heading level0 col3\" >end</th> <th class=\"col_heading level0 col4\" >duration</th> </tr></thead><tbody>\n", " }</style><table id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796\" ><thead> <tr> <th class=\"col_heading level0 col0\" >id</th> <th class=\"col_heading level0 col1\" >repo</th> <th class=\"col_heading level0 col2\" >name</th> <th class=\"col_heading level0 col3\" >start</th> <th class=\"col_heading level0 col4\" >end</th> <th class=\"col_heading level0 col5\" >duration</th> </tr></thead><tbody>\n",
" <tr>\n", " <tr>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow0_col0\" class=\"data row0 col0\" ><a href=\"../LinearReg/01-Linear-Regression.ipynb\">LINR1</a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row0_col0\" class=\"data row0 col0\" ><a href=\"../LinearReg/01-Linear-Regression.ipynb\">LINR1</a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow0_col1\" class=\"data row0 col1\" ><a href=\"../LinearReg/01-Linear-Regression.ipynb\"><b>01-Linear-Regression.ipynb</b></a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row0_col1\" class=\"data row0 col1\" >LinearReg</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow0_col2\" class=\"data row0 col2\" >Tuesday 15 December 2020, 14:04:04</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row0_col2\" class=\"data row0 col2\" ><a href=\"../LinearReg/01-Linear-Regression.ipynb\"><b>01-Linear-Regression.ipynb</b></a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow0_col3\" class=\"data row0 col3\" >Tuesday 15 December 2020, 14:04:04</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row0_col3\" class=\"data row0 col3\" >Tuesday 15 December 2020, 14:04:04</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow0_col4\" class=\"data row0 col4\" >00:00:00 295ms</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row0_col4\" class=\"data row0 col4\" >Tuesday 15 December 2020, 14:04:04</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row0_col5\" class=\"data row0 col5\" >00:00:00 295ms</td>\n",
" </tr>\n", " </tr>\n",
" <tr>\n", " <tr>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow1_col0\" class=\"data row1 col0\" ><a href=\"../LinearReg/02-Gradient-descent.ipynb\">GRAD1</a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row1_col0\" class=\"data row1 col0\" ><a href=\"../LinearReg/02-Gradient-descent.ipynb\">GRAD1</a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow1_col1\" class=\"data row1 col1\" ><a href=\"../LinearReg/02-Gradient-descent.ipynb\"><b>02-Gradient-descent.ipynb</b></a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row1_col1\" class=\"data row1 col1\" >LinearReg</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow1_col2\" class=\"data row1 col2\" >Tuesday 15 December 2020, 15:05:11</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row1_col2\" class=\"data row1 col2\" ><a href=\"../LinearReg/02-Gradient-descent.ipynb\"><b>02-Gradient-descent.ipynb</b></a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow1_col3\" class=\"data row1 col3\" >Tuesday 15 December 2020, 15:05:14</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row1_col3\" class=\"data row1 col3\" >Tuesday 15 December 2020, 15:05:11</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow1_col4\" class=\"data row1 col4\" >00:00:03 120ms</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row1_col4\" class=\"data row1 col4\" >Tuesday 15 December 2020, 15:05:14</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row1_col5\" class=\"data row1 col5\" >00:00:03 120ms</td>\n",
" </tr>\n", " </tr>\n",
" <tr>\n", " <tr>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow2_col0\" class=\"data row2 col0\" ><a href=\"../LinearReg/03-Polynomial-Regression.ipynb\">POLR1</a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row2_col0\" class=\"data row2 col0\" ><a href=\"../LinearReg/03-Polynomial-Regression.ipynb\">POLR1</a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow2_col1\" class=\"data row2 col1\" ><a href=\"../LinearReg/03-Polynomial-Regression.ipynb\"><b>03-Polynomial-Regression.ipynb</b></a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row2_col1\" class=\"data row2 col1\" >LinearReg</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow2_col2\" class=\"data row2 col2\" >Tuesday 15 December 2020, 15:05:27</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row2_col2\" class=\"data row2 col2\" ><a href=\"../LinearReg/03-Polynomial-Regression.ipynb\"><b>03-Polynomial-Regression.ipynb</b></a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow2_col3\" class=\"data row2 col3\" >Tuesday 15 December 2020, 15:05:28</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row2_col3\" class=\"data row2 col3\" >Tuesday 15 December 2020, 15:05:27</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow2_col4\" class=\"data row2 col4\" >00:00:01 686ms</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row2_col4\" class=\"data row2 col4\" >Tuesday 15 December 2020, 15:05:28</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row2_col5\" class=\"data row2 col5\" >00:00:01 686ms</td>\n",
" </tr>\n", " </tr>\n",
" <tr>\n", " <tr>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow3_col0\" class=\"data row3 col0\" ><a href=\"../LinearReg/04-Logistic-Regression.ipynb\">LOGR1</a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row3_col0\" class=\"data row3 col0\" ><a href=\"../LinearReg/04-Logistic-Regression.ipynb\">LOGR1</a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow3_col1\" class=\"data row3 col1\" ><a href=\"../LinearReg/04-Logistic-Regression.ipynb\"><b>04-Logistic-Regression.ipynb</b></a></td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row3_col1\" class=\"data row3 col1\" >LinearReg</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow3_col2\" class=\"data row3 col2\" >Tuesday 15 December 2020, 15:05:42</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row3_col2\" class=\"data row3 col2\" ><a href=\"../LinearReg/04-Logistic-Regression.ipynb\"><b>04-Logistic-Regression.ipynb</b></a></td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow3_col3\" class=\"data row3 col3\" >Tuesday 15 December 2020, 15:06:44</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row3_col3\" class=\"data row3 col3\" >Tuesday 15 December 2020, 15:05:42</td>\n",
" <td id=\"T_51149260_3ee0_11eb_ab9c_bf552d03deffrow3_col4\" class=\"data row3 col4\" >00:01:02 112ms</td>\n", " <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row3_col4\" class=\"data row3 col4\" >Tuesday 15 December 2020, 15:06:44</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row3_col5\" class=\"data row3 col5\" >00:01:02 112ms</td>\n",
" </tr>\n",
" <tr>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row4_col0\" class=\"data row4 col0\" ><a href=\"../IRIS/01-Simple-Perceptron.ipynb\">PER57</a></td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row4_col1\" class=\"data row4 col1\" >IRIS</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row4_col2\" class=\"data row4 col2\" ><a href=\"../IRIS/01-Simple-Perceptron.ipynb\"><b>01-Simple-Perceptron.ipynb</b></a></td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row4_col3\" class=\"data row4 col3\" >Tuesday 15 December 2020, 21:49:41</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row4_col4\" class=\"data row4 col4\" >Tuesday 15 December 2020, 21:49:41</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row4_col5\" class=\"data row4 col5\" >00:00:00 203ms</td>\n",
" </tr>\n",
" <tr>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row5_col0\" class=\"data row5 col0\" ><a href=\"../BHPD/01-DNN-Regression.ipynb\">BHP1</a></td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row5_col1\" class=\"data row5 col1\" >BHPD</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row5_col2\" class=\"data row5 col2\" ><a href=\"../BHPD/01-DNN-Regression.ipynb\"><b>01-DNN-Regression.ipynb</b></a></td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row5_col3\" class=\"data row5 col3\" >Tuesday 15 December 2020, 21:51:22</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row5_col4\" class=\"data row5 col4\" >Tuesday 15 December 2020, 21:51:32</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row5_col5\" class=\"data row5 col5\" >00:00:10 080ms</td>\n",
" </tr>\n",
" <tr>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row6_col0\" class=\"data row6 col0\" ><a href=\"../BHPD/02-DNN-Regression-Premium.ipynb\">BHP2</a></td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row6_col1\" class=\"data row6 col1\" >BHPD</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row6_col2\" class=\"data row6 col2\" ><a href=\"../BHPD/02-DNN-Regression-Premium.ipynb\"><b>02-DNN-Regression-Premium.ipynb</b></a></td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row6_col3\" class=\"data row6 col3\" >Tuesday 15 December 2020, 22:05:15</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row6_col4\" class=\"data row6 col4\" >Tuesday 15 December 2020, 22:05:26</td>\n",
" <td id=\"T_5accec2e_3f19_11eb_8e56_19607a97f796row6_col5\" class=\"data row6 col5\" >00:00:11 601ms</td>\n",
" </tr>\n", " </tr>\n",
" </tbody></table>" " </tbody></table>"
], ],
"text/plain": [ "text/plain": [
"<pandas.io.formats.style.Styler at 0x7f6d1d0dd050>" "<pandas.io.formats.style.Styler at 0x7f4b4996e6d0>"
] ]
}, },
"metadata": {}, "metadata": {},
......
...@@ -129,6 +129,8 @@ def tag(tag, text, document):
def get_ci_report():
+    columns=['id','repo','name','start','end','duration']
    # ---- Load catalog (notebooks descriptions)
    #
    with open(config.CATALOG_FILE) as fp:
...@@ -140,7 +142,7 @@ def get_ci_report():
        dict_finished = json.load( infile )
    if dict_finished == {}:
-        df=pd.DataFrame({}, columns=['id','name','start','end','duration'])
+        df=pd.DataFrame({}, columns=columns)
    else:
        df=pd.DataFrame(dict_finished).transpose()
        df.reset_index(inplace=True)
...@@ -148,7 +150,7 @@ def get_ci_report():
    # ---- Add usefull html columns
    #
-    df['name']=''
+    df[ ['name','repo'] ]=''
    for index, row in df.iterrows():
        id = row['id']
...@@ -158,8 +160,8 @@ def get_ci_report():
        description = catalog[id]['description']
        row['id'] = f'<a href="../{dirname}/{basename}">{id}</a>'
        row['name'] = f'<a href="../{dirname}/{basename}"><b>{basename}</b></a>'
+        row['repo'] = dirname
-        columns=['id','name','start','end','duration']
    df=df[columns]
    # ---- Add styles to be nice
...@@ -171,8 +173,6 @@ def get_ci_report():
    def still_pending(v):
        return 'background-color: OrangeRed; color:white' if v == 'Unfinished...' else ''
-    columns=['id','name','start','end','duration']
    output = df[columns].style.set_table_styles(styles).hide_index().applymap(still_pending)
    # ---- Get mail report
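To make this change easier to follow outside the full source file, here is a minimal, self-contained sketch of the updated report-building logic with the new `repo` column. The file paths, the catalog field names (`dirname`, `basename`) and the renaming of the transposed index to `id` are assumptions for illustration (those parts are elided from the diff); writes go through `df.at` rather than through the `iterrows()` row objects so the write-back is explicit.

``` python
# Hedged sketch only: the paths and catalog field names below are assumptions,
# not the project's actual values.
import json
import pandas as pd

CATALOG_FILE  = './fidle/logs/catalog.json'     # assumed location
FINISHED_FILE = './fidle/logs/finished.json'    # assumed location

def get_ci_report():
    columns = ['id', 'repo', 'name', 'start', 'end', 'duration']

    # ---- Load catalog (notebook descriptions) and the finished-jobs file
    with open(CATALOG_FILE) as fp:
        catalog = json.load(fp)
    with open(FINISHED_FILE) as infile:
        dict_finished = json.load(infile)

    # ---- One row per executed notebook, keyed by notebook id
    if dict_finished == {}:
        df = pd.DataFrame({}, columns=columns)
    else:
        df = pd.DataFrame(dict_finished).transpose()
        df.reset_index(inplace=True)
        df.rename(columns={'index': 'id'}, inplace=True)   # assumed: ids were the dict keys

    # ---- Add the html columns, including the new 'repo' column
    df['name'] = ''
    df['repo'] = ''
    for index, row in df.iterrows():
        id       = row['id']
        dirname  = catalog[id]['dirname']       # assumed field name
        basename = catalog[id]['basename']      # assumed field name
        df.at[index, 'id']   = f'<a href="../{dirname}/{basename}">{id}</a>'
        df.at[index, 'name'] = f'<a href="../{dirname}/{basename}"><b>{basename}</b></a>'
        df.at[index, 'repo'] = dirname
    df = df[columns]

    # ---- Style: larger left-aligned cells, unfinished runs highlighted
    styles = [dict(selector='td', props=[('font-size', '110%'), ('text-align', 'left')]),
              dict(selector='th', props=[('font-size', '110%'), ('text-align', 'left')])]

    def still_pending(v):
        return 'background-color: OrangeRed; color:white' if v == 'Unfinished...' else ''

    # hide_index() matches the pandas of this commit (late 2020); recent pandas uses hide()
    return df[columns].style.set_table_styles(styles).hide_index().applymap(still_pending)
```

The original loop assigns through the `row` objects returned by `iterrows()`, which evidently works here but is not guaranteed by pandas; `df.at` makes the intent unambiguous.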
......
...@@ -24,43 +24,71 @@
<body>
<br>Hi,
<p>Below is the result of the continuous integration tests of the Fidle project:</p>
<div class="header"><b>Report date :</b> Tuesday 15 December 2020, 22:06:09</div>
<div class="result">
<style type="text/css" >
#T_5acb85fa_3f19_11eb_8e56_19607a97f796 td {
font-size: 110%;
text-align: left;
} #T_5acb85fa_3f19_11eb_8e56_19607a97f796 th {
font-size: 110%;
text-align: left;
}</style><table id="T_5acb85fa_3f19_11eb_8e56_19607a97f796" ><thead> <tr> <th class="col_heading level0 col0" >id</th> <th class="col_heading level0 col1" >repo</th> <th class="col_heading level0 col2" >name</th> <th class="col_heading level0 col3" >start</th> <th class="col_heading level0 col4" >end</th> <th class="col_heading level0 col5" >duration</th> </tr></thead><tbody>
<tr>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row0_col0" class="data row0 col0" ><a href="../LinearReg/01-Linear-Regression.ipynb">LINR1</a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row0_col1" class="data row0 col1" >LinearReg</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row0_col2" class="data row0 col2" ><a href="../LinearReg/01-Linear-Regression.ipynb"><b>01-Linear-Regression.ipynb</b></a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row0_col3" class="data row0 col3" >Tuesday 15 December 2020, 14:04:04</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row0_col4" class="data row0 col4" >Tuesday 15 December 2020, 14:04:04</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row0_col5" class="data row0 col5" >00:00:00 295ms</td>
</tr>
<tr>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row1_col0" class="data row1 col0" ><a href="../LinearReg/02-Gradient-descent.ipynb">GRAD1</a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row1_col1" class="data row1 col1" >LinearReg</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row1_col2" class="data row1 col2" ><a href="../LinearReg/02-Gradient-descent.ipynb"><b>02-Gradient-descent.ipynb</b></a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row1_col3" class="data row1 col3" >Tuesday 15 December 2020, 15:05:11</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row1_col4" class="data row1 col4" >Tuesday 15 December 2020, 15:05:14</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row1_col5" class="data row1 col5" >00:00:03 120ms</td>
</tr>
<tr>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row2_col0" class="data row2 col0" ><a href="../LinearReg/03-Polynomial-Regression.ipynb">POLR1</a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row2_col1" class="data row2 col1" >LinearReg</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row2_col2" class="data row2 col2" ><a href="../LinearReg/03-Polynomial-Regression.ipynb"><b>03-Polynomial-Regression.ipynb</b></a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row2_col3" class="data row2 col3" >Tuesday 15 December 2020, 15:05:27</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row2_col4" class="data row2 col4" >Tuesday 15 December 2020, 15:05:28</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row2_col5" class="data row2 col5" >00:00:01 686ms</td>
</tr>
<tr>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row3_col0" class="data row3 col0" ><a href="../LinearReg/04-Logistic-Regression.ipynb">LOGR1</a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row3_col1" class="data row3 col1" >LinearReg</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row3_col2" class="data row3 col2" ><a href="../LinearReg/04-Logistic-Regression.ipynb"><b>04-Logistic-Regression.ipynb</b></a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row3_col3" class="data row3 col3" >Tuesday 15 December 2020, 15:05:42</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row3_col4" class="data row3 col4" >Tuesday 15 December 2020, 15:06:44</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row3_col5" class="data row3 col5" >00:01:02 112ms</td>
</tr>
<tr>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row4_col0" class="data row4 col0" ><a href="../IRIS/01-Simple-Perceptron.ipynb">PER57</a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row4_col1" class="data row4 col1" >IRIS</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row4_col2" class="data row4 col2" ><a href="../IRIS/01-Simple-Perceptron.ipynb"><b>01-Simple-Perceptron.ipynb</b></a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row4_col3" class="data row4 col3" >Tuesday 15 December 2020, 21:49:41</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row4_col4" class="data row4 col4" >Tuesday 15 December 2020, 21:49:41</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row4_col5" class="data row4 col5" >00:00:00 203ms</td>
</tr>
<tr>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row5_col0" class="data row5 col0" ><a href="../BHPD/01-DNN-Regression.ipynb">BHP1</a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row5_col1" class="data row5 col1" >BHPD</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row5_col2" class="data row5 col2" ><a href="../BHPD/01-DNN-Regression.ipynb"><b>01-DNN-Regression.ipynb</b></a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row5_col3" class="data row5 col3" >Tuesday 15 December 2020, 21:51:22</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row5_col4" class="data row5 col4" >Tuesday 15 December 2020, 21:51:32</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row5_col5" class="data row5 col5" >00:00:10 080ms</td>
</tr>
<tr>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row6_col0" class="data row6 col0" ><a href="../BHPD/02-DNN-Regression-Premium.ipynb">BHP2</a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row6_col1" class="data row6 col1" >BHPD</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row6_col2" class="data row6 col2" ><a href="../BHPD/02-DNN-Regression-Premium.ipynb"><b>02-DNN-Regression-Premium.ipynb</b></a></td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row6_col3" class="data row6 col3" >Tuesday 15 December 2020, 22:05:15</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row6_col4" class="data row6 col4" >Tuesday 15 December 2020, 22:05:26</td>
<td id="T_5acb85fa_3f19_11eb_8e56_19607a97f796row6_col5" class="data row6 col5" >00:00:11 601ms</td>
</tr>
</tbody></table>
</div>
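The fragment above is the HTML body of the CI mail report (the `# ---- Get mail report` step in the code). Purely as a hypothetical illustration of how such a body could be assembled and sent — the SMTP host, addresses, subject and template below are invented, not the project's actual mailing code:

``` python
# Hypothetical sketch: host, addresses and template are illustrative assumptions.
import smtplib
from email.mime.text import MIMEText

def send_ci_report(styler, report_date,
                   smtp_host='smtp.example.org',           # assumption
                   sender='fidle-ci@example.org',          # assumption
                   recipient='fidle-team@example.org'):    # assumption
    # Styler.render() returns the <style>/<table> fragment seen above
    # (newer pandas exposes the same thing as Styler.to_html()).
    body = f"""
    <html><body>
      <br>Hi,
      <p>Below is the result of the continuous integration tests of the Fidle project:</p>
      <div class="header"><b>Report date :</b> {report_date}</div>
      <div class="result">{styler.render()}</div>
    </body></html>"""

    msg = MIMEText(body, 'html')
    msg['Subject'] = 'Fidle CI report'
    msg['From']    = sender
    msg['To']      = recipient

    # Send the single HTML part over plain SMTP
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```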
......
...@@ -22,5 +22,23 @@
"start": "Tuesday 15 December 2020, 15:05:42",
"end": "Tuesday 15 December 2020, 15:06:44",
"duration": "00:01:02 112ms"
},
"PER57": {
"path": "/home/pjluc/dev/fidle/IRIS",
"start": "Tuesday 15 December 2020, 21:49:41",
"end": "Tuesday 15 December 2020, 21:49:41",
"duration": "00:00:00 203ms"
},
"BHP1": {
"path": "/home/pjluc/dev/fidle/BHPD",
"start": "Tuesday 15 December 2020, 21:51:22",
"end": "Tuesday 15 December 2020, 21:51:32",
"duration": "00:00:10 080ms"
},
"BHP2": {
"path": "/home/pjluc/dev/fidle/BHPD",
"start": "Tuesday 15 December 2020, 22:05:15",
"end": "Tuesday 15 December 2020, 22:05:26",
"duration": "00:00:11 601ms"
}
}
\ No newline at end of file
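For context, each entry above pairs a notebook id with its path, start/end timestamps and a formatted duration. Below is a small hypothetical sketch of how one run could be appended to such a file; the file name and both helper functions are assumptions, and only the entry structure and the date/duration formats follow the JSON shown here.

``` python
# Hedged sketch: FINISHED_FILE, hdelay and record_finished are illustrative names.
import json
import os
from datetime import datetime

FINISHED_FILE = './fidle/logs/finished.json'   # assumed location

def hdelay(sec):
    """Format a duration in seconds as 'HH:MM:SS mmm', e.g. '00:00:10 080ms'."""
    h, rem = divmod(int(sec), 3600)
    m, s   = divmod(rem, 60)
    ms     = int(round((sec - int(sec)) * 1000))
    return f'{h:02d}:{m:02d}:{s:02d} {ms:03d}ms'

def record_finished(notebook_id, path, start, end):
    """Add or update a run entry, mirroring the JSON structure of this commit."""
    fmt   = '%A %d %B %Y, %H:%M:%S'            # e.g. 'Tuesday 15 December 2020, 21:51:22'
    entry = {'path':     path,
             'start':    start.strftime(fmt),
             'end':      end.strftime(fmt),
             'duration': hdelay((end - start).total_seconds())}

    # Load the existing report (if any), update it, write it back
    finished = {}
    if os.path.isfile(FINISHED_FILE):
        with open(FINISHED_FILE) as fp:
            finished = json.load(fp)
    finished[notebook_id] = entry
    with open(FINISHED_FILE, 'w') as fp:
        json.dump(finished, fp, indent=4)

# Example call:
# record_finished('BHP1', '/home/pjluc/dev/fidle/BHPD',
#                 datetime(2020, 12, 15, 21, 51, 22),
#                 datetime(2020, 12, 15, 21, 51, 32))
```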