Commit 52087d40 authored by Jean-Luc Parouty

Update README generator

parent 4c6e3cb6
%% Cell type:code id: tags:
``` python
from IPython.display import display,Markdown
display(Markdown(open('README.md', 'r').read()))
#
# This README is visible under Jupyter Lab! :-)
```
%% Output
<a name="top"></a>
[<img width="600px" src="fidle/img/00-Fidle-titre-01.svg"></img>](#top)
<!-- --------------------------------------------------- -->
<!-- To correctly view this README under Jupyter Lab -->
<!-- Open the notebook: README.ipynb! -->
<!-- --------------------------------------------------- -->
## About
This repository contains all the documents and links of the **Fidle Training**.
Fidle (for Formation Introduction au Deep Learning) is a 2-day training session
co-organized by the Formation Permanente CNRS and the SARI and DEVLOG networks.
The objectives of this training are:
- Understand the **basics of Deep Learning** neural networks
- Develop a **first experience** through simple and representative examples
- Understand **Tensorflow/Keras** and **Jupyter Lab** technologies
- Become familiar with the **academic computing environments** (Tier-2 or Tier-1) with powerful GPUs
For more information, you can contact us at:
[<img width="200px" style="vertical-align:middle" src="fidle/img/00-Mail_contact.svg"></img>](#top)
Current Version : <!-- VERSION_BEGIN -->
0.6.1 DEV
<!-- VERSION_END -->
## Course materials
| | | |
|:--:|:--:|:--:|
| **[<img width="50px" src="fidle/img/00-Fidle-pdf.svg"></img><br>Course slides](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>The course in pdf format<br>(12 Mo)| **[<img width="50px" src="fidle/img/00-Notebooks.svg"></img><br>Notebooks](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/archive/master/fidle-master.zip)**<br> &nbsp;&nbsp;&nbsp;&nbsp;Get a Zip or clone this repository &nbsp;&nbsp;&nbsp;&nbsp;<br>(10 Mo)| **[<img width="50px" src="fidle/img/00-Datasets-tar.svg"></img><br>Datasets](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>All the needed datasets<br>(1.2 Go)|
Have a look at **[how to get and install](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)** these notebooks and datasets.
## Jupyter notebooks
<!-- INDEX_BEGIN -->
| | |
|--|--|
|LINR1| [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb)<br>Direct determination of linear regression |
|GRAD1| [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb)<br>An example of gradient descent in the simple case of a linear regression.|
|POLR1| [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb)<br>Illustration of the problem of complexity with the polynomial regression|
|LOGR1| [Logistic regression, in pure Tensorflow](LinearReg/04-Logistic-Regression.ipynb)<br>Logistic Regression with Mini-Batch Gradient Descent using pure TensorFlow. |
|PER57| [Perceptron Model 1957](IRIS/01-Simple-Perceptron.ipynb)<br>A simple perceptron, with the IRIS dataset.|
|BHP1| [Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb)<br>A Simple regression with a Dense Neural Network (DNN) - BHPD dataset|
|BHP2| [Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb)<br>More advanced example of DNN network code - BHPD dataset|
|MNIST1| [Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb)<br>Example of classification with a fully connected neural network|
|GTS1| [CNN with GTSRB dataset - Data analysis and preparation](GTSRB/01-Preparation-of-data.ipynb)<br>Episode 1 : Data analysis and creation of a usable dataset|
|GTS2| [CNN with GTSRB dataset - First convolutions](GTSRB/02-First-convolutions.ipynb)<br>Episode 2 : First convolutions and first results|
|GTS3| [CNN with GTSRB dataset - Monitoring ](GTSRB/03-Tracking-and-visualizing.ipynb)<br>Episode 3 : Monitoring and analysing training, managing checkpoints|
|GTS4| [CNN with GTSRB dataset - Data augmentation ](GTSRB/04-Data-augmentation.ipynb)<br>Episode 4 : Improving the results with data augmentation|
|GTS5| [CNN with GTSRB dataset - Full convolutions ](GTSRB/05-Full-convolutions.ipynb)<br>Episode 5 : A lot of models, a lot of datasets and a lot of results.|
|GTS6| [CNN with GTSRB dataset - Full convolutions as a batch](GTSRB/06-Notebook-as-a-batch.ipynb)<br>Episode 6 : Run Full convolution notebook as a batch|
|GTS7| [CNN with GTSRB dataset - Show reports](GTSRB/07-Show-report.ipynb)<br>Episode 7 : Displaying the reports of the different jobs|
|TSB1| [Tensorboard with/from Jupyter ](GTSRB/99-Scripts-Tensorboard.ipynb)<br>4 ways to use Tensorboard from the Jupyter environment|
|IMDB1| [Text embedding with IMDB](IMDB/01-Embedding-Keras.ipynb)<br>A very classical example of word embedding for text classification (sentiment analysis)|
|IMDB2| [Text embedding with IMDB - Reloaded](IMDB/02-Prediction.ipynb)<br>Example of reusing a previously saved model|
|IMDB3| [Text embedding/LSTM model with IMDB](IMDB/03-LSTM-Keras.ipynb)<br>Still the same problem, but with a network combining embedding and LSTM|
|SYNOP1| [Time series with RNN - Preparation of data](SYNOP/01-Preparation-of-data.ipynb)<br>Episode 1 : Data analysis and creation of a usable dataset|
|SYNOP2| [Time series with RNN - Try a prediction](SYNOP/02-First-predictions.ipynb)<br>Episode 2 : Training session and first predictions|
|SYNOP3| [Time series with RNN - 12h predictions](SYNOP/03-12h-predictions.ipynb)<br>Episode 3: Attempt to predict in the longer term |
|VAE1| [Variational AutoEncoder (VAE) with MNIST](VAE/01-VAE-with-MNIST.nbconvert.ipynb)<br>Episode 1 : Model construction and Training|
|VAE2| [Variational AutoEncoder (VAE) with MNIST - Analysis](VAE/02-VAE-with-MNIST-post.ipynb)<br>Episode 2 : Exploring our latent space|
|VAE3| [About the CelebA dataset](VAE/03-About-CelebA.ipynb)<br>Episode 3 : About the CelebA dataset, a more fun dataset ;-)|
|VAE4| [Preparation of the CelebA dataset](VAE/04-Prepare-CelebA-datasets.ipynb)<br>Episode 4 : Preparation of a clustered dataset, batchable|
|VAE5| [Checking the clustered CelebA dataset](VAE/05-Check-CelebA.ipynb)<br>Episode 5 : Checking the clustered dataset|
|VAE6| [Variational AutoEncoder (VAE) with CelebA (small)](VAE/06-VAE-with-CelebA-s.nbconvert.ipynb)<br>Episode 6 : Variational AutoEncoder (VAE) with CelebA (small res.)|
|VAE7| [Variational AutoEncoder (VAE) with CelebA (medium)](VAE/07-VAE-with-CelebA-m.nbconvert.ipynb)<br>Episode 7 : Variational AutoEncoder (VAE) with CelebA (medium res.)|
|VAE8| [Variational AutoEncoder (VAE) with CelebA - Analysis](VAE/08-VAE-withCelebA-post.ipynb)<br>Episode 8 : Exploring latent space of our trained models|
|ACTF1| [Activation functions](Misc/Activation-Functions.ipynb)<br>Some activation functions, with their derivatives.|
|NP1| [A short introduction to Numpy](Misc/Numpy.ipynb)<br>Numpy is an essential tool for the Scientific Python.|
<!-- INDEX_END -->
## Installation
A procedure for **configuring** and **starting Jupyter** is available in the **[Wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)**.
## Licence
[<img width="100px" src="fidle/img/00-fidle-CC BY-NC-SA.svg"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/)
\[en\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
\[Fr\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode).
See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#).
----
[<img width="80px" src="fidle/img/00-Fidle-logo-01.svg"></img>](#top)
%% Cell type:code id: tags:
``` python
```
%% Cell type:markdown id: tags:
<img width="800px" src="../fidle/img/00-Fidle-header-01.svg"></img>
## Updating the notebook catalog and the READMEs
- Generate the notebook catalog: `./log/catalog_nb.json`
- Automatically generate the table of contents
- Update `README.md` and `README.ipynb` (the marker mechanism is sketched below)
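The updates rely on HTML comment markers embedded in the README. A minimal sketch of the mechanism (mirroring the `tag()` helper of `fidle/catalog_builder.py`, shown later in this commit):

``` python
import re

def tag(tag_name, text, document):
    # Replace whatever sits between <!-- TAG_BEGIN --> and <!-- TAG_END -->
    debut = f'<!-- {tag_name}_BEGIN -->'
    fin   = f'<!-- {tag_name}_END -->'
    return re.sub(f'{debut}.*{fin}', f'{debut}\n{text}\n{fin}', document, flags=re.DOTALL)

readme = "Current Version : <!-- VERSION_BEGIN -->\nold\n<!-- VERSION_END -->"
print(tag('VERSION', '0.6.1 DEV', readme))
```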
%% Cell type:markdown id: tags:
## Step 1 - Load modules
%% Cell type:code id: tags:
``` python
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor
import re
import sys, os, glob
import json
from collections import OrderedDict
sys.path.append('..')
# import fidle.pwk as pwk
import fidle.config as config
import fidle.catalog_builder as builder
```
%% Cell type:markdown id: tags:
## Step 2 - List of folders containing notebooks to be indexed:
%% Cell type:code id: tags:
``` python
directories_to_index = ['LinearReg', 'IRIS', 'BHPD', 'MNIST', 'GTSRB', 'IMDB', 'SYNOP', 'VAE', 'Misc']
```
%% Cell type:markdown id: tags:
## Step 3 - Catalog of notebooks
%% Cell type:code id: tags:
``` python
# ---- Get the notebook list
#
notebooks_list = builder.get_notebooks(directories_to_index)
# ---- Get a detailed catalog for this list
#
catalog = builder.get_catalog(notebooks_list)
with open(config.CATALOG_FILE,'wt') as fp:
    json.dump(catalog,fp,indent=4)
print(f'Catalog saved as {config.CATALOG_FILE}')
```
%% Output
Catalog saved as ../fidle/log/catalog_nb.json
%% Cell type:markdown id: tags:
## Step 4 - README.md
%% Cell type:code id: tags:
``` python
# ---- Create a markdown index
#
lines=['| | |','|--|--|']
tab='&nbsp;'*5
for id, about in catalog.items():
id = about['id']
dirname = about['dirname']
basename = about['basename']
title = about['title']
description = about['description']
# lines.append( f'[[{id}] - {title}]({dirname}/{basename}) ' )
# lines.append( f'{tab}{description} ')
lines.append( f'|{id}| [{title}]({dirname}/{basename})<br>{description}|')
index = '\n'.join(lines)
# ---- Load README.md
#
with open('../README.md','r') as fp:
readme=fp.read()
# ---- Update index, version
#
readme = builder.tag('INDEX', index, readme)
readme = builder.tag('VERSION', config.VERSION, readme)
# ---- Save it
#
with open('../README.md','wt') as fp:
fp.write(readme)
print('README.md is updated.')
```
%% Output
README.md is updated.
%% Cell type:markdown id: tags:
## Step 5 - README.ipynb
Just execute README.ipynb
%% Cell type:raw id: tags:
# ---- Load notebook
#
notebook = nbformat.read('../README.ipynb', nbformat.NO_CONVERT)
# new_cell = nbformat.v4.new_markdown_cell(source=readme)
# notebook.cells.append(new_cell)
# ---- Execute it
#
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(notebook, {'metadata': {'path': '..'}})
# ---- Save it
with open('../READMEv2.ipynb', mode="w", encoding='utf-8') as fp:
nbformat.write(notebook, fp)
%% Cell type:markdown id: tags:
## Step 6 - More fun: Create and execute it!
%% Cell type:markdown id: tags:
For more fun, let's build README.ipynb and execute it :-)
%% Cell type:code id: tags:
``` python
# ---- Create Notebook from scratch
#
notebook = nbformat.v4.new_notebook()
# ---- Add a code cell
#
code = "from IPython.display import display,Markdown\n"
code+= "display(Markdown(open('README.md', 'r').read()))\n"
code+= "#\n"
code+= "# This README is visible under Jupiter LAb ! :-)"
new_cell = nbformat.v4.new_code_cell(source=code)
new_cell['metadata']= { "jupyter": { "source_hidden": True} }
notebook.cells.append(new_cell)
# ---- Run it
#
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(notebook, {'metadata': {'path': '..'}})
# ---- Save it
#
with open('../README.ipynb', mode="w", encoding='utf-8') as fp:
nbformat.write(notebook, fp)
```
%% Cell type:markdown id: tags:
---
<img width="80px" src="../fidle/img/00-Fidle-logo-01.svg"></img>
# -----------------------------------------------------------------------------
# ____ _ _ ____ _ _ _
# / ___|__ _| |_ __ _| | ___ __ _ | __ ) _ _(_) | __| | ___ _ __
# | | / _` | __/ _` | |/ _ \ / _` | | _ \| | | | | |/ _` |/ _ \ '__|
# | |__| (_| | || (_| | | (_) | (_| | | |_) | |_| | | | (_| | __/ |
# \____\__,_|\__\__,_|_|\___/ \__, | |____/ \__,_|_|_|\__,_|\___|_|
# |___/ Module catalog_builder
# -----------------------------------------------------------------------------
#
# A simple module to build the notebook catalog and update the README.
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor
import re
import sys, os, glob
import json
from collections import OrderedDict
def get_notebooks(directories, top_dir='..'):
'''
Return a list of notebooks from a given list of directories
args:
directories : list of directories
top_dir : location of these directories
return:
notebooks : notebooks filename list (without top_dir prefix)
'''
notebooks = []
for d in directories:
filenames = glob.glob( f'{top_dir}/{d}/*.ipynb')
filenames.sort()
notebooks.extend(filenames)
notebooks = [ x.replace(f'{top_dir}/','') for x in notebooks]
return notebooks
def get_infos(filename, top_dir='..'):
'''
Extract information from a fidle notebook.
The dirname, basename, id, title and description are extracted from comment tags in markdown cells.
args:
filename : Notebook filename
return:
dict : with infos.
'''
about={}
about['dirname'] = os.path.dirname(filename)
about['basename'] = os.path.basename(filename)
about['id'] = '??'
about['title'] = '??'
about['description'] = '??'
# ---- Read notebook
#
notebook = nbformat.read(f'{top_dir}/{filename}', nbformat.NO_CONVERT)
# ---- Get id, title and desc tags
#
for cell in notebook.cells:
if cell['cell_type'] == 'markdown':
find = re.findall(r'<\!-- TITLE -->\s*\[(.*)\]\s*-\s*(.*)\n',cell.source)
if find:
about['id'] = find[0][0]
about['title'] = find[0][1]
find = re.findall(r'<\!-- DESC -->\s*(.*)\n',cell.source)
if find:
about['description'] = find[0]
return about
def get_catalog(notebooks_list, top_dir='..'):
'''
Return an OrderedDict of notebooks attributes.
Keys are notebooks id.
args:
notebooks_list : list of notebooks filenames
top_dir : Location of these notebooks
return:
OrderedDict : {<notebook id> : { description} }
'''
catalog = OrderedDict()
for nb in notebooks_list:
about = get_infos(nb, top_dir=top_dir)
id=about['id']
catalog[id] = about
return catalog
def tag(tag_name, text, document):
    debut = f'<!-- {tag_name}_BEGIN -->'
    fin   = f'<!-- {tag_name}_END -->'
output = re.sub(f'{debut}.*{fin}',f'{debut}\n{text}\n{fin}',document, flags=re.DOTALL)
return output
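A minimal usage sketch for this module, assuming it is imported from a notebook sitting one level below the repository root (as the README generator above does); the two-folder subset is purely illustrative:

``` python
import json
import sys
sys.path.append('..')
import fidle.config as config
import fidle.catalog_builder as builder

notebooks = builder.get_notebooks(['LinearReg', 'IRIS'])  # subset, for illustration
catalog   = builder.get_catalog(notebooks)                # OrderedDict keyed by notebook id
with open(config.CATALOG_FILE, 'wt') as fp:
    json.dump(catalog, fp, indent=4)
```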
# ---- Version -----------------------------------------------------
#
VERSION = '0.6.1 DEV'
# ---- Default notebook name ---------------------------------------
#
FINISHED_FILE = '../fidle/log/finished_file.json'
# ---- Catalog file, a json description of all notebooks
#
CATALOG_FILE = '../fidle/log/catalog_nb.json'
(Six SVG icon files — document, notebook and dataset download icons, including 00-Fidle-pdf.svg — raw vector markup omitted.)
{
"LINR1": {
"dirname": "LinearReg",
"basename": "01-Linear-Regression.ipynb",
"id": "LINR1",
"title": "Linear regression with direct resolution",
"description": "Direct determination of linear regression "
},
"GRAD1": {
"dirname": "LinearReg",
"basename": "02-Gradient-descent.ipynb",
"id": "GRAD1",
"title": "Linear regression with gradient descent",
"description": "An example of gradient descent in the simple case of a linear regression."
},
"POLR1": {
"dirname": "LinearReg",
"basename": "03-Polynomial-Regression.ipynb",
"id": "POLR1",
"title": "Complexity Syndrome",
"description": "Illustration of the problem of complexity with the polynomial regression"
},
"LOGR1": {
"dirname": "LinearReg",
"basename": "04-Logistic-Regression.ipynb",
"id": "LOGR1",
"title": "Logistic regression, in pure Tensorflow",
"description": "Logistic Regression with Mini-Batch Gradient Descent using pure TensorFlow. "
},
"PER57": {
"dirname": "IRIS",
"basename": "01-Simple-Perceptron.ipynb",
"id": "PER57",
"title": "Perceptron Model 1957",
"description": "A simple perceptron, with the IRIS dataset."
},
"BHP1": {
"dirname": "BHPD",
"basename": "01-DNN-Regression.ipynb",
"id": "BHP1",
"title": "Regression with a Dense Network (DNN)",
"description": "A Simple regression with a Dense Neural Network (DNN) - BHPD dataset"
},
"BHP2": {
"dirname": "BHPD",
"basename": "02-DNN-Regression-Premium.ipynb",
"id": "BHP2",
"title": "Regression with a Dense Network (DNN) - Advanced code",
"description": "More advanced example of DNN network code - BHPD dataset"
},
"MNIST1": {
"dirname": "MNIST",
"basename": "01-DNN-MNIST.ipynb",
"id": "MNIST1",
"title": "Simple classification with DNN",
"description": "Example of classification with a fully connected neural network"
},
"GTS1": {
"dirname": "GTSRB",
"basename": "01-Preparation-of-data.ipynb",
"id": "GTS1",
"title": "CNN with GTSRB dataset - Data analysis and preparation",
"description": "Episode 1 : Data analysis and creation of a usable dataset"
},
"GTS2": {
"dirname": "GTSRB",
"basename": "02-First-convolutions.ipynb",
"id": "GTS2",
"title": "CNN with GTSRB dataset - First convolutions",
"description": "Episode 2 : First convolutions and first results"
},
"GTS3": {
"dirname": "GTSRB",
"basename": "03-Tracking-and-visualizing.ipynb",
"id": "GTS3",
"title": "CNN with GTSRB dataset - Monitoring ",
"description": "Episode 3 : Monitoring and analysing training, managing checkpoints"
},
"GTS4": {
"dirname": "GTSRB",
"basename": "04-Data-augmentation.ipynb",
"id": "GTS4",
"title": "CNN with GTSRB dataset - Data augmentation ",
"description": "Episode 4 : Improving the results with data augmentation"
},
"GTS5": {
"dirname": "GTSRB",
"basename": "05-Full-convolutions.ipynb",
"id": "GTS5",
"title": "CNN with GTSRB dataset - Full convolutions ",
"description": "Episode 5 : A lot of models, a lot of datasets and a lot of results."
},
"GTS6": {
"dirname": "GTSRB",
"basename": "06-Notebook-as-a-batch.ipynb",
"id": "GTS6",
"title": "CNN with GTSRB dataset - Full convolutions as a batch",
"description": "Episode 6 : Run Full convolution notebook as a batch"
},
"GTS7": {
"dirname": "GTSRB",
"basename": "07-Show-report.ipynb",
"id": "GTS7",
"title": "CNN with GTSRB dataset - Show reports",
"description": "Episode 7 : Displaying the reports of the different jobs"
},
"TSB1": {
"dirname": "GTSRB",
"basename": "99-Scripts-Tensorboard.ipynb",
"id": "TSB1",
"title": "Tensorboard with/from Jupyter ",
"description": "4 ways to use Tensorboard from the Jupyter environment"
},
"IMDB1": {
"dirname": "IMDB",
"basename": "01-Embedding-Keras.ipynb",
"id": "IMDB1",
"title": "Text embedding with IMDB",
"description": "A very classical example of word embedding for text classification (sentiment analysis)"
},
"IMDB2": {
"dirname": "IMDB",
"basename": "02-Prediction.ipynb",
"id": "IMDB2",
"title": "Text embedding with IMDB - Reloaded",
"description": "Example of reusing a previously saved model"
},
"IMDB3": {
"dirname": "IMDB",
"basename": "03-LSTM-Keras.ipynb",
"id": "IMDB3",
"title": "Text embedding/LSTM model with IMDB",
"description": "Still the same problem, but with a network combining embedding and LSTM"
},
"SYNOP1": {
"dirname": "SYNOP",
"basename": "01-Preparation-of-data.ipynb",
"id": "SYNOP1",
"title": "Time series with RNN - Preparation of data",
"description": "Episode 1 : Data analysis and creation of a usable dataset"
},
"SYNOP2": {
"dirname": "SYNOP",
"basename": "02-First-predictions.ipynb",
"id": "SYNOP2",
"title": "Time series with RNN - Try a prediction",
"description": "Episode 2 : Training session and first predictions"
},
"SYNOP3": {
"dirname": "SYNOP",
"basename": "03-12h-predictions.ipynb",
"id": "SYNOP3",
"title": "Time series with RNN - 12h predictions",
"description": "Episode 3: Attempt to predict in the longer term "
},
"VAE1": {
"dirname": "VAE",
"basename": "01-VAE-with-MNIST.nbconvert.ipynb",
"id": "VAE1",
"title": "Variational AutoEncoder (VAE) with MNIST",
"description": "Episode 1 : Model construction and Training"
},
"VAE2": {
"dirname": "VAE",
"basename": "02-VAE-with-MNIST-post.ipynb",
"id": "VAE2",
"title": "Variational AutoEncoder (VAE) with MNIST - Analysis",
"description": "Episode 2 : Exploring our latent space"
},
"VAE3": {
"dirname": "VAE",
"basename": "03-About-CelebA.ipynb",
"id": "VAE3",
"title": "About the CelebA dataset",
"description": "Episode 3\u00a0: About the CelebA dataset, a more fun dataset ;-)"
},
"VAE4": {
"dirname": "VAE",
"basename": "04-Prepare-CelebA-datasets.ipynb",
"id": "VAE4",
"title": "Preparation of the CelebA dataset",
"description": "Episode 4\u00a0: Preparation of a clustered dataset, batchable"
},
"VAE5": {
"dirname": "VAE",
"basename": "05-Check-CelebA.ipynb",
"id": "VAE5",
"title": "Checking the clustered CelebA dataset",
"description": "Episode 5\u00a0:\tChecking the clustered dataset"
},
"VAE6": {
"dirname": "VAE",
"basename": "06-VAE-with-CelebA-s.nbconvert.ipynb",
"id": "VAE6",
"title": "Variational AutoEncoder (VAE) with CelebA (small)",
"description": "Episode 6\u00a0: Variational AutoEncoder (VAE) with CelebA (small res.)"
},
"VAE7": {
"dirname": "VAE",
"basename": "07-VAE-with-CelebA-m.nbconvert.ipynb",
"id": "VAE7",
"title": "Variational AutoEncoder (VAE) with CelebA (medium)",
"description": "Episode 7\u00a0: Variational AutoEncoder (VAE) with CelebA (medium res.)"
},
"VAE8": {
"dirname": "VAE",
"basename": "08-VAE-withCelebA-post.ipynb",
"id": "VAE8",
"title": "Variational AutoEncoder (VAE) with CelebA - Analysis",
"description": "Episode 8\u00a0: Exploring latent space of our trained models"
},
"ACTF1": {
"dirname": "Misc",
"basename": "Activation-Functions.ipynb",
"id": "ACTF1",
"title": "Activation functions",
"description": "Some activation functions, with their derivatives."
},
"NP1": {
"dirname": "Misc",
"basename": "Numpy.ipynb",
"id": "NP1",
"title": "A short introduction to Numpy",
"description": "Numpy is an essential tool for the Scientific Python."
}
}
# ==================================================================
# ____ _ _ _ __ __ _
# | _ \ _ __ __ _ ___| |_(_) ___ __ _| | \ \ / /__ _ __| | __
# | |_) | '__/ _` |/ __| __| |/ __/ _` | | \ \ /\ / / _ \| '__| |/ /
# | __/| | | (_| | (__| |_| | (_| (_| | | \ V V / (_) | | | <
# |_| |_| \__,_|\___|\__|_|\___\__,_|_| \_/\_/ \___/|_| |_|\_\
# module pwk
# ==================================================================
# A simple module to host some common functions for practical work
# Jean-Luc Parouty 2020
import os
import glob
import shutil
import itertools
import datetime, time
import math
import numpy as np
from collections.abc import Iterable
import tensorflow as tf
from tensorflow import keras
from sklearn.metrics import confusion_matrix
import pandas as pd
import matplotlib
import matplotlib.pyplot as plt
try:
    import seaborn as sn
except ImportError:
    sn = None   # IDRIS: seaborn module still being installed
from IPython.display import display,Image,Markdown,HTML
import fidle.config as config
_save_figs = False
_figs_dir = './figs'
_figs_name = 'fig_'
_figs_id = 0
# -------------------------------------------------------------
# init_all
# -------------------------------------------------------------
#
def init( mplstyle='../fidle/mplstyles/custom.mplstyle',
cssfile='../fidle/css/custom.css',
places={ 'SOMEWHERE' : '/path/to/datasets'}):
update_keras_cache=False
# ---- Predefined places
#
predefined_places = config.locations
# ---- Load matplotlib style and css
#
matplotlib.style.use(mplstyle)
load_cssfile(cssfile)
# ---- Create subdirs
#
mkdir('./run')
# ---- Try to find where we are
#
place_name, dataset_dir = where_we_are({**places, **predefined_places})
# ---- If we are at IDRIS, we need to copy datasets/keras_cache to keras cache...
#
if place_name=='Fidle at IDRIS':
from_dir = f'{dataset_dir}/keras_cache/*.*'
to_dir = os.path.expanduser('~/.keras/datasets')
mkdir(to_dir)
for pathname in glob.glob(from_dir):
filename=os.path.basename(pathname)
destname=f'{to_dir}/{filename}'
if not os.path.isfile(destname):
shutil.copy(pathname, destname)
update_keras_cache=True
# ---- Hello world
print('\nFIDLE 2020 - Practical Work Module')
print('Version :', config.VERSION)
print('Run time : {}'.format(time.strftime("%A %-d %B %Y, %H:%M:%S")))
print('TensorFlow version :',tf.__version__)
print('Keras version :',tf.keras.__version__)
print('Current place :',place_name )
print('Datasets dir :',dataset_dir)
if update_keras_cache:
print('Update keras cache : Done')
return place_name, dataset_dir
# -------------------------------------------------------------
# Folder cooking
# -------------------------------------------------------------
#
def tag_now():
return datetime.datetime.now().strftime("%Y-%m-%d_%Hh%Mm%Ss")
def mkdir(path):
os.makedirs(path, mode=0o750, exist_ok=True)
def get_directory_size(path):
"""
Return the directory size, but only 1 level
args:
path : directory path
return:
size in Mo
"""
size=0
for f in os.listdir(path):
if os.path.isfile(path+'/'+f):
size+=os.path.getsize(path+'/'+f)
return size/(1024*1024)
# ------------------------------------------------------------------
# Where we are ?
# ------------------------------------------------------------------
#
def where_we_are(places):
for place_name, place_dir in places.items():
if os.path.isdir(place_dir):
return place_name,place_dir
print('** ERROR ** : The datasets folder cannot be found\n')
print('   You must:\n')
print('   1/ Get the datasets folder')
print('      An archive (datasets.tar) is available via the Fidle repo.\n')
print("   2/ Specify the location of this datasets folder")
print("      Either in the file fidle/config.py (preferred)")
print("      Or via a parameter to the ooo.init() function\n")
print('   For example:')
print("      ooo.init( places={ 'Chez-moi':'/tmp/datasets', 'Sur-mon-cluster':'/tests/datasets'})\n")
print('   Note: You can also put the datasets folder directly in your home: ~/datasets\n\n')
assert False, 'datasets folder not found : Abort all.'
# -------------------------------------------------------------
# shuffle_dataset
# -------------------------------------------------------------
#
def shuffle_np_dataset(x, y):
"""
Shuffle a dataset (x,y)
args:
x,y : dataset
return:
x,y mixed
"""
assert (len(x) == len(y)), "x and y must have same size"
p = np.random.permutation(len(x))
return x[p], y[p]
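# Usage example (x_train/y_train are assumed numpy arrays of equal length):
#   x_train, y_train = shuffle_np_dataset(x_train, y_train)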
def update_progress(what,i,imax, redraw=False):
"""
Display a text progress bar, as :
My progress bar : ############# 34%
args:
what : Progress bar name
i : Current progress
imax : Max value for i
return:
nothing
"""
bar_length = min(40,imax)
if (i%int(imax/bar_length))!=0 and i<imax and not redraw:
return
progress = float(i/imax)
block = int(round(bar_length * progress))
endofline = '\r' if progress<1 else '\n'
text = "{:16s} [{}] {:>5.1f}% of {}".format( what, "#"*block+"-"*(bar_length-block), progress*100, imax)
print(text, end=endofline)
def rmax(l):
"""
Recursive max() for a given iterable of iterables
Should be np.array of np.array or list of list, etc.
args:
l : Iterable of iterables
return:
max value
"""
maxi = float('-inf')
for item in l:
if isinstance(item, Iterable):
t = rmax(item)
else:
t = item
if t > maxi:
maxi = t
return maxi
def rmin(l):
"""
Recursive min() for a given iterable of iterables
Should be np.array of np.array or list of list, etc.
args:
l : Iterable of iterables
return:
min value
"""
mini = float('inf')
for item in l:
if isinstance(item, Iterable):
t = rmin(item)
else:
t = item
if t < mini:
mini = t
return mini
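# Usage examples (illustrative values):
#   rmax([[1, 9], [3, [4, 12]]])   # -> 12
#   rmin([[1, 9], [3, [4, 12]]])   # -> 1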
# -------------------------------------------------------------
# show_images
# -------------------------------------------------------------
#
def plot_images(x,y=None, indices='all', columns=12, x_size=1, y_size=1,
colorbar=False, y_pred=None, cm='binary',y_padding=0.35, spines_alpha=1,
fontsize=20, save_as='auto'):
"""
Show some images in a grid, with legends
args:
x : images - Shapes must be (-1,lx,ly) (-1,lx,ly,1) or (-1,lx,ly,3)
y : real classes or labels or None (None)
indices : indices of images to show or None for all (None)
columns : number of columns (12)
x_size,y_size : figure size (1), (1)
colorbar : show colorbar (False)
y_pred : predicted classes (None)
cm : Matplotlib color map (binary)
y_padding : Padding / rows (0.35)
spines_alpha : Spines alpha (1.)
fontsize : Font size in px (20)
save_as : Filename to use if save figs is enabled ('auto')
returns:
nothing
"""
if indices=='all': indices=range(len(x))
draw_labels = (y is not None)
draw_pred = (y_pred is not None)
rows = math.ceil(len(indices)/columns)
fig=plt.figure(figsize=(columns*x_size, rows*(y_size+y_padding)))
n=1
for i in indices:
axs=fig.add_subplot(rows, columns, n)
n+=1
# ---- Shape is (lx,ly)
if len(x[i].shape)==2:
xx=x[i]
# ---- Shape is (lx,ly,n)
if len(x[i].shape)==3:
(lx,ly,lz)=x[i].shape
if lz==1:
xx=x[i].reshape(lx,ly)
else:
xx=x[i]
img=axs.imshow(xx, cmap = cm, interpolation='lanczos')
axs.spines['right'].set_visible(True)
axs.spines['left'].set_visible(True)
axs.spines['top'].set_visible(True)
axs.spines['bottom'].set_visible(True)
axs.spines['right'].set_alpha(spines_alpha)
axs.spines['left'].set_alpha(spines_alpha)
axs.spines['top'].set_alpha(spines_alpha)
axs.spines['bottom'].set_alpha(spines_alpha)
axs.set_yticks([])
axs.set_xticks([])
if draw_labels and not draw_pred:
axs.set_xlabel(y[i],fontsize=fontsize)
if draw_labels and draw_pred:
if y[i]!=y_pred[i]:
axs.set_xlabel(f'{y_pred[i]} ({y[i]})',fontsize=fontsize)
axs.xaxis.label.set_color('red')
else:
axs.set_xlabel(y[i],fontsize=fontsize)
if colorbar:
fig.colorbar(img,orientation="vertical", shrink=0.65)
save_fig(save_as)
plt.show()
def plot_image(x,cm='binary', figsize=(4,4),save_as='auto'):
"""
Draw a single image.
Image shape can be (lx,ly), (lx,ly,1) or (lx,ly,n)
args:
x : image as np array
cm : color map ('binary')
figsize : fig size (4,4)
"""
# ---- Shape is (lx,ly)
if len(x.shape)==2:
xx=x
# ---- Shape is (lx,ly,n)
if len(x.shape)==3:
(lx,ly,lz)=x.shape
if lz==1:
xx=x.reshape(lx,ly)
else:
xx=x
# ---- Draw it
plt.figure(figsize=figsize)
plt.imshow(xx, cmap = cm, interpolation='lanczos')
save_fig(save_as)
plt.show()
# -------------------------------------------------------------
# show_history
# -------------------------------------------------------------
#
def plot_history(history, figsize=(8,6),
plot={"Accuracy":['accuracy','val_accuracy'], 'Loss':['loss', 'val_loss']},
save_as='auto'):
"""
Show history
args:
history: history
figsize: fig size
plot: list of data to plot : {<title>:[<metrics>,...], ...}
"""
fig_id=0
for title,curves in plot.items():
plt.figure(figsize=figsize)
plt.title(title)
plt.ylabel(title)
plt.xlabel('Epoch')
for c in curves:
plt.plot(history.history[c])
plt.legend(curves, loc='upper left')
if save_as=='auto':
figname='auto'
else:
figname=f'{save_as}_{fig_id}'
fig_id+=1
save_fig(figname)
plt.show()
# -------------------------------------------------------------
# plot_confusion_matrix
# -------------------------------------------------------------
# Bug in Matplotlib 3.1.1
#
def plot_confusion_matrix(cm,
title='Confusion matrix',
figsize=(12,8),
cmap="gist_heat_r",
vmin=0,
vmax=1,
xticks=5,yticks=5,
annot=True,
save_as='auto'):
"""
Given a sklearn confusion matrix (cm), make a nice plot.
Note: bug in matplotlib 3.1.1
Args:
cm: confusion matrix from sklearn.metrics.confusion_matrix
title: the text to display at the top of the matrix
figsize: Figure size (12,8)
cmap: color map (gist_heat_r)
vmin,vmax: Min/max values (0 and 1)
annot: Annotation or just colors (True)
"""
accuracy = np.trace(cm) / float(np.sum(cm))
misclass = 1 - accuracy
plt.figure(figsize=figsize)
sn.heatmap(cm, linewidths=1, linecolor="#ffffff",square=True,
cmap=cmap, xticklabels=xticks, yticklabels=yticks,
vmin=vmin,vmax=vmax,annot=annot)
plt.ylabel('True label')
plt.xlabel('Predicted label\naccuracy={:0.4f}; misclass={:0.4f}'.format(accuracy, misclass))
save_fig(save_as)
plt.show()
def display_confusion_matrix(y_true,y_pred,labels=None,color='green',
font_size='12pt', title="#### Confusion matrix is :"):
"""
Show a confusion matrix for predictions.
see : sklearn.metrics.confusion_matrix
Args:
y_true Real classes
y_pred Predicted classes
labels List of classes to show in the cm
color: Color for the palette (green)
font_size: Values font size
title: the text to display at the top of the matrix
"""
assert (labels is not None), "Labels must be set"
if title is not None : display(Markdown(title))
cm = confusion_matrix( y_true,y_pred, normalize="true", labels=labels)
df=pd.DataFrame(cm)
cmap = sn.light_palette(color, as_cmap=True)
df.style.set_properties(**{'font-size': '20pt'})
display(df.style.format('{:.2f}') \
.background_gradient(cmap=cmap)
.set_properties(**{'font-size': font_size}))
def plot_donut(values, labels, colors=["lightsteelblue","coral"], figsize=(6,6), title=None, save_as='auto'):
"""
Draw a donut
args:
values : list of values
labels : list of labels
colors : list of color (["lightsteelblue","coral"])
figsize : size of figure ( (6,6) )
return:
nothing
"""
# ---- Title or not
if title is not None : display(Markdown(title))
# ---- Donut
plt.figure(figsize=figsize)
# ---- Draw a pie chart..
plt.pie(values, labels=labels,
colors = colors, autopct='%1.1f%%', startangle=70, pctdistance=0.85,
textprops={'fontsize': 18},
wedgeprops={"edgecolor":"w",'linewidth': 5, 'linestyle': 'solid', 'antialiased': True})
# ---- ..with a white circle
circle = plt.Circle((0,0),0.70,fc='white')
ax = plt.gca()
ax.add_artist(circle)
# Equal aspect ratio ensures that pie is drawn as a circle
plt.axis('equal')
plt.tight_layout()
save_fig(save_as)
plt.show()
def plot_multivariate_serie(sequence, labels=None, predictions=None, only_features=None,
columns=3, width=5,height=4,wspace=0.3,hspace=0.2,
save_as='auto', time_dt=1):
sequence_len = len(sequence)
features_len = sequence.shape[1]
if only_features is None : only_features=range(features_len)
if labels is None : labels=range(features_len)
t = np.arange(sequence_len)
if predictions is None:
dt = 0
else:
dt = len(predictions)
sequence_with_pred = sequence.copy()
sequence_with_pred[-dt:]=predictions
rows = math.ceil(features_len/columns)
fig = plt.figure(figsize=(columns*width, rows*height))
fig.subplots_adjust(wspace=0.3,hspace=0.2)
n=1
for i in only_features:
    ax=fig.add_subplot(rows, columns, n)
    end = sequence_len - dt          # ---- t[:-dt] would be empty when dt==0
    ax.plot(t[:end], sequence[:end,i], '-', linewidth=1, color='steelblue', label=labels[i])
    ax.plot(t[:end], sequence[:end,i], 'o', markersize=4, color='steelblue')
    ax.plot(t[end-1:], sequence[end-1:,i],'--o', linewidth=1, fillstyle='none', markersize=6, color='steelblue')
    if predictions is not None:
        ax.plot(t[end-1:], sequence_with_pred[end-1:,i], '--', linewidth=1, fillstyle='full', markersize=6, color='red')
        ax.plot(t[end:], predictions[:,i], 'o', linewidth=1, fillstyle='full', markersize=6, color='red')
ax.legend(loc="upper left")
n+=1
save_fig(save_as)
plt.show()
def set_save_fig(save=True, figs_dir='./figs', figs_name='fig_', figs_id=0):
"""
Set save_fig parameters
Default figs name is <figs_name><figs_id>.{png|svg}
args:
save : Boolean, True to save figs (True)
figs_dir : Path to save figs (./figs)
figs_name : Default basename for figs (figs_)
figs_id : Start id for figs name (0)
"""
global _save_figs, _figs_dir, _figs_name, _figs_id
_save_figs = save
_figs_dir = figs_dir
_figs_name = figs_name
_figs_id = figs_id
print(f'Save figs : {_save_figs}')
print(f'Path figs : {_figs_dir}')
def save_fig(filename='auto', png=True, svg=False):
"""
Save current figure
args:
filename : Image filename ('auto')
png : Boolean. Save as png if True (True)
svg : Boolean. Save as svg if True (False)
"""
global _save_figs, _figs_dir, _figs_name, _figs_id
if not _save_figs : return
mkdir(_figs_dir)
if filename=='auto':
path=f'{_figs_dir}/{_figs_name}{_figs_id:02d}'
else:
path=f'{_figs_dir}/{filename}'
if png : plt.savefig( f'{path}.png')
if svg : plt.savefig( f'{path}.svg')
if filename=='auto': _figs_id+=1
def subtitle(t):
display(Markdown(f'<br>**{t}**'))
def display_md(md_text):
display(Markdown(md_text))
def display_img(img):
display(Image(img))
def hdelay(sec):
return str(datetime.timedelta(seconds=int(sec)))
def hsize(num, suffix='o'):
for unit in ['','K','M','G','T','P','E','Z']:
if abs(num) < 1024.0:
return f'{num:3.1f} {unit}{suffix}'
num /= 1024.0
return f'{num:.1f} Y{suffix}'
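# Usage examples:
#   hdelay(3661)       # -> '1:01:01'
#   hsize(123456789)   # -> '117.7 Mo'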
def load_cssfile(cssfile):
if cssfile is None: return
styles = open(cssfile, "r").read()
display(HTML(styles))
def np_print(*args, format={'float': '{:6.3f}'.format}):
with np.printoptions(formatter=format):
for a in args:
print(a)
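A sketch of a typical notebook session with this module; the place name and datasets path below are illustrative assumptions (init() merges them with the predefined places of fidle/config.py):

``` python
import sys
sys.path.append('..')
import fidle.pwk as pwk

# Locate the datasets folder, load the custom matplotlib style and css
place, datasets_dir = pwk.init(places={'Chez-moi': '/tmp/datasets'})

# Save subsequent figures under ./figs as fig_00.png, fig_01.png, ...
pwk.set_save_fig(save=True)

# Then, e.g. : pwk.plot_images(x_train, y_train, indices=range(12), columns=6)
```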