Commit fdef5d0f authored by Jean-Luc Parouty

Update Fidle a distance

parent f8875cad
%% Cell type:code id: tags:
``` python
from IPython.display import display,Markdown
display(Markdown(open('README.md', 'r').read()))
#
# This README is visible under Jupyter Lab! :-)
```
%% Output
<a name="top"></a>
[<img width="600px" src="fidle/img/00-Fidle-titre-01.svg"></img>](#top)
<!-- --------------------------------------------------- -->
<!-- To correctly view this README under Jupyter Lab -->
<!-- Open the notebook: README.ipynb! -->
<!-- --------------------------------------------------- -->
## Fidle remote sessions (NEW!)
## About
Since we are unable to organize in-person sessions,\
we are offering you a **remote session** :-)
**- Next session -**
|[<img width="100px" src="fidle/img/00-Fidle-a-distance-01.svg"></img>](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Fidle%20%C3%A0%20distance/En%20bref)<br>**Thursday, February 4, 2 pm:**<br>Episode 1 : **Introduction to the cycle, history and fundamental concepts**|
|:---|
|Loss function - Gradient descent - Optimization - Hyperparameters<br>Data preparation - Training - Validation - Under- and over-fitting<br>Activation functions - softmax<br>Hands-on: Regression and classification with DNNs|
|Duration: 3h - Broadcast details announced 2 days beforehand|
More about **[Fidle à distance](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Fidle%20%C3%A0%20distance/En%20bref)**\
See the [program](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Fidle%20%C3%A0%20distance/Pr%C3%A9sentation#programme-)
## About Fidle
This repository contains all the documents and links of the **Fidle Training**.
Fidle (for Formation Introduction au Deep Learning) is a 2-day training session
co-organized by the Formation Permanente CNRS and the Resinfo/SARI and DevLOG CNRS networks.
The objectives of this training are:
- Understand the **basics of Deep Learning** neural networks
- Develop a **first experience** through simple and representative examples
- Understand the **TensorFlow/Keras** and **Jupyter Lab** technologies
- Become familiar with the **academic computing environments** (Tier-2 or Tier-1) and their powerful GPUs
For more information, you can contact us at:
[<img width="200px" style="vertical-align:middle" src="fidle/img/00-Mail_contact.svg"></img>](#top)
Current Version : <!-- VERSION_BEGIN -->
2.0.1
<!-- VERSION_END -->
## Course materials
| | | |
|:--:|:--:|:--:|
| **[<img width="50px" src="fidle/img/00-Fidle-pdf.svg"></img><br>Course slides](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>The course in PDF format<br>(12 MB)| **[<img width="50px" src="fidle/img/00-Notebooks.svg"></img><br>Notebooks](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/archive/master/fidle-master.zip)**<br> &nbsp;&nbsp;&nbsp;&nbsp;Get a zip or clone this repository &nbsp;&nbsp;&nbsp;&nbsp;<br>(10 MB)| **[<img width="50px" src="fidle/img/00-Datasets-tar.svg"></img><br>Datasets](https://cloud.univ-grenoble-alpes.fr/index.php/s/wxCztjYBbQ6zwd6)**<br>All the needed datasets<br>(1.2 GB)|
Have a look at **[How to get and install](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)** these notebooks and datasets.
## Jupyter notebooks
<!-- INDEX_BEGIN -->
### Linear and logistic regression
- **[LINR1](LinearReg/01-Linear-Regression.ipynb)** - [Linear regression with direct resolution](LinearReg/01-Linear-Regression.ipynb)
Low-level implementation, using numpy, of a direct resolution for a linear regression
- **[GRAD1](LinearReg/02-Gradient-descent.ipynb)** - [Linear regression with gradient descent](LinearReg/02-Gradient-descent.ipynb)
Low-level implementation of a solution by gradient descent, with both basic and stochastic approaches.
- **[POLR1](LinearReg/03-Polynomial-Regression.ipynb)** - [Complexity Syndrome](LinearReg/03-Polynomial-Regression.ipynb)
Illustration of the problem of complexity, using polynomial regression
- **[LOGR1](LinearReg/04-Logistic-Regression.ipynb)** - [Logistic regression](LinearReg/04-Logistic-Regression.ipynb)
Simple example of logistic regression, with a scikit-learn solution
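For intuition, here is a minimal sketch of the two approaches explored in LINR1 and GRAD1: a direct (normal equation) resolution, then gradient descent on the same least-squares problem. The synthetic data and hyperparameters are assumptions for illustration, not the notebooks' actual code.
``` python
import numpy as np

# Synthetic data (assumption, for illustration): y = 3x + 2 + noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(100, 1))
y = 3 * x + 2 + rng.normal(0, 1, size=(100, 1))

# Direct resolution (normal equation): theta = (X^T X)^-1 X^T y
X = np.hstack([np.ones_like(x), x])              # add a bias column
theta_direct = np.linalg.inv(X.T @ X) @ X.T @ y

# Gradient descent on the same mean-squared-error loss
theta, lr = np.zeros((2, 1)), 0.01
for _ in range(2000):
    grad = 2 / len(X) * X.T @ (X @ theta - y)    # gradient of the MSE
    theta -= lr * grad

print(theta_direct.ravel(), theta.ravel())       # both close to [2, 3]
```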
### Perceptron Model 1957
- **[PER57](IRIS/01-Simple-Perceptron.ipynb)** - [Perceptron Model 1957](IRIS/01-Simple-Perceptron.ipynb)
Example of use of a Perceptron, with scikit-learn and the IRIS dataset of 1936!
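As a hedged sketch of what such a perceptron experiment can look like with scikit-learn (the feature and class selection below is an assumption, not the notebook's exact setup):
``` python
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

# Keep two classes and two features so the problem stays linearly separable
X, y = load_iris(return_X_y=True)
X, y = X[y < 2, :2], y[y < 2]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = Perceptron(max_iter=100, random_state=0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```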
### Basic regression using DNN
- **[BHPD1](BHPD/01-DNN-Regression.ipynb)** - [Regression with a Dense Network (DNN)](BHPD/01-DNN-Regression.ipynb)
Simple example of a regression with the Boston Housing Prices Dataset (BHPD)
- **[BHPD2](BHPD/02-DNN-Regression-Premium.ipynb)** - [Regression with a Dense Network (DNN) - Advanced code](BHPD/02-DNN-Regression-Premium.ipynb)
A more advanced implementation of the previous example
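A minimal sketch of such a dense regression network with Keras, assuming the Boston Housing data shipped with Keras and an arbitrary 64/64/1 architecture (not necessarily the notebooks' exact model):
``` python
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data()

# Normalize features with the training-set statistics
mean, std = x_train.mean(axis=0), x_train.std(axis=0)
x_train, x_test = (x_train - mean) / std, (x_test - mean) / std

model = keras.Sequential([
    layers.Dense(64, activation='relu', input_shape=(x_train.shape[1],)),
    layers.Dense(64, activation='relu'),
    layers.Dense(1)                              # one continuous output: the price
])
model.compile(optimizer='adam', loss='mse', metrics=['mae'])
model.fit(x_train, y_train, epochs=100, validation_split=0.2, verbose=0)
print(model.evaluate(x_test, y_test, verbose=0))
```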
### Basic classification using a DNN
- **[MNIST1](MNIST/01-DNN-MNIST.ipynb)** - [Simple classification with DNN](MNIST/01-DNN-MNIST.ipynb)
An example of classification using a dense neural network for the famous MNIST dataset
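A hedged sketch of a dense MNIST classifier (layer sizes and the number of epochs are assumptions for illustration):
``` python
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0    # scale pixels to [0, 1]

model = keras.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(128, activation='relu'),
    layers.Dense(10, activation='softmax')           # one probability per digit
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, verbose=0)
print(model.evaluate(x_test, y_test, verbose=0))
```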
### Images classification with Convolutional Neural Networks (CNN)
- **[GTSRB1](GTSRB/01-Preparation-of-data.ipynb)** - [Dataset analysis and preparation](GTSRB/01-Preparation-of-data.ipynb)
Episode 1 : Analysis of the GTSRB dataset and creation of an enhanced dataset
- **[GTSRB2](GTSRB/02-First-convolutions.ipynb)** - [First convolutions](GTSRB/02-First-convolutions.ipynb)
Episode 2 : First convolutions and first classification of our traffic signs
- **[GTSRB3](GTSRB/03-Tracking-and-visualizing.ipynb)** - [Training monitoring](GTSRB/03-Tracking-and-visualizing.ipynb)
Episode 3 : Monitoring, analysis and check points during a training session
- **[GTSRB4](GTSRB/04-Data-augmentation.ipynb)** - [Data augmentation](GTSRB/04-Data-augmentation.ipynb)
Episode 4 : Using data augmentation to compensate for a lack of data and improve our results
- **[GTSRB5](GTSRB/05-Full-convolutions.ipynb)** - [Full convolutions](GTSRB/05-Full-convolutions.ipynb)
Episode 5 : A lot of models, a lot of datasets and a lot of results.
- **[GTSRB6](GTSRB/06-Notebook-as-a-batch.ipynb)** - [Full convolutions as a batch](GTSRB/06-Notebook-as-a-batch.ipynb)
Episode 6 : To compute at larger scale, run your notebook in batch mode
- **[GTSRB7](GTSRB/07-Show-report.ipynb)** - [Batch reports](GTSRB/07-Show-report.ipynb)
Episode 7 : Displaying our jobs report, and the winner is...
- **[GTSRB10](GTSRB/batch_oar.sh)** - [OAR batch script submission](GTSRB/batch_oar.sh)
Bash script for an OAR batch submission of IPython code
- **[GTSRB11](GTSRB/batch_slurm.sh)** - [SLURM batch script](GTSRB/batch_slurm.sh)
Bash script for a Slurm batch submission of IPython code
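To fix ideas about the kind of model these episodes build, here is a hedged sketch of a small CNN with on-the-fly data augmentation. It assumes 32x32 RGB images and the 43 GTSRB classes, with x_train and y_train prepared elsewhere; it is not the notebooks' actual architecture.
``` python
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.preprocessing.image import ImageDataGenerator

model = keras.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(43, activation='softmax')           # 43 traffic-sign classes
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Data augmentation: random shifts, rotations and zooms generated on the fly
datagen = ImageDataGenerator(width_shift_range=0.1, height_shift_range=0.1,
                             rotation_range=10, zoom_range=0.1)
# model.fit(datagen.flow(x_train, y_train, batch_size=64), epochs=10)
```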
### Sentiment analysis with word embedding
- **[IMDB1](IMDB/01-Embedding-Keras.ipynb)** - [Sentiment analysis with text embedding](IMDB/01-Embedding-Keras.ipynb)
A very classical example of word embedding with a dataset from Internet Movie Database (IMDB)
- **[IMDB2](IMDB/02-Prediction.ipynb)** - [Reload and reuse a saved model](IMDB/02-Prediction.ipynb)
Retrieving a saved model to perform a sentiment analysis (movie review)
- **[IMDB3](IMDB/03-LSTM-Keras.ipynb)** - [Sentiment analysis with a LSTM network](IMDB/03-LSTM-Keras.ipynb)
Still the same problem, but with a network combining embedding and LSTM
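A hedged sketch of the embedding + LSTM combination on the IMDB data shipped with Keras (vocabulary size, sequence length and layer sizes are assumptions):
``` python
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, maxlen = 10000, 256

(x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = keras.preprocessing.sequence.pad_sequences(x_train, maxlen=maxlen)
x_test  = keras.preprocessing.sequence.pad_sequences(x_test,  maxlen=maxlen)

model = keras.Sequential([
    layers.Embedding(vocab_size, 32),        # each word index -> a dense 32-d vector
    layers.LSTM(32),
    layers.Dense(1, activation='sigmoid')    # positive / negative review
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=3, validation_split=0.2, verbose=0)
```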
### Time series with Recurrent Neural Network (RNN)
- **[SYNOP1](SYNOP/01-Preparation-of-data.ipynb)** - [Preparation of data](SYNOP/01-Preparation-of-data.ipynb)
Episode 1 : Data analysis and preparation of a meteorological dataset (SYNOP)
- **[SYNOP2](SYNOP/02-First-predictions.ipynb)** - [First predictions at 3h](SYNOP/02-First-predictions.ipynb)
Episode 2 : Training session and first attempt at predicting the weather 3h ahead
- **[SYNOP3](SYNOP/03-12h-predictions.ipynb)** - [12h predictions](SYNOP/03-12h-predictions.ipynb)
Episode 3 : Attempt to predict over a longer term
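The core mechanics of these episodes is turning a time series into (window, next value) samples and fitting a recurrent model. A minimal sketch, with a synthetic series standing in for a SYNOP variable (an assumption):
``` python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

series = np.sin(np.linspace(0, 50, 2000)).astype('float32')   # stand-in series
window = 24

X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]                       # shape: (samples, timesteps, features)

model = keras.Sequential([
    layers.LSTM(32, input_shape=(window, 1)),
    layers.Dense(1)                          # predict the next time step
])
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=5, verbose=0)
```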
### Unsupervised learning with an autoencoder neural network (AE)
- **[AE1](AE/01-AE-with-MNIST.ipynb)** - [Building and training an AE denoiser model](AE/01-AE-with-MNIST.ipynb)
Episode 1 : After construction, the model is trained with noisy data from the MNIST dataset.
- **[AE2](AE/02-AE-with-MNIST-post.ipynb)** - [Exploring our denoiser model](AE/02-AE-with-MNIST-post.ipynb)
Episode 2 : Using the previously trained autoencoder to denoise data
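A hedged sketch of the denoising idea: train an autoencoder to map noisy inputs back to clean ones (the dense layers and noise level below are assumptions, not the notebooks' model):
``` python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

(x_train, _), _ = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784) / 255.0
x_noisy = np.clip(x_train + 0.3 * np.random.normal(size=x_train.shape), 0.0, 1.0)

autoencoder = keras.Sequential([
    layers.Dense(64, activation='relu', input_shape=(784,)),   # encoder
    layers.Dense(16, activation='relu'),                       # latent code
    layers.Dense(64, activation='relu'),                       # decoder
    layers.Dense(784, activation='sigmoid')
])
autoencoder.compile(optimizer='adam', loss='binary_crossentropy')
autoencoder.fit(x_noisy, x_train, epochs=10, verbose=0)        # noisy in, clean out
```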
### Generative network with Variational Autoencoder (VAE)
- **[VAE1](VAE/01-VAE-with-MNIST.ipynb)** - [First VAE, with a small dataset (MNIST)](VAE/01-VAE-with-MNIST.ipynb)
Construction and training of a VAE with a latent space of small dimension.
- **[VAE2](VAE/02-VAE-with-MNIST-post.ipynb)** - [Analysis of the associated latent space](VAE/02-VAE-with-MNIST-post.ipynb)
Visualization and analysis of the VAE's latent space
- **[VAE5](VAE/05-About-CelebA.ipynb)** - [Another game play : About the CelebA dataset](VAE/05-About-CelebA.ipynb)
Episode 1 : Presentation of the CelebA dataset and problems related to its size
- **[VAE6](VAE/06-Prepare-CelebA-datasets.ipynb)** - [Generation of a clustered dataset](VAE/06-Prepare-CelebA-datasets.ipynb)
Episode 2 : Analysis of the CelebA dataset and creation of a clustered and usable dataset
- **[VAE7](VAE/07-Check-CelebA.ipynb)** - [Checking the clustered dataset](VAE/07-Check-CelebA.ipynb)
Episode 3 : Clustered dataset verification and testing of our data generator
- **[VAE8](VAE/08-VAE-with-CelebA.ipynb)** - [Training session for our VAE](VAE/08-VAE-with-CelebA.ipynb)
Episode 4 : Training with our clustered datasets in notebook or batch mode
- **[VAE9](VAE/09-VAE-withCelebA-post.ipynb)** - [Data generation from latent space](VAE/09-VAE-withCelebA-post.ipynb)
Episode 5 : Exploring latent space to generate new data
- **[VAE10](VAE/batch_slurm.sh)** - [SLURM batch script](VAE/batch_slurm.sh)
Bash script for SLURM batch submission of VAE8 notebooks
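What distinguishes a VAE from a plain autoencoder is the stochastic latent space and the reparameterization trick that keeps sampling differentiable. A hedged sketch of that single ingredient (a custom Sampling layer is one common way to write it, not necessarily the notebooks'):
``` python
import tensorflow as tf
from tensorflow.keras import layers

class Sampling(layers.Layer):
    """Draw z = mu + sigma * eps, so gradients can flow through mu and log(var)."""
    def call(self, inputs):
        z_mean, z_log_var = inputs
        epsilon = tf.random.normal(shape=tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * epsilon

# The KL divergence term added to the reconstruction loss during training:
# kl = -0.5 * tf.reduce_mean(1 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
```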
### Miscellaneous
- **[ACTF1](Misc/Activation-Functions.ipynb)** - [Activation functions](Misc/Activation-Functions.ipynb)
Some activation functions, with their derivatives.
- **[NP1](Misc/Numpy.ipynb)** - [A short introduction to Numpy](Misc/Numpy.ipynb)
Numpy is an essential tool of the scientific Python ecosystem.
- **[TSB1](Misc/Using-Tensorboard.ipynb)** - [Tensorboard with/from Jupyter ](Misc/Using-Tensorboard.ipynb)
4 ways to use Tensorboard from the Jupyter environment
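As a small companion to ACTF1, a hedged numpy sketch of a few classical activation functions and their derivatives (illustrative, not the notebook's exact code):
``` python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    return np.maximum(0.0, x)

def d_relu(x):
    return (x > 0).astype(float)

def softmax(x):
    e = np.exp(x - np.max(x))                # shift for numerical stability
    return e / e.sum()

print(softmax(np.array([1.0, 2.0, 3.0])))    # the outputs sum to 1
```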
<!-- INDEX_END -->
## Installation
A procedure for **configuring** and **starting Jupyter** is available in the **[Wiki](https://gricad-gitlab.univ-grenoble-alpes.fr/talks/fidle/-/wikis/Install-Fidle)**.
## Licence
[<img width="100px" src="fidle/img/00-fidle-CC BY-NC-SA.svg"></img>](https://creativecommons.org/licenses/by-nc-sa/4.0/)
\[en\] Attribution - NonCommercial - ShareAlike 4.0 International (CC BY-NC-SA 4.0)
\[Fr\] Attribution - Pas d’Utilisation Commerciale - Partage dans les Mêmes Conditions 4.0 International
See [License](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode).
See [Disclaimer](https://creativecommons.org/licenses/by-nc-sa/4.0/#).
----
[<img width="80px" src="fidle/img/00-Fidle-logo-01.svg"></img>](#top)