# Week 1 (27/01/2020):
- We first clarified what was being asked of us, then quickly moved on to installing and getting an introduction to the tools we need (such as nMigen).

##### About nMigen:
- We created a file `test.py` to try out some basic features, like creating a signal and modifying it inside a module.


# Week 2 (03/02/2020):
- We continued our learning of nMigen and tried to represent floats with it, in order to properly implement the perceptron algorithm.
- However, the float representation problem remains unsolved for now.
- We also thought about the use of I/O in nMigen, and how to use it with the FPGA.
- Furthermore, we gathered more information about the perceptron.
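
The float problem above can be sidestepped with fixed-point arithmetic, since hardware signals are ultimately just integers. A rough sketch of the idea in plain Python (the helper names and the choice of 8 fractional bits are ours, not project code):

```python
# Fixed-point sketch: scale every float by 2**FRAC and work with integers.
FRAC = 8  # number of fractional bits (an illustrative choice)

def to_fixed(x):
    """Encode a float as a fixed-point integer."""
    return round(x * (1 << FRAC))

def from_fixed(n):
    """Decode a fixed-point integer back to a float."""
    return n / (1 << FRAC)

def fixed_mul(a, b):
    """Multiply two fixed-point values; shift back because the raw
    product carries 2*FRAC fractional bits."""
    return (a * b) >> FRAC

w = to_fixed(0.75)   # 192
x = to_fixed(2.0)    # 512
print(from_fixed(fixed_mul(w, x)))  # 1.5
```

The multiplications and shifts map directly onto hardware operators, which is why this encoding is a common workaround.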


# Week 3 (10/02/2020):
- We continued our research on the perceptron algorithm (following [this link](https://machinelearningmastery.com/implement-perceptron-algorithm-scratch-python/)).
- A first implementation of the algorithm, with and without nMigen, is under development.
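
For reference, the algorithm in the linked tutorial boils down to a step-activation neuron trained with the perceptron update rule. A minimal sketch in plain Python (function names and the toy dataset are ours):

```python
def predict(row, weights):
    """Step-activation perceptron; weights[0] is the bias."""
    activation = weights[0] + sum(w * x for w, x in zip(weights[1:], row))
    return 1 if activation >= 0 else 0

def train(dataset, lr=0.1, epochs=20):
    """Each row is (features..., label); apply the perceptron update rule."""
    weights = [0.0] * len(dataset[0])
    for _ in range(epochs):
        for row in dataset:
            error = row[-1] - predict(row[:-1], weights)
            weights[0] += lr * error
            weights[1:] = [w + lr * error * x
                           for w, x in zip(weights[1:], row[:-1])]
    return weights

# Tiny linearly separable example: label is 1 iff x > 1
data = [(0.0, 0), (0.5, 0), (1.5, 1), (2.0, 1)]
w = train(data)
print([predict(row[:-1], w) for row in data])  # [0, 0, 1, 1]
```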

# Week 4 (17/02/2020):
- We finished a first version of the perceptron in Python, without nMigen (available [here](https://gricad-gitlab.univ-grenoble-alpes.fr/Projets-INFO4/19-20/19/code/blob/samuel.courthial/perceptron_python.py)).
- We continued developing a perceptron using nMigen (available [here](https://gricad-gitlab.univ-grenoble-alpes.fr/Projets-INFO4/19-20/19/code/blob/samuel.courthial/perceptron.py)). In particular, we trained a linear regression and passed the resulting parameters to the perceptron.
- We worked on a way to load CSV files containing datasets and use them to train and test our perceptron.
- We searched for interesting datasets to test our perceptron (we looked at the [handwritten digits dataset](http://yann.lecun.com/exdb/mnist/) and [red wine quality](https://archive.ics.uci.edu/ml/datasets/wine+quality)).
- We tried to understand the platform file of the FPGA we will use ([Lattice-Ice40 platform file](https://github.com/m-labs/nmigen/blob/master/nmigen/vendor/lattice_ice40.py)).
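
The CSV loading step can be done with the standard library alone. A hedged sketch (the numeric-columns-then-label layout and the function name are our assumptions, not the actual project code):

```python
import csv
import io

def load_dataset(f):
    """Read CSV rows of floats; the last column is an integer class label."""
    rows = []
    for record in csv.reader(f):
        *features, label = record
        rows.append(([float(x) for x in features], int(label)))
    return rows

# In the real project this would be a file handle, e.g. open("dataset.csv")
sample = io.StringIO("7.4,0.7,0\n7.8,0.88,1\n")
print(load_dataset(sample))  # [([7.4, 0.7], 0), ([7.8, 0.88], 1)]
```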

# Week 5.1 (02/03/2020):
- We tried to improve the perceptron (without nMigen), using a training set and a test set, as well as a much more complete dataset.
- We carried on with the development of the perceptron using nMigen; our implementation is almost complete.
- We managed to feed our dataset as input with nMigen.
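
The train/test separation mentioned above can be sketched as a simple shuffled split (the 80/20 ratio and fixed seed are illustrative choices, not project constants):

```python
import random

def split_dataset(rows, test_ratio=0.2, seed=0):
    """Return (train, test) from a shuffled copy of the dataset."""
    rows = rows[:]                      # don't mutate the caller's list
    random.Random(seed).shuffle(rows)   # seeded, hence reproducible
    n_test = int(len(rows) * test_ratio)
    return rows[n_test:], rows[:n_test]

train_set, test_set = split_dataset(list(range(10)))
print(len(train_set), len(test_set))  # 8 2
```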

# Week 5.2 (03/03/2020):
- We finished the perceptrons (with and without nMigen).
- This session was used to set up the FPGA. We tried to create a blinky program to make one of the FPGA's LEDs blink.
- We gathered more information about multilayer perceptrons.

# Week 6 (10/03/2020):
- We added a test to the perceptron (without nMigen). Since this implementation includes training, we can reuse its results in the nMigen implementation.
- We searched for information about multilayer perceptrons (like [this](https://pathmind.com/wiki/multilayer-perceptron), [this](http://deeplearning.net/tutorial/mlp.html) or [this](https://github.com/TioMinho/NeuralNetworks_XIIISAC/blob/master/Part%201%20-%20Representation.ipynb)).
- We began implementing a multilayer perceptron in nmigen.
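
The forward pass of a multilayer perceptron, which is what the nMigen implementation has to reproduce in hardware, can be sketched in plain Python as stacked dense layers (sizes and weights below are illustrative, not trained values):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One dense layer: weights is a list of per-neuron weight lists."""
    return [sigmoid(b + sum(w * i for w, i in zip(ws, inputs)))
            for ws, b in zip(weights, biases)]

def forward(inputs, layers):
    """Feed the inputs through each layer in turn."""
    for weights, biases in layers:
        inputs = layer(inputs, weights, biases)
    return inputs

# 2 inputs -> 2 hidden neurons -> 1 output (weights chosen arbitrarily)
net = [([[0.5, -0.5], [0.3, 0.8]], [0.0, 0.1]),
       ([[1.0, -1.0]], [0.0])]
out = forward([1.0, 0.5], net)
print(out)  # a single sigmoid output in (0, 1)
```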

# Week 7 (17/03/2020)
- We tried to fully understand the concept of layers, and how to implement it with nMigen (with the help of [this](https://www.youtube.com/watch?v=u5GAVdLQyIg) and [this](https://www.youtube.com/watch?v=IlmNhFxre0w) video).
- We continued the implementation of the multilayer perceptron.

# Week 8 (23/03/2020)
- We investigated how to implement handwritten digit recognition using our multilayer perceptron.
- We thought about how we could import and normalize the data found on [this site](http://yann.lecun.com/exdb/mnist/).
- We used the MNIST handwritten digit dataset (imported in [this file](https://gricad-gitlab.univ-grenoble-alpes.fr/Projets-INFO4/19-20/19/code/blob/samuel.courthial/digit.py), then used in [the perceptron without nmigen](https://gricad-gitlab.univ-grenoble-alpes.fr/Projets-INFO4/19-20/19/code/blob/samuel.courthial/perceptron_python.py)).
- We tried to recognize a 2 with our single-layer perceptron (without nMigen).

##### About this:
- The recognition isn't particularly accurate with the classes "is a 2" and "is not a 2", but it is still far better than random guessing (figures given in [perceptron_python.py](https://gricad-gitlab.univ-grenoble-alpes.fr/Projets-INFO4/19-20/19/code/blob/samuel.courthial/perceptron_python.py)).
- The point of continuing to improve the perceptron without nMigen is to produce our own weights, which we can then reuse.
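
The normalization step we had in mind is the usual one for MNIST-style images: raw pixels arrive as bytes in 0-255 and are rescaled to [0, 1]. A tiny sketch (the flat-list representation is our assumption; MNIST images are 28x28):

```python
def normalize(pixels):
    """Map raw 0-255 pixel values to floats in [0, 1]."""
    return [p / 255.0 for p in pixels]

raw = [0, 128, 255]
print(normalize(raw))  # [0.0, 0.501..., 1.0]
```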

# Week 9 (30/03/2020)
- We looked for a way to load neurons (stored as JSON) onto the FPGA, since a large number of neurons may be needed for the final perceptron.
- We began implementing a multilayer perceptron without nMigen (see [mlp_python.py](https://gricad-gitlab.univ-grenoble-alpes.fr/Projets-INFO4/19-20/19/code/blob/samuel.courthial/mlp_python.py)).
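
The JSON idea above amounts to serializing trained neurons so a build script can later bake them into the FPGA design. A hedged sketch with the standard library (the field names `"weights"` and `"bias"` are our choice, not a fixed format):

```python
import json

# Hypothetical trained neurons, each with its weights and bias
neurons = [{"weights": [0.5, -0.25], "bias": 0.1},
           {"weights": [1.0, 0.75], "bias": -0.2}]

text = json.dumps(neurons)   # what the training script would write to a file
loaded = json.loads(text)    # what the nMigen build script would read back
print(loaded[0]["weights"])  # [0.5, -0.25]
```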

# Week 10 (07/04/2020)

# Week 11 (14/04/2020)
- We continued our implementation of an MLP without nMigen, working on backpropagation, as it could be useful for our nMigen MLP. To this end, we followed different tutorials and explanations (such as [this Python tutorial](https://medium.com/machine-learning-algorithms-from-scratch/digit-recognition-from-0-9-using-deep-neural-network-from-scratch-8e6bcf1dbd3) and [this Wikipedia page](https://fr.wikipedia.org/wiki/R%C3%A9tropropagation_du_gradient)).
- We also reused the weights found with the single-layer perceptron without nMigen in the version using nMigen, to check that everything was working correctly.
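
A compact sketch of the backpropagation scheme described in those references, for a small 2-2-1 sigmoid network trained on XOR. All names, sizes and hyperparameters are illustrative; this is not the project's actual code:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(epochs=5000, lr=1.0, seed=1):
    rng = random.Random(seed)
    # 2 inputs -> 2 hidden -> 1 output; each weight list starts with a bias
    w_h = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    w_o = [rng.uniform(-1, 1) for _ in range(3)]
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for _ in range(epochs):
        for x, target in data:
            # forward pass
            h = [sigmoid(ws[0] + ws[1] * x[0] + ws[2] * x[1]) for ws in w_h]
            o = sigmoid(w_o[0] + w_o[1] * h[0] + w_o[2] * h[1])
            # backward pass: delta = error * sigmoid derivative
            d_o = (target - o) * o * (1 - o)
            d_h = [d_o * w_o[i + 1] * h[i] * (1 - h[i]) for i in range(2)]
            # weight updates
            w_o[0] += lr * d_o
            for i in range(2):
                w_o[i + 1] += lr * d_o * h[i]
                w_h[i][0] += lr * d_h[i]
                w_h[i][1] += lr * d_h[i] * x[0]
                w_h[i][2] += lr * d_h[i] * x[1]
    return w_h, w_o

def mlp_predict(w_h, w_o, x):
    h = [sigmoid(ws[0] + ws[1] * x[0] + ws[2] * x[1]) for ws in w_h]
    return sigmoid(w_o[0] + w_o[1] * h[0] + w_o[2] * h[1])

w_h, w_o = train_xor()
print([mlp_predict(w_h, w_o, x) for x in ([0, 0], [0, 1], [1, 0], [1, 1])])
```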

# Week 12 (21/04/2020)
- We began looking for a way to run our perceptron (with nMigen) on our FPGA.
- We faced a problem here: our inputs are hard-coded in the perceptron design rather than coming from the FPGA's physical inputs, so we cannot change the inputs without recompiling everything. We still tried to make our perceptron work this way, but debugging is slow and few tutorials are available online.

# Week 13 (28/04/2020)
- We continued looking for a way to run our perceptron on the FPGA.
- We tried to understand how to use the FPGA's UART pins, and how to connect them over a serial link to feed inputs to our algorithm running on the FPGA.
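
On the host side of that serial link, the inputs would need to be framed as bytes. A hedged sketch of one possible framing (the 2-byte big-endian length header is our assumption; on real hardware pyserial's `Serial.write` would send the resulting buffer):

```python
import struct

def frame_inputs(pixels):
    """Pack a list of 0-255 pixel values into a length-prefixed byte frame."""
    return struct.pack(">H", len(pixels)) + bytes(pixels)

def parse_frame(frame):
    """Inverse of frame_inputs, as the FPGA-side logic would decode it."""
    (n,) = struct.unpack(">H", frame[:2])
    return list(frame[2:2 + n])

frame = frame_inputs([0, 128, 255])
print(parse_frame(frame))  # [0, 128, 255]
```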