Commit e2986ca5 authored by Jean-Luc Parouty

test gitlab/jupyter/markdown bug image

Former-commit-id: 00ba2083
parent 17b383f7
%% Cell type:markdown id: tags:
![Fidle](../fidle/img/00-Fidle-header-01.png)
![Hello World](data:image/png;base64,…) <!-- inline base64 PNG payload elided -->
![Fidle](00-Fidle-header-01.png)
# Deep Neural Network (DNN) - BHPD dataset
<!-- INDEX : Simple regression with a Dense Neural Network (DNN) - BHPD dataset -->
A very simple and classic example of **regression** :
## Objectives :
- Predict **housing prices** from a set of house features.
- Understand the principle and the architecture of a regression with a dense neural network.
The **[Boston Housing Dataset](https://www.cs.toronto.edu/~delve/data/boston/bostonDetail.html)** contains the prices of houses in various places in Boston.
Alongside the price, the dataset also provides information such as the crime rate, the proportion of non-retail business in the town,
the age of the people who own the house and many other attributes...
What we're going to do:
- Retrieve the data
- Prepare the data
- Build a model
- Train the model
- Evaluate the result
%% Cell type:markdown id: tags:
## Step 1 - Import and init
%% Cell type:code id: tags:
``` python
import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import os,sys
from IPython.display import display, Markdown
from importlib import reload
sys.path.append('..')
import fidle.pwk as ooo
ooo.init()
os.makedirs('./run/models', mode=0o750, exist_ok=True)
```
%% Output
FIDLE 2020 - Practical Work Module
Version : 0.2.9
Run time : Tuesday 18 February 2020, 14:42:02
TensorFlow version : 2.0.0
Keras version : 2.2.4-tf
%% Cell type:markdown id: tags:
## Step 2 - Retrieve data
### 2.1 - Option 1 : From Keras
Boston housing is a famous historic dataset, so we can get it directly from [Keras datasets](https://www.tensorflow.org/api_docs/python/tf/keras/datasets)
%% Cell type:raw id: tags:
(x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data(test_split=0.2, seed=113)
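As a quick sanity check (a minimal sketch; it assumes the `load_data()` call above has been executed), we can verify what comes back:

``` python
# Quick check of what keras.datasets.boston_housing.load_data() returned
# (numpy arrays; with test_split=0.2 we expect roughly 404 train / 102 test samples,
#  each described by 13 features)
print('x_train :', x_train.shape, 'y_train :', y_train.shape)
print('x_test  :', x_test.shape,  'y_test  :', y_test.shape)
```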
%% Cell type:markdown id: tags:
### 2.2 - Option 2 : From a csv file
More fun !
%% Cell type:code id: tags:
``` python
data = pd.read_csv('./data/BostonHousing.csv', header=0)
display(data.head(5).style.format("{0:.2f}"))
print('Missing data : ',data.isna().sum().sum(), ' Shape is : ', data.shape)
```
%% Output
Missing data : 0 Shape is : (506, 14)
%% Cell type:markdown id: tags:
## Step 3 - Preparing the data
### 3.1 - Split data
We will use 70% of the data for training and 30% for validation.
x will be the input data and y the expected output.
%% Cell type:code id: tags:
``` python
# ---- Split => train, test
#
data_train = data.sample(frac=0.7, axis=0)
data_test  = data.drop(data_train.index)
# ---- Split => x,y (medv is price)
#
x_train = data_train.drop('medv', axis=1)
y_train = data_train['medv']
x_test  = data_test.drop('medv', axis=1)
y_test  = data_test['medv']
print('Original data shape was : ',data.shape)
print('x_train : ',x_train.shape, 'y_train : ',y_train.shape)
print('x_test  : ',x_test.shape,  'y_test  : ',y_test.shape)
```
%% Output
Original data shape was : (506, 14)
x_train : (354, 13) y_train : (354,)
x_test  : (152, 13) y_test  : (152,)
%% Cell type:markdown id: tags:
### 3.2 - Data normalization
**Note :**
- All input data must be normalized, train and test.
- To do this we will **subtract the mean** and **divide by the standard deviation**.
- But test data should not be used in any way, even for normalization.
- The mean and the standard deviation will therefore only be calculated with the train data.
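Written as a formula, with the mean and standard deviation taken from the train set only:

$$x' = \frac{x - \mu_{train}}{\sigma_{train}}$$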
%% Cell type:code id: tags:
``` python
display(x_train.describe().style.format("{0:.2f}").set_caption("Before normalization :"))
mean = x_train.mean()
std  = x_train.std()
x_train = (x_train - mean) / std
x_test  = (x_test  - mean) / std
display(x_train.describe().style.format("{0:.2f}").set_caption("After normalization :"))
x_train, y_train = np.array(x_train), np.array(y_train)
x_test,  y_test  = np.array(x_test),  np.array(y_test)
```
%% Output
%% Cell type:markdown id: tags:
## Step 4 - Build a model
More information about :
- [Optimizer](https://www.tensorflow.org/api_docs/python/tf/keras/optimizers)
- [Activation](https://www.tensorflow.org/api_docs/python/tf/keras/activations)
- [Loss](https://www.tensorflow.org/api_docs/python/tf/keras/losses)
- [Metrics](https://www.tensorflow.org/api_docs/python/tf/keras/metrics)
%% Cell type:code id: tags:
``` python
def get_model_v1(shape):
    model = keras.models.Sequential()
    model.add(keras.layers.Input(shape, name="InputLayer"))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n1'))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n2'))
    model.add(keras.layers.Dense(1, name='Output'))
    model.compile(optimizer = 'rmsprop',
                  loss      = 'mse',
                  metrics   = ['mae', 'mse'] )
    return model
```
%% Cell type:markdown id: tags:
## Step 5 - Train the model
### 5.1 - Get it
%% Cell type:code id: tags:
``` python
model=get_model_v1( (13,) )
model.summary()
keras.utils.plot_model( model, to_file='./run/model.png', show_shapes=True, show_layer_names=True, dpi=96)
```
%% Output
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
Dense_n1 (Dense)             (None, 64)                896
_________________________________________________________________
Dense_n2 (Dense)             (None, 64)                4160
_________________________________________________________________
Output (Dense)               (None, 1)                 65
=================================================================
Total params: 5,121
Trainable params: 5,121
Non-trainable params: 0
_________________________________________________________________
<IPython.core.display.Image object>
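As a sanity check, the parameter counts above can be recomputed by hand: each Dense layer has (inputs × units + units) parameters, i.e. 13×64+64 = 896, 64×64+64 = 4160 and 64×1+1 = 65, for a total of 5,121.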
%% Cell type:markdown id: tags:
### 5.2 - Train it
%% Cell type:code id: tags:
``` python
history = model.fit(x_train,
                    y_train,
                    epochs          = 100,
                    batch_size      = 10,
                    verbose         = 1,
                    validation_data = (x_test, y_test))
```
%% Output
Train on 354 samples, validate on 152 samples
Epoch 1/100
354/354 [==============================] - 1s 2ms/sample - loss: 414.5603 - mae: 18.2577 - mse: 414.5602 - val_loss: 266.3728 - val_mae: 13.9913 - val_mse: 266.3728
Epoch 2/100
354/354 [==============================] - 0s 190us/sample - loss: 165.4507 - mae: 10.4618 - mse: 165.4507 - val_loss: 74.4125 - val_mae: 6.0372 - val_mse: 74.4125
Epoch 3/100
354/354 [==============================] - 0s 187us/sample - loss: 54.2313 - mae: 5.3763 - mse: 54.2313 - val_loss: 47.0203 - val_mae: 4.7399 - val_mse: 47.0203
Epoch 4/100
354/354 [==============================] - 0s 166us/sample - loss: 32.3303 - mae: 4.2632 - mse: 32.3303 - val_loss: 38.0120 - val_mae: 4.2484 - val_mse: 38.0120
Epoch 5/100
354/354 [==============================] - 0s 153us/sample - loss: 25.3763 - mae: 3.7745 - mse: 25.3763 - val_loss: 32.4707 - val_mae: 3.8465 - val_mse: 32.4707
Epoch 6/100
354/354 [==============================] - 0s 153us/sample - loss: 22.2331 - mae: 3.4720 - mse: 22.2331 - val_loss: 29.6142 - val_mae: 3.4844 - val_mse: 29.6142
Epoch 7/100
354/354 [==============================] - 0s 154us/sample - loss: 19.7834 - mae: 3.2245 - mse: 19.7834 - val_loss: 27.1649 - val_mae: 3.5465 - val_mse: 27.1649
Epoch 8/100
354/354 [==============================] - 0s 155us/sample - loss: 18.0991 - mae: 3.0669 - mse: 18.0991 - val_loss: 26.0093 - val_mae: 3.5617 - val_mse: 26.0093
Epoch 9/100
354/354 [==============================] - 0s 161us/sample - loss: 16.9247 - mae: 2.9184 - mse: 16.9247 - val_loss: 23.2549 - val_mae: 3.3243 - val_mse: 23.2549
Epoch 10/100
354/354 [==============================] - 0s 150us/sample - loss: 16.0827 - mae: 2.8116 - mse: 16.0827 - val_loss: 21.1365 - val_mae: 3.0248 - val_mse: 21.1365
Epoch 11/100
354/354 [==============================] - 0s 170us/sample - loss: 15.0334 - mae: 2.7214 - mse: 15.0334 - val_loss: 20.0163 - val_mae: 2.9800 - val_mse: 20.0163
Epoch 12/100
354/354 [==============================] - 0s 180us/sample - loss: 14.4011 - mae: 2.6949 - mse: 14.4011 - val_loss: 19.8958 - val_mae: 2.9262 - val_mse: 19.8958
Epoch 13/100
354/354 [==============================] - 0s 184us/sample - loss: 13.9168 - mae: 2.5674 - mse: 13.9168 - val_loss: 18.5729 - val_mae: 2.7302 - val_mse: 18.5729
Epoch 14/100
354/354 [==============================] - 0s 161us/sample - loss: 13.5575 - mae: 2.5442 - mse: 13.5575 - val_loss: 17.8812 - val_mae: 2.6748 - val_mse: 17.8812
Epoch 15/100
354/354 [==============================] - 0s 166us/sample - loss: 12.8689 - mae: 2.4779 - mse: 12.8689 - val_loss: 18.9649 - val_mae: 2.7560 - val_mse: 18.9649
Epoch 16/100
354/354 [==============================] - 0s 159us/sample - loss: 12.6470 - mae: 2.4670 - mse: 12.6470 - val_loss: 16.5834 - val_mae: 2.6016 - val_mse: 16.5834
Epoch 17/100
354/354 [==============================] - 0s 159us/sample - loss: 12.3566 - mae: 2.4280 - mse: 12.3566 - val_loss: 16.7371 - val_mae: 2.6670 - val_mse: 16.7371
Epoch 18/100
354/354 [==============================] - 0s 158us/sample - loss: 12.3328 - mae: 2.4060 - mse: 12.3328 - val_loss: 16.3754 - val_mae: 2.6027 - val_mse: 16.3754
Epoch 19/100
354/354 [==============================] - 0s 152us/sample - loss: 11.8357 - mae: 2.3106 - mse: 11.8357 - val_loss: 16.1015 - val_mae: 2.6255 - val_mse: 16.1015
Epoch 20/100
354/354 [==============================] - 0s 163us/sample - loss: 11.6722 - mae: 2.3482 - mse: 11.6722 - val_loss: 16.1405 - val_mae: 2.6889 - val_mse: 16.1405
Epoch 21/100
354/354 [==============================] - 0s 175us/sample - loss: 11.2774 - mae: 2.3344 - mse: 11.2774 - val_loss: 15.2110 - val_mae: 2.5038 - val_mse: 15.2110
Epoch 22/100
354/354 [==============================] - 0s 180us/sample - loss: 11.2491 - mae: 2.3055 - mse: 11.2491 - val_loss: 15.4745 - val_mae: 2.4494 - val_mse: 15.4744
Epoch 23/100
354/354 [==============================] - 0s 187us/sample - loss: 10.9102 - mae: 2.2171 - mse: 10.9102 - val_loss: 15.1145 - val_mae: 2.4282 - val_mse: 15.1145
Epoch 24/100
354/354 [==============================] - 0s 168us/sample - loss: 10.7952 - mae: 2.2533 - mse: 10.7952 - val_loss: 14.3789 - val_mae: 2.3683 - val_mse: 14.3789
Epoch 25/100
354/354 [==============================] - 0s 171us/sample - loss: 10.7250 - mae: 2.2489 - mse: 10.7250 - val_loss: 15.1102 - val_mae: 2.3422 - val_mse: 15.1102
Epoch 26/100
354/354 [==============================] - 0s 158us/sample - loss: 10.4010 - mae: 2.1702 - mse: 10.4010 - val_loss: 14.3260 - val_mae: 2.3176 - val_mse: 14.3260
Epoch 27/100
354/354 [==============================] - 0s 149us/sample - loss: 10.1442 - mae: 2.1797 - mse: 10.1442 - val_loss: 13.6694 - val_mae: 2.3864 - val_mse: 13.6694
Epoch 28/100
354/354 [==============================] - 0s 168us/sample - loss: 10.1391 - mae: 2.1809 - mse: 10.1391 - val_loss: 14.0177 - val_mae: 2.3467 - val_mse: 14.0177
Epoch 29/100
354/354 [==============================] - 0s 149us/sample - loss: 9.9119 - mae: 2.1267 - mse: 9.9119 - val_loss: 14.0739 - val_mae: 2.4617 - val_mse: 14.0739
Epoch 30/100
354/354 [==============================] - 0s 164us/sample - loss: 10.0176 - mae: 2.1669 - mse: 10.0176 - val_loss: 13.5116 - val_mae: 2.3158 - val_mse: 13.5116
Epoch 31/100
354/354 [==============================] - 0s 189us/sample - loss: 9.8259 - mae: 2.1407 - mse: 9.8259 - val_loss: 13.7364 - val_mae: 2.3531 - val_mse: 13.7364
Epoch 32/100
354/354 [==============================] - 0s 178us/sample - loss: 9.4495 - mae: 2.0922 - mse: 9.4495 - val_loss: 14.1936 - val_mae: 2.3887 - val_mse: 14.1936
Epoch 33/100
354/354 [==============================] - 0s 164us/sample - loss: 9.6721 - mae: 2.0870 - mse: 9.6721 - val_loss: 13.4267 - val_mae: 2.3508 - val_mse: 13.4267
Epoch 34/100
354/354 [==============================] - 0s 167us/sample - loss: 9.1042 - mae: 2.0644 - mse: 9.1042 - val_loss: 13.3821 - val_mae: 2.4709 - val_mse: 13.3821
Epoch 35/100
354/354 [==============================] - 0s 155us/sample - loss: 9.0129 - mae: 2.0482 - mse: 9.0129 - val_loss: 14.2184 - val_mae: 2.2754 - val_mse: 14.2184
Epoch 36/100
354/354 [==============================] - 0s 160us/sample - loss: 9.2470 - mae: 2.0661 - mse: 9.2470 - val_loss: 14.3466 - val_mae: 2.5561 - val_mse: 14.3466
Epoch 37/100
354/354 [==============================] - 0s 169us/sample - loss: 9.1695 - mae: 2.0766 - mse: 9.1695 - val_loss: 13.3818 - val_mae: 2.2373 - val_mse: 13.3818
Epoch 38/100
354/354 [==============================] - 0s 165us/sample - loss: 9.1663 - mae: 2.0617 - mse: 9.1663 - val_loss: 14.7461 - val_mae: 2.5061 - val_mse: 14.7461
Epoch 39/100
354/354 [==============================] - 0s 159us/sample - loss: 8.7273 - mae: 2.0208 - mse: 8.7273 - val_loss: 12.5890 - val_mae: 2.3037 - val_mse: 12.5890
Epoch 40/100
354/354 [==============================] - 0s 166us/sample - loss: 8.9038 - mae: 2.0352 - mse: 8.9038 - val_loss: 12.9754 - val_mae: 2.2079 - val_mse: 12.9754
Epoch 41/100
354/354 [==============================] - 0s 153us/sample - loss: 8.6155 - mae: 2.0267 - mse: 8.6155 - val_loss: 13.9239 - val_mae: 2.3525 - val_mse: 13.9239
Epoch 42/100
354/354 [==============================] - 0s 163us/sample - loss: 8.5479 - mae: 2.0170 - mse: 8.5479 - val_loss: 13.6362 - val_mae: 2.2694 - val_mse: 13.6362
Epoch 43/100
354/354 [==============================] - 0s 165us/sample - loss: 8.7087 - mae: 2.0062 - mse: 8.7087 - val_loss: 13.1138 - val_mae: 2.2386 - val_mse: 13.1138
Epoch 44/100
354/354 [==============================] - 0s 160us/sample - loss: 8.3942 - mae: 1.9622 - mse: 8.3942 - val_loss: 12.3461 - val_mae: 2.2337 - val_mse: 12.3461
Epoch 45/100
354/354 [==============================] - 0s 168us/sample - loss: 8.4101 - mae: 2.0098 - mse: 8.4101 - val_loss: 13.2116 - val_mae: 2.2682 - val_mse: 13.2116
Epoch 46/100
354/354 [==============================] - 0s 156us/sample - loss: 8.3264 - mae: 1.9483 - mse: 8.3264 - val_loss: 12.5519 - val_mae: 2.4063 - val_mse: 12.5519
Epoch 47/100
354/354 [==============================] - 0s 158us/sample - loss: 8.1445 - mae: 1.9549 - mse: 8.1445 - val_loss: 12.1838 - val_mae: 2.2591 - val_mse: 12.1838
Epoch 48/100
354/354 [==============================] - 0s 156us/sample - loss: 8.0389 - mae: 1.9304 - mse: 8.0389 - val_loss: 12.6978 - val_mae: 2.1907 - val_mse: 12.6978
Epoch 49/100
354/354 [==============================] - 0s 164us/sample - loss: 8.0705 - mae: 1.9493 - mse: 8.0705 - val_loss: 12.4833 - val_mae: 2.4720 - val_mse: 12.4833
Epoch 50/100
354/354 [==============================] - 0s 158us/sample - loss: 8.1872 - mae: 1.9630 - mse: 8.1872 - val_loss: 12.0043 - val_mae: 2.2610 - val_mse: 12.0043
Epoch 51/100
354/354 [==============================] - 0s 158us/sample - loss: 8.0357 - mae: 1.8946 - mse: 8.0357 - val_loss: 11.3982 - val_mae: 2.1770 - val_mse: 11.3982
Epoch 52/100
354/354 [==============================] - 0s 162us/sample - loss: 7.6882 - mae: 1.8951 - mse: 7.6882 - val_loss: 13.0714 - val_mae: 2.4109 - val_mse: 13.0714
Epoch 53/100
354/354 [==============================] - 0s 162us/sample - loss: 7.9639 - mae: 1.9103 - mse: 7.9639 - val_loss: 12.4297 - val_mae: 2.2996 - val_mse: 12.4297
Epoch 54/100
354/354 [==============================] - 0s 183us/sample - loss: 7.7929 - mae: 1.8971 - mse: 7.7929 - val_loss: 11.9751 - val_mae: 2.2491 - val_mse: 11.9751
Epoch 55/100
354/354 [==============================] - 0s 185us/sample - loss: 7.4411 - mae: 1.8631 - mse: 7.4411 - val_loss: 11.3761 - val_mae: 2.3416 - val_mse: 11.3761
Epoch 56/100
354/354 [==============================] - 0s 186us/sample - loss: 7.6105 - mae: 1.9111 - mse: 7.6105 - val_loss: 12.4939 - val_mae: 2.4095 - val_mse: 12.4939
Epoch 57/100
354/354 [==============================] - 0s 190us/sample - loss: 7.5013 - mae: 1.9146 - mse: 7.5013 - val_loss: 11.6668 - val_mae: 2.1468 - val_mse: 11.6668
Epoch 58/100
354/354 [==============================] - 0s 195us/sample - loss: 7.4096 - mae: 1.8515 - mse: 7.4096 - val_loss: 13.8000 - val_mae: 2.5222 - val_mse: 13.8000
Epoch 59/100
354/354 [==============================] - 0s 180us/sample - loss: 7.2263 - mae: 1.8241 - mse: 7.2263 - val_loss: 10.8964 - val_mae: 2.2130 - val_mse: 10.8964
Epoch 60/100
354/354 [==============================] - 0s 161us/sample - loss: 7.1773 - mae: 1.8526 - mse: 7.1773 - val_loss: 10.7862 - val_mae: 2.1088 - val_mse: 10.7862
Epoch 61/100
354/354 [==============================] - 0s 165us/sample - loss: 7.0812 - mae: 1.8308 - mse: 7.0812 - val_loss: 10.8147 - val_mae: 2.3209 - val_mse: 10.8147
Epoch 62/100
354/354 [==============================] - 0s 155us/sample - loss: 7.2235 - mae: 1.8367 - mse: 7.2235 - val_loss: 11.0399 - val_mae: 2.2583 - val_mse: 11.0399
Epoch 63/100
354/354 [==============================] - 0s 155us/sample - loss: 7.0341 - mae: 1.8172 - mse: 7.0341 - val_loss: 10.9894 - val_mae: 2.1429 - val_mse: 10.9894
Epoch 64/100
354/354 [==============================] - 0s 157us/sample - loss: 6.8729 - mae: 1.7492 - mse: 6.8729 - val_loss: 10.5465 - val_mae: 2.1532 - val_mse: 10.5465
Epoch 65/100
354/354 [==============================] - 0s 164us/sample - loss: 6.9345 - mae: 1.7837 - mse: 6.9345 - val_loss: 11.5379 - val_mae: 2.1963 - val_mse: 11.5379
Epoch 66/100
354/354 [==============================] - 0s 166us/sample - loss: 6.8218 - mae: 1.7714 - mse: 6.8218 - val_loss: 10.1486 - val_mae: 2.1617 - val_mse: 10.1486
Epoch 67/100
354/354 [==============================] - 0s 157us/sample - loss: 6.8711 - mae: 1.8045 - mse: 6.8711 - val_loss: 10.3196 - val_mae: 2.2297 - val_mse: 10.3196
Epoch 68/100
354/354 [==============================] - 0s 162us/sample - loss: 6.7281 - mae: 1.7762 - mse: 6.7281 - val_loss: 11.2361 - val_mae: 2.2046 - val_mse: 11.2361
Epoch 69/100
354/354 [==============================] - 0s 158us/sample - loss: 6.5518 - mae: 1.7292 - mse: 6.5518 - val_loss: 10.2378 - val_mae: 2.1494 - val_mse: 10.2378
Epoch 70/100
354/354 [==============================] - 0s 161us/sample - loss: 6.6489 - mae: 1.7383 - mse: 6.6489 - val_loss: 11.1613 - val_mae: 2.2212 - val_mse: 11.1613
Epoch 71/100
354/354 [==============================] - 0s 176us/sample - loss: 6.5827 - mae: 1.7564 - mse: 6.5827 - val_loss: 10.0177 - val_mae: 2.2440 - val_mse: 10.0177
Epoch 72/100
354/354 [==============================] - 0s 168us/sample - loss: 6.3411 - mae: 1.7463 - mse: 6.3411 - val_loss: 10.7929 - val_mae: 2.1946 - val_mse: 10.7929
Epoch 73/100
354/354 [==============================] - 0s 163us/sample - loss: 6.3621 - mae: 1.7466 - mse: 6.3621 - val_loss: 9.7344 - val_mae: 2.1441 - val_mse: 9.7344
Epoch 74/100
354/354 [==============================] - 0s 158us/sample - loss: 6.2298 - mae: 1.7411 - mse: 6.2298 - val_loss: 11.2495 - val_mae: 2.1948 - val_mse: 11.2495
Epoch 75/100
354/354 [==============================] - 0s 159us/sample - loss: 6.3037 - mae: 1.7169 - mse: 6.3037 - val_loss: 10.1339 - val_mae: 2.1716 - val_mse: 10.1339
Epoch 76/100
354/354 [==============================] - 0s 158us/sample - loss: 6.0780 - mae: 1.6686 - mse: 6.0780 - val_loss: 11.9975 - val_mae: 2.3317 - val_mse: 11.9975
Epoch 77/100
354/354 [==============================] - 0s 165us/sample - loss: 6.3311 - mae: 1.7082 - mse: 6.3311 - val_loss: 11.6433 - val_mae: 2.2756 - val_mse: 11.6433
Epoch 78/100
354/354 [==============================] - 0s 155us/sample - loss: 6.0620 - mae: 1.6765 - mse: 6.0620 - val_loss: 13.0159 - val_mae: 2.5073 - val_mse: 13.0159
Epoch 79/100
354/354 [==============================] - 0s 167us/sample - loss: 6.1819 - mae: 1.7157 - mse: 6.1819 - val_loss: 10.1000 - val_mae: 2.1462 - val_mse: 10.1000
Epoch 80/100
354/354 [==============================] - 0s 158us/sample - loss: 5.9085 - mae: 1.6720 - mse: 5.9085 - val_loss: 11.7867 - val_mae: 2.5045 - val_mse: 11.7866
Epoch 81/100
354/354 [==============================] - 0s 168us/sample - loss: 6.0201 - mae: 1.6678 - mse: 6.0201 - val_loss: 10.8789 - val_mae: 2.3031 - val_mse: 10.8789
Epoch 82/100
354/354 [==============================] - 0s 159us/sample - loss: 6.1278 - mae: 1.6799 - mse: 6.1278 - val_loss: 9.8114 - val_mae: 2.1048 - val_mse: 9.8114
Epoch 83/100
354/354 [==============================] - 0s 150us/sample - loss: 5.6372 - mae: 1.6280 - mse: 5.6372 - val_loss: 10.0971 - val_mae: 2.1464 - val_mse: 10.0971
Epoch 84/100
354/354 [==============================] - 0s 153us/sample - loss: 5.9587 - mae: 1.6421 - mse: 5.9587 - val_loss: 9.4731 - val_mae: 2.1915 - val_mse: 9.4731
Epoch 85/100
354/354 [==============================] - 0s 158us/sample - loss: 5.6189 - mae: 1.6223 - mse: 5.6189 - val_loss: 9.9788 - val_mae: 2.3332 - val_mse: 9.9788
Epoch 86/100
354/354 [==============================] - 0s 158us/sample - loss: 5.8193 - mae: 1.6930 - mse: 5.8193 - val_loss: 10.4070 - val_mae: 2.1490 - val_mse: 10.4070
Epoch 87/100
354/354 [==============================] - 0s 155us/sample - loss: 5.5919 - mae: 1.6152 - mse: 5.5919 - val_loss: 9.9985 - val_mae: 2.2546 - val_mse: 9.9985
Epoch 88/100
354/354 [==============================] - 0s 160us/sample - loss: 5.6652 - mae: 1.6246 - mse: 5.6652 - val_loss: 9.1506 - val_mae: 2.0642 - val_mse: 9.1506
Epoch 89/100
354/354 [==============================] - 0s 157us/sample - loss: 5.6349 - mae: 1.6108 - mse: 5.6349 - val_loss: 9.8522 - val_mae: 2.0813 - val_mse: 9.8522
Epoch 90/100
354/354 [==============================] - 0s 159us/sample - loss: 5.6165 - mae: 1.6449 - mse: 5.6165 - val_loss: 9.1553 - val_mae: 2.0421 - val_mse: 9.1553
Epoch 91/100
354/354 [==============================] - 0s 161us/sample - loss: 5.5416 - mae: 1.6153 - mse: 5.5416 - val_loss: 10.4231 - val_mae: 2.2880 - val_mse: 10.4231
Epoch 92/100
354/354 [==============================] - 0s 158us/sample - loss: 5.3909 - mae: 1.5863 - mse: 5.3909 - val_loss: 8.8087 - val_mae: 2.1022 - val_mse: 8.8087
Epoch 93/100
354/354 [==============================] - 0s 155us/sample - loss: 5.3540 - mae: 1.5986 - mse: 5.3540 - val_loss: 9.6963 - val_mae: 2.1931 - val_mse: 9.6963
Epoch 94/100
354/354 [==============================] - 0s 161us/sample - loss: 5.3198 - mae: 1.6074 - mse: 5.3198 - val_loss: 9.1875 - val_mae: 2.1917 - val_mse: 9.1875
Epoch 95/100
354/354 [==============================] - 0s 165us/sample - loss: 5.2299 - mae: 1.5638 - mse: 5.2299 - val_loss: 8.8746 - val_mae: 2.1273 - val_mse: 8.8746
Epoch 96/100
354/354 [==============================] - 0s 163us/sample - loss: 5.2789 - mae: 1.5651 - mse: 5.2789 - val_loss: 9.7351 - val_mae: 2.2359 - val_mse: 9.7351
Epoch 97/100
354/354 [==============================] - 0s 153us/sample - loss: 5.3399 - mae: 1.6002 - mse: 5.3399 - val_loss: 9.7185 - val_mae: 2.1080 - val_mse: 9.7185
Epoch 98/100
354/354 [==============================] - 0s 159us/sample - loss: 5.0072 - mae: 1.5055 - mse: 5.0072 - val_loss: 8.3621 - val_mae: 2.0586 - val_mse: 8.3621
Epoch 99/100
354/354 [==============================] - 0s 156us/sample - loss: 5.2596 - mae: 1.5557 - mse: 5.2596 - val_loss: 8.6406 - val_mae: 2.0527 - val_mse: 8.6406
Epoch 100/100
354/354 [==============================] - 0s 159us/sample - loss: 5.0983 - mae: 1.5543 - mse: 5.0983 - val_loss: 8.4836 - val_mae: 2.0234 - val_mse: 8.4836
%% Cell type:markdown id: tags:
## Step 6 - Evaluate
### 6.1 - Model evaluation
MAE = Mean Absolute Error (between the labels and the predictions).
An MAE of 3 means the predictions are off by $3,000 on average (prices are expressed in k$).
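For intuition, here is a minimal sketch of how the MAE is computed by hand, using made-up numbers rather than the notebook's data:
``` python
import numpy as np

# Hypothetical prices and predictions, in k$ (illustration only)
y_true = np.array([21.0, 18.5, 30.2])
y_pred = np.array([23.5, 17.0, 28.0])

mae = np.mean(np.abs(y_true - y_pred))   # mean of the absolute errors
print("MAE : {:.2f} k$".format(mae))     # -> 2.07 k$, i.e. about $2,070 on average
```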
%% Cell type:code id: tags:
``` python
# evaluate() returns the metrics in compile order: [loss, mae, mse]
score = model.evaluate(x_test, y_test, verbose=0)
print('x_test / loss : {:5.4f}'.format(score[0]))
print('x_test / mae : {:5.4f}'.format(score[1]))
print('x_test / mse : {:5.4f}'.format(score[2]))
```
%% Output
x_test / loss : 8.4836
x_test / mae : 2.0234
x_test / mse : 8.4836
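Rather than hard-coding the indices of `score`, a slightly safer variant (sketched below) pairs each value with its name via `model.metrics_names`:
``` python
# Sketch: name each score instead of relying on positional indices
for name, value in zip(model.metrics_names, score):
    print('x_test / {:4s} : {:5.4f}'.format(name, value))
```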
%% Cell type:markdown id: tags:
### 6.2 - Training history
What was the best result during our training?
%% Cell type:code id: tags:
``` python
df = pd.DataFrame(data=history.history)   # one row per epoch
df.describe()
```
%% Output
             loss         mae         mse    val_loss     val_mae     val_mse
count  100.000000  100.000000  100.000000  100.000000  100.000000  100.000000
mean    15.144930    2.312168   15.144930   17.019036    2.582618   17.019036
std     43.707091    1.906713   43.707090   26.587745    1.288267   26.587746
min      5.007155    1.505515    5.007155    8.362053    2.023406    8.362053
25%      6.285225    1.716563    6.285225   10.419040    2.192718   10.419040
50%      8.037316    1.922454    8.037317   12.488579    2.301342   12.488580
75%     10.482029    2.189933   10.482029   14.470699    2.503943   14.470701
max    414.560260   18.257650  414.560242  266.372801   13.991282  266.372803
%% Cell type:code id: tags:
``` python
print("min( val_mae ) : {:.4f}".format( min(history.history["val_mae"]) ) )
```
%% Output
min( val_mae ) : 2.0234
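To know which epoch produced that best value, a short sketch using the DataFrame built above (idxmin returns the 0-based row index, hence the +1):
``` python
# Sketch: locate the epoch with the lowest validation MAE
best = df['val_mae'].idxmin()            # 0-based row index
print("Best epoch : {} (val_mae = {:.4f})".format(best + 1, df['val_mae'][best]))
```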
%% Cell type:code id: tags:
``` python
ooo.plot_history(history, plot={'MSE' :['mse', 'val_mse'],
                                'MAE' :['mae', 'val_mae'],
                                'LOSS':['loss','val_loss']})
```
%% Output
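The plot above relies on the fidle helper module `ooo`. In case it is not available, the sketch below draws equivalent curves with plain matplotlib, using only the standard `history.history` dict returned by `model.fit()`:
``` python
import matplotlib.pyplot as plt

# Sketch: one figure per metric, training vs validation curves
for title, keys in {'MSE':  ['mse',  'val_mse' ],
                    'MAE':  ['mae',  'val_mae' ],
                    'LOSS': ['loss', 'val_loss']}.items():
    plt.figure(figsize=(8, 4))
    for k in keys:
        plt.plot(history.history[k], label=k)
    plt.title(title)
    plt.xlabel('Epoch')
    plt.legend()
    plt.show()
```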
%% Cell type:markdown id: tags:
## Step 7 - Make a prediction
%% Cell type:code id: tags:
``` python
# One sample: 13 input features, normalized with the same scaling as the training set
my_data = [ 1.26425925, -0.48522739,  1.0436489 , -0.23112788,  1.37120745,
           -2.14308942,  1.13489104, -1.06802005,  1.71189006,  1.57042287,
            0.77859951,  0.14769795,  2.7585581 ]
real_price = 10.4                             # known target price, in k$
my_data = np.array(my_data).reshape(1, 13)    # a batch of one sample
```
%% Cell type:code id: tags:
``` python
predictions = model.predict( my_data )
print("Prediction : {:.2f} K$".format(predictions[0][0]))
print("Reality : {:.2f} K$".format(real_price))
```
%% Output
Prediction : 11.59 K$
Reality : 10.40 K$
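To check the model beyond a single hand-picked sample, a quick sketch comparing predictions with the true prices on a few test samples (`x_test` and `y_test` come from the evaluation step above):
``` python
# Sketch: predictions vs. ground truth on the first 5 test samples
preds = model.predict(x_test[:5])
for pred, real in zip(preds[:, 0], y_test[:5]):
    print("Predicted : {:6.2f} K$   Real : {:6.2f} K$   Error : {:+.2f} K$".format(pred, real, pred - real))
```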
%% Cell type:markdown id: tags:
---
![](../fidle/img/00-Fidle-logo-01_s.png)