first commit
commit c7a7f46865
BIN docs/images/attribition_mask.png (new file, 543 KiB)
BIN docs/images/dataset_overview.png (new file, 50 KiB)
BIN docs/images/model_evaluation.png (new file, 96 KiB)
BIN docs/images/prediction.png (new file, 853 KiB)
venv/README.md (new file, 90 lines)
@@ -0,0 +1,90 @@
# Tensorflow Grapevine Disease Detection

This document outlines the development of a mobile application that uses a deep-learning model to detect diseases on grapevines.

## Dataset

The data used in this study is split into training, validation, and testing sets, ensuring a robust evaluation of the model's performance. The dataset consists of 9027 images of three diseases commonly found on grapevines: **Black Rot**, **ESCA**, and **Leaf Blight**, balanced with equal representation across the classes. Images are in .jpeg format with dimensions of 256x256 pixels.



## Model Structure

Our model is a Convolutional Neural Network (CNN) built with the Keras API on a TensorFlow backend. It includes several convolutional layers, each followed by batch normalization and a ReLU activation, with max pooling for downsampling. Dropout layers provide regularization to prevent overfitting. The architecture details and parameters are as follows:

| Layer (type)              | Output Shape         | Param # |
|---------------------------|----------------------|---------|
| sequential                | (None, 224, 224, 3)  | 0       |
| conv2d                    | (None, 224, 224, 32) | 896     |
| batch_normalization       | (None, 224, 224, 32) | 128     |
| conv2d_1                  | (None, 224, 224, 32) | 9248    |
| batch_normalization_1     | (None, 224, 224, 32) | 128     |
| max_pooling2d             | (None, 112, 112, 32) | 0       |
| dropout                   | (None, 112, 112, 32) | 0       |
| conv2d_2                  | (None, 112, 112, 64) | 18496   |
| batch_normalization_2     | (None, 112, 112, 64) | 256     |
| conv2d_3                  | (None, 112, 112, 64) | 36864   |
| batch_normalization_3     | (None, 112, 112, 64) | 256     |
| max_pooling2d_1           | (None, 56, 56, 64)   | 0       |
| dropout_1                 | (None, 56, 56, 64)   | 0       |
| conv2d_4                  | (None, 56, 56, 128)  | 73728   |
| batch_normalization_4     | (None, 56, 56, 128)  | 512     |
| conv2d_5                  | (None, 56, 56, 128)  | 147584  |
| batch_normalization_5     | (None, 56, 56, 128)  | 512     |
| max_pooling2d_2           | (None, 28, 28, 128)  | 0       |
| dropout_2                 | (None, 28, 28, 128)  | 0       |
| conv2d_6                  | (None, 28, 28, 256)  | 294912  |
| batch_normalization_6     | (None, 28, 28, 256)  | 1024    |
| conv2d_7                  | (None, 28, 28, 256)  | 590080  |
| batch_normalization_7     | (None, 28, 28, 256)  | 1024    |
| max_pooling2d_3           | (None, 14, 14, 256)  | 0       |
| dropout_3                 | (None, 14, 14, 256)  | 0       |
| global_average_pooling2d  | (None, 256)          | 0       |
| dense                     | (None, 256)          | 65792   |
| batch_normalization_8     | (None, 256)          | 1024    |
| dropout_4                 | (None, 256)          | 0       |
| dense_1                   | (None, 128)          | 32768   |
| batch_normalization_9     | (None, 128)          | 512     |
| dropout_5                 | (None, 128)          | 0       |
| dense_2                   | (None, 4)            | 516     |

Total params: 3,825,134 (14.59 MB)
Trainable params: 1,274,148 (4.86 MB)
Non-trainable params: 2,688 (10.50 KB)
Optimizer params: 2,548,298 (9.72 MB)
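
As a sanity check, the per-layer counts in the table can be reproduced with the standard Keras parameter formulas. A minimal sketch; the `use_bias` flags below are inferred from the reported numbers, not taken from the training code:

```python
# Reproduce the parameter counts from the model summary above.
# Conv2D params: k*k*c_in*c_out (+ c_out if bias); BatchNorm params: 4*c.

def conv2d_params(k, c_in, c_out, bias=True):
    # k*k kernel over c_in input channels producing c_out filters
    return k * k * c_in * c_out + (c_out if bias else 0)

def dense_params(n_in, n_out, bias=True):
    return n_in * n_out + (n_out if bias else 0)

def batchnorm_params(c):
    # gamma + beta (trainable) and moving mean + variance (non-trainable)
    return 4 * c

assert conv2d_params(3, 3, 32) == 896                  # conv2d
assert conv2d_params(3, 32, 32) == 9248                # conv2d_1
assert conv2d_params(3, 64, 64, bias=False) == 36864   # conv2d_3
assert conv2d_params(3, 128, 128) == 147584            # conv2d_5
assert conv2d_params(3, 256, 256) == 590080            # conv2d_7
assert dense_params(256, 256) == 65792                 # dense
assert dense_params(128, 4) == 516                     # dense_2
assert batchnorm_params(256) == 1024                   # batch_normalization_6

# Non-trainable params are the moving statistics (2 per channel)
# of the ten BatchNormalization layers:
bn_channels = [32, 32, 64, 64, 128, 128, 256, 256, 256, 128]
assert 2 * sum(bn_channels) == 2688  # matches "Non-trainable params: 2,688"
```

The check on `bn_channels` confirms that the reported 2,688 non-trainable parameters are exactly the BatchNormalization moving statistics.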


## Training Details

Training was done with a batch size of 32 over 100 epochs. Data augmentation includes horizontal/vertical flips (RandomFlip), rotation (RandomRotation), zooming (RandomZoom), and rescaling (Rescaling). Pixel values are normalized to the range [0, 1] after loading.
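
The augmentation stack above can be sketched with Keras preprocessing layers. A minimal sketch; the 0.1 rotation/zoom factors are illustrative assumptions, not the values used in training:

```python
import tensorflow as tf

# Augmentation pipeline matching the layers named above; the factor
# values (0.1) are illustrative assumptions, not the trained settings.
data_augmentation = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal_and_vertical"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
    tf.keras.layers.Rescaling(1.0 / 255),  # map pixel values into [0, 1]
])

# Apply to a dummy batch shaped like the 256x256 RGB dataset images.
batch = tf.random.uniform((4, 256, 256, 3), maxval=255.0)
out = data_augmentation(batch, training=True)
```

The random layers are only active with `training=True`; at inference time only the `Rescaling` step is applied.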


## Results

Our best model reaches an average accuracy of roughly 30% on the validation set, which suggests potential overfitting towards the **ESCA** class. However, the model can identify key features that distinguish the classes, such as marks on the leaves (fig. 4).




### Prediction Example


### Attribution Mask

The attribution mask offers insight into the features the model has learned to extract from each image (figure 4). This can help guide future work on improving disease detection and understanding how the model identifies key features for accurate classification.
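
Attribution masks of this kind are typically computed with Integrated Gradients (see the interpretability tutorial in the resources below). A minimal pure-NumPy sketch on a toy function, assuming a zero baseline, illustrating the Riemann-sum approximation and the completeness property (attributions sum to f(x) − f(baseline)):

```python
import numpy as np

def integrated_gradients(grad_f, x, baseline=None, steps=200):
    """Riemann-sum (midpoint rule) approximation of Integrated Gradients:
    IG_i = (x_i - b_i) * mean over alpha of df/dx_i at b + alpha*(x - b)."""
    b = np.zeros_like(x) if baseline is None else baseline
    alphas = (np.arange(steps) + 0.5) / steps
    grads = np.stack([grad_f(b + a * (x - b)) for a in alphas])
    return (x - b) * grads.mean(axis=0)

# Toy "model": f(x) = sum(x^2), with analytic gradient 2x.
f = lambda x: float(np.sum(x ** 2))
grad_f = lambda x: 2.0 * x

x = np.array([1.0, -2.0, 3.0])
ig = integrated_gradients(grad_f, x)

# Completeness: attributions sum to f(x) - f(baseline).
assert abs(ig.sum() - (f(x) - f(np.zeros_like(x)))) < 1e-6
```

For a real image model the analytic gradient is replaced by the network's input gradients (e.g. via `tf.GradientTape`), and the per-pixel attributions are rendered as the mask shown in figure 4.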


### Resources

- https://www.tensorflow.org/tutorials/images/classification?hl=en
- https://www.tensorflow.org/lite/convert?hl=en
- https://www.tensorflow.org/tutorials/interpretability/integrated_gradients?hl=en

AI(s): deepseek-coder:6.7b | deepseek-r1:8b
BIN venv/models/2026-03-14_21:21:01.562003/model.keras (new file)
BIN venv/models/2026-03-14_21:21:01.562003/model.tflite (new file)
venv/models/2026-03-14_21:21:01.562003/training_history.csv (new file, 101 lines)
@@ -0,0 +1,101 @@
epoch,accuracy,val_accuracy,loss,val_loss
1,0.2874999940395355,0.22083333134651184,1.3928883075714111,1.3870069980621338
2,0.2562499940395355,0.22083333134651184,1.386470079421997,1.3864997625350952
3,0.2562499940395355,0.22083333134651184,1.3864995241165161,1.386467695236206
4,0.24375000596046448,0.22083333134651184,1.3865721225738525,1.3866424560546875
5,0.24375000596046448,0.22083333134651184,1.3863112926483154,1.3872629404067993
6,0.23125000298023224,0.22083333134651184,1.3868341445922852,1.3871817588806152
7,0.29374998807907104,0.21250000596046448,1.3859353065490723,1.387146234512329
8,0.265625,0.21250000596046448,1.3863201141357422,1.3872051239013672
9,0.20937499403953552,0.21250000596046448,1.3869106769561768,1.3870598077774048
10,0.25,0.21250000596046448,1.386128306388855,1.3869531154632568
11,0.234375,0.24583333730697632,1.3867542743682861,1.3865885734558105
12,0.24687500298023224,0.24583333730697632,1.3860838413238525,1.3861920833587646
13,0.26249998807907104,0.24583333730697632,1.3860362768173218,1.3861783742904663
14,0.2874999940395355,0.24583333730697632,1.3862178325653076,1.3861534595489502
15,0.19374999403953552,0.24583333730697632,1.3873660564422607,1.3863065242767334
16,0.26249998807907104,0.24583333730697632,1.3861452341079712,1.386446475982666
17,0.24062499403953552,0.24583333730697632,1.3866946697235107,1.386178970336914
18,0.22812500596046448,0.24583333730697632,1.3868298530578613,1.386321783065796
19,0.23125000298023224,0.24583333730697632,1.3862159252166748,1.3865718841552734
20,0.23125000298023224,0.21250000596046448,1.3862392902374268,1.3868924379348755
21,0.23125000298023224,0.21250000596046448,1.3866623640060425,1.3870328664779663
22,0.2593750059604645,0.24583333730697632,1.386700987815857,1.3867841958999634
23,0.234375,0.24583333730697632,1.3866440057754517,1.3865004777908325
24,0.25312501192092896,0.24583333730697632,1.386134147644043,1.386633038520813
25,0.24687500298023224,0.24583333730697632,1.3861768245697021,1.3871376514434814
26,0.25312501192092896,0.21250000596046448,1.3862738609313965,1.3873473405838013
27,0.2562499940395355,0.21250000596046448,1.385794997215271,1.3877582550048828
28,0.23749999701976776,0.24583333730697632,1.3863804340362549,1.3877415657043457
29,0.25,0.24583333730697632,1.3865516185760498,1.3876038789749146
30,0.29374998807907104,0.24583333730697632,1.385347604751587,1.3876357078552246
31,0.29374998807907104,0.24583333730697632,1.3854305744171143,1.3874577283859253
32,0.24062499403953552,0.24583333730697632,1.3864048719406128,1.387599229812622
33,0.2593750059604645,0.24583333730697632,1.3856818675994873,1.3879826068878174
34,0.296875,0.24583333730697632,1.3849812746047974,1.3882523775100708
35,0.21562500298023224,0.24583333730697632,1.3888026475906372,1.3879179954528809
36,0.23749999701976776,0.24583333730697632,1.3865363597869873,1.3871872425079346
37,0.234375,0.24583333730697632,1.3875354528427124,1.3863681554794312
38,0.21875,0.24583333730697632,1.3855953216552734,1.3861123323440552
39,0.265625,0.24583333730697632,1.3871783018112183,1.3861643075942993
40,0.29374998807907104,0.24583333730697632,1.3863810300827026,1.3861408233642578
41,0.22187499701976776,0.24583333730697632,1.3870265483856201,1.3859524726867676
42,0.21250000596046448,0.24583333730697632,1.3879438638687134,1.3861037492752075
43,0.296875,0.24583333730697632,1.3858661651611328,1.3863002061843872
44,0.24687500298023224,0.24583333730697632,1.3861980438232422,1.386702299118042
45,0.22812500596046448,0.24583333730697632,1.3865344524383545,1.3868530988693237
46,0.25312501192092896,0.24583333730697632,1.386344313621521,1.386861801147461
47,0.22812500596046448,0.21250000596046448,1.3870619535446167,1.3869612216949463
48,0.234375,0.21250000596046448,1.3867980241775513,1.3870489597320557
49,0.24375000596046448,0.21250000596046448,1.3865089416503906,1.3869459629058838
50,0.24687500298023224,0.21250000596046448,1.3859565258026123,1.3871378898620605
51,0.26875001192092896,0.21250000596046448,1.385866403579712,1.3876111507415771
52,0.24062499403953552,0.22083333134651184,1.3859233856201172,1.3879282474517822
53,0.2593750059604645,0.22083333134651184,1.386330008506775,1.3880022764205933
54,0.20937499403953552,0.22083333134651184,1.3867876529693604,1.3879626989364624
55,0.20000000298023224,0.21250000596046448,1.386905550956726,1.3877875804901123
56,0.2562499940395355,0.21250000596046448,1.3864434957504272,1.3877772092819214
57,0.234375,0.21250000596046448,1.3866490125656128,1.3876440525054932
58,0.26875001192092896,0.21250000596046448,1.386214017868042,1.387592077255249
59,0.22499999403953552,0.21250000596046448,1.3862558603286743,1.3877002000808716
60,0.26249998807907104,0.21250000596046448,1.3864154815673828,1.3876866102218628
61,0.24375000596046448,0.21250000596046448,1.3860387802124023,1.3879073858261108
62,0.2906250059604645,0.21250000596046448,1.3862212896347046,1.3879992961883545
63,0.2750000059604645,0.21250000596046448,1.3860191106796265,1.3880730867385864
64,0.26875001192092896,0.21250000596046448,1.3854824304580688,1.38829505443573
65,0.23125000298023224,0.21250000596046448,1.3873180150985718,1.388282299041748
66,0.22812500596046448,0.21250000596046448,1.3856871128082275,1.3882330656051636
67,0.28125,0.21250000596046448,1.3856661319732666,1.3886258602142334
68,0.25,0.21250000596046448,1.387736439704895,1.3884905576705933
69,0.234375,0.21250000596046448,1.3867969512939453,1.3881773948669434
70,0.24062499403953552,0.21250000596046448,1.3866196870803833,1.3879505395889282
71,0.234375,0.21250000596046448,1.3875775337219238,1.3875434398651123
72,0.22499999403953552,0.21250000596046448,1.3867005109786987,1.3870617151260376
73,0.24375000596046448,0.21250000596046448,1.3864340782165527,1.3867781162261963
74,0.21250000596046448,0.21250000596046448,1.3865838050842285,1.3863834142684937
75,0.23125000298023224,0.32083332538604736,1.3864120244979858,1.386002779006958
76,0.2593750059604645,0.32083332538604736,1.38605535030365,1.3858983516693115
77,0.203125,0.22083333134651184,1.3867683410644531,1.3858402967453003
78,0.23125000298023224,0.22083333134651184,1.3865760564804077,1.3860074281692505
79,0.22812500596046448,0.22083333134651184,1.3865236043930054,1.3861191272735596
80,0.21250000596046448,0.22083333134651184,1.3865149021148682,1.3863435983657837
81,0.24687500298023224,0.24583333730697632,1.3862627744674683,1.386601209640503
82,0.24062499403953552,0.24583333730697632,1.38639497756958,1.38667893409729
83,0.25312501192092896,0.24583333730697632,1.3862838745117188,1.3868708610534668
84,0.265625,0.24583333730697632,1.385940432548523,1.3870657682418823
85,0.23125000298023224,0.24583333730697632,1.3863862752914429,1.3870974779129028
86,0.24375000596046448,0.21250000596046448,1.386191725730896,1.3873549699783325
87,0.2750000059604645,0.21250000596046448,1.3859028816223145,1.3875572681427002
88,0.24687500298023224,0.21250000596046448,1.3861364126205444,1.3876681327819824
89,0.24687500298023224,0.21250000596046448,1.3861520290374756,1.3877639770507812
90,0.2750000059604645,0.21250000596046448,1.3857581615447998,1.3882124423980713
91,0.2593750059604645,0.21250000596046448,1.3867679834365845,1.388411045074463
92,0.234375,0.21250000596046448,1.3867148160934448,1.3881914615631104
93,0.25312501192092896,0.21250000596046448,1.3864551782608032,1.3878953456878662
94,0.22499999403953552,0.21250000596046448,1.3869973421096802,1.3876574039459229
95,0.24687500298023224,0.21250000596046448,1.3866239786148071,1.3872970342636108
96,0.23749999701976776,0.21250000596046448,1.386667013168335,1.3870171308517456
97,0.2718749940395355,0.21250000596046448,1.3860135078430176,1.3872432708740234
98,0.21562500298023224,0.21250000596046448,1.3871543407440186,1.3872439861297607
99,0.2718749940395355,0.21250000596046448,1.3863437175750732,1.3870137929916382
100,0.24687500298023224,0.21250000596046448,1.3862937688827515,1.3871190547943115
BIN venv/models/2026-03-15_14:01:21.659459/model.keras (new file)
BIN venv/models/2026-03-15_14:01:21.659459/model.tflite (new file)
venv/models/2026-03-15_14:01:21.659459/training_history.csv (new file, 101 lines)
@@ -0,0 +1,101 @@
epoch,accuracy,val_accuracy,loss,val_loss
1,0.19687500596046448,0.22083333134651184,1.389723300933838,1.3887017965316772
2,0.2874999940395355,0.22083333134651184,1.3862874507904053,1.388013243675232
3,0.24687500298023224,0.22083333134651184,1.3875197172164917,1.3871464729309082
4,0.24687500298023224,0.22083333134651184,1.386515736579895,1.3861252069473267
5,0.3187499940395355,0.22083333134651184,1.385441541671753,1.3883484601974487
6,0.21875,0.22083333134651184,1.3875070810317993,1.3902792930603027
7,0.30937498807907104,0.21250000596046448,1.3826572895050049,1.3944088220596313
8,0.2593750059604645,0.21250000596046448,1.3864555358886719,1.3978480100631714
9,0.24375000596046448,0.21250000596046448,1.3895992040634155,1.3934557437896729
10,0.26249998807907104,0.21250000596046448,1.3857975006103516,1.3911117315292358
11,0.26875001192092896,0.21250000596046448,1.386468768119812,1.3889323472976685
12,0.20624999701976776,0.21250000596046448,1.3885493278503418,1.3870673179626465
13,0.24375000596046448,0.21250000596046448,1.386535406112671,1.3860604763031006
14,0.25312501192092896,0.32083332538604736,1.3865668773651123,1.3856457471847534
15,0.2562499940395355,0.32083332538604736,1.3864099979400635,1.385008692741394
16,0.24062499403953552,0.24583333730697632,1.3862354755401611,1.3856245279312134
17,0.23125000298023224,0.24583333730697632,1.3865292072296143,1.386203646659851
18,0.21875,0.24583333730697632,1.3868465423583984,1.3863396644592285
19,0.29374998807907104,0.24583333730697632,1.3872582912445068,1.3859069347381592
20,0.25,0.24583333730697632,1.3865896463394165,1.385414958000183
21,0.24062499403953552,0.32083332538604736,1.3860137462615967,1.3849868774414062
22,0.21562500298023224,0.24583333730697632,1.3867892026901245,1.3853734731674194
23,0.2593750059604645,0.24583333730697632,1.3861404657363892,1.385751485824585
24,0.22499999403953552,0.24583333730697632,1.387131929397583,1.3858678340911865
25,0.25312501192092896,0.24583333730697632,1.3863983154296875,1.3862998485565186
26,0.24062499403953552,0.24583333730697632,1.38661789894104,1.3864539861679077
27,0.21250000596046448,0.24583333730697632,1.3869092464447021,1.3864970207214355
28,0.23749999701976776,0.21250000596046448,1.3864710330963135,1.386165976524353
29,0.23125000298023224,0.21250000596046448,1.3863639831542969,1.3865177631378174
30,0.30000001192092896,0.24583333730697632,1.385862112045288,1.3867548704147339
31,0.25312501192092896,0.24583333730697632,1.386470913887024,1.3866702318191528
32,0.21562500298023224,0.24583333730697632,1.387118935585022,1.3859220743179321
33,0.21250000596046448,0.32083332538604736,1.386586308479309,1.3854851722717285
34,0.21875,0.32083332538604736,1.3868935108184814,1.3858433961868286
35,0.28125,0.24583333730697632,1.3862088918685913,1.3862444162368774
36,0.27812498807907104,0.24583333730697632,1.3862977027893066,1.3865375518798828
37,0.25,0.24583333730697632,1.3863801956176758,1.3866711854934692
38,0.21562500298023224,0.24583333730697632,1.3869922161102295,1.386557936668396
39,0.20000000298023224,0.24583333730697632,1.3870608806610107,1.3864954710006714
40,0.20624999701976776,0.22083333134651184,1.3865511417388916,1.3864165544509888
41,0.2906250059604645,0.22083333134651184,1.38605797290802,1.3865432739257812
42,0.23749999701976776,0.22083333134651184,1.3865231275558472,1.3869646787643433
43,0.203125,0.22083333134651184,1.3867337703704834,1.3872078657150269
44,0.20000000298023224,0.21250000596046448,1.3870642185211182,1.3869264125823975
45,0.24375000596046448,0.21250000596046448,1.3860461711883545,1.386911153793335
46,0.234375,0.21250000596046448,1.3861843347549438,1.387214183807373
47,0.26249998807907104,0.24583333730697632,1.3867493867874146,1.3872864246368408
48,0.2593750059604645,0.24583333730697632,1.3859506845474243,1.3871232271194458
49,0.2718749940395355,0.24583333730697632,1.3862674236297607,1.386910319328308
50,0.265625,0.24583333730697632,1.386068344116211,1.3866286277770996
51,0.24062499403953552,0.24583333730697632,1.3868528604507446,1.3864555358886719
52,0.24062499403953552,0.24583333730697632,1.3864953517913818,1.3865829706192017
53,0.2593750059604645,0.24583333730697632,1.3861982822418213,1.3866260051727295
54,0.23125000298023224,0.24583333730697632,1.3866430521011353,1.3869823217391968
55,0.25,0.24583333730697632,1.3868688344955444,1.386829137802124
56,0.2874999940395355,0.24583333730697632,1.3858925104141235,1.3863166570663452
57,0.25312501192092896,0.24583333730697632,1.3864233493804932,1.3859808444976807
58,0.21875,0.24583333730697632,1.3872244358062744,1.3862658739089966
59,0.22187499701976776,0.24583333730697632,1.386713981628418,1.3866316080093384
60,0.22499999403953552,0.21250000596046448,1.3861124515533447,1.3873482942581177
61,0.24062499403953552,0.21250000596046448,1.3863258361816406,1.387614369392395
62,0.23749999701976776,0.21250000596046448,1.3863095045089722,1.3876338005065918
63,0.21250000596046448,0.21250000596046448,1.3874056339263916,1.3870995044708252
64,0.25312501192092896,0.21250000596046448,1.3861653804779053,1.387278437614441
65,0.265625,0.21250000596046448,1.3859654664993286,1.3876233100891113
66,0.23749999701976776,0.21250000596046448,1.386590600013733,1.3876007795333862
67,0.234375,0.21250000596046448,1.386126160621643,1.3879886865615845
68,0.23749999701976776,0.21250000596046448,1.3869259357452393,1.3880079984664917
69,0.23125000298023224,0.24583333730697632,1.3867313861846924,1.387492299079895
70,0.26875001192092896,0.24583333730697632,1.3861818313598633,1.3873966932296753
71,0.265625,0.24583333730697632,1.3858877420425415,1.3874586820602417
72,0.234375,0.24583333730697632,1.3868825435638428,1.3874166011810303
73,0.29374998807907104,0.24583333730697632,1.3856738805770874,1.38759446144104
74,0.23125000298023224,0.24583333730697632,1.3868231773376465,1.387603759765625
75,0.2750000059604645,0.24583333730697632,1.3852746486663818,1.3878364562988281
76,0.22499999403953552,0.24583333730697632,1.386884093284607,1.3880155086517334
77,0.24687500298023224,0.24583333730697632,1.3870716094970703,1.3878400325775146
78,0.27812498807907104,0.24583333730697632,1.3848199844360352,1.388013482093811
79,0.24687500298023224,0.24583333730697632,1.3875365257263184,1.3878871202468872
80,0.24062499403953552,0.24583333730697632,1.3874831199645996,1.3874707221984863
81,0.23125000298023224,0.24583333730697632,1.3863718509674072,1.3870768547058105
82,0.265625,0.24583333730697632,1.386584758758545,1.3863964080810547
83,0.2750000059604645,0.24583333730697632,1.3860197067260742,1.3862082958221436
84,0.2562499940395355,0.24583333730697632,1.3863346576690674,1.3864549398422241
85,0.265625,0.24583333730697632,1.3861913681030273,1.3863359689712524
86,0.24375000596046448,0.24583333730697632,1.3864843845367432,1.3865623474121094
87,0.22499999403953552,0.24583333730697632,1.386813759803772,1.3863391876220703
88,0.22812500596046448,0.24583333730697632,1.3866171836853027,1.3861021995544434
89,0.25,0.24583333730697632,1.3864729404449463,1.3860529661178589
90,0.3062500059604645,0.24583333730697632,1.385929822921753,1.3860437870025635
91,0.25,0.24583333730697632,1.3862428665161133,1.3859394788742065
92,0.24375000596046448,0.24583333730697632,1.3863451480865479,1.385769248008728
93,0.265625,0.24583333730697632,1.3860076665878296,1.3854719400405884
94,0.20937499403953552,0.24583333730697632,1.3874256610870361,1.385633945465088
95,0.22499999403953552,0.24583333730697632,1.3868390321731567,1.386364221572876
96,0.2562499940395355,0.21250000596046448,1.386042833328247,1.3872121572494507
97,0.2593750059604645,0.21250000596046448,1.385819673538208,1.3878296613693237
98,0.25,0.21250000596046448,1.3865140676498413,1.3881038427352905
99,0.27812498807907104,0.21250000596046448,1.3860855102539062,1.3881397247314453
100,0.24687500298023224,0.21250000596046448,1.38592529296875,1.3881205320358276
BIN venv/models/2026-03-15_14:04:42.176752/model.keras (new file)
BIN venv/models/2026-03-15_14:04:42.176752/model.tflite (new file)
venv/models/2026-03-15_14:04:42.176752/training_history.csv (new file, 101 lines)
@@ -0,0 +1,101 @@
epoch,accuracy,val_accuracy,loss,val_loss
1,0.24062499403953552,0.22083333134651184,1.387251853942871,1.429033637046814
2,0.2906250059604645,0.22083333134651184,1.3864779472351074,1.429295301437378
3,0.23749999701976776,0.22083333134651184,1.3872103691101074,1.424407720565796
4,0.26875001192092896,0.22083333134651184,1.386069416999817,1.4180697202682495
5,0.24062499403953552,0.22083333134651184,1.387187123298645,1.4150248765945435
6,0.20624999701976776,0.22083333134651184,1.3875205516815186,1.4103318452835083
7,0.24062499403953552,0.22083333134651184,1.3861851692199707,1.4081552028656006
8,0.24375000596046448,0.22083333134651184,1.3867741823196411,1.406940221786499
9,0.28125,0.22083333134651184,1.3862583637237549,1.406622052192688
10,0.24687500298023224,0.22083333134651184,1.386683702468872,1.4065757989883423
11,0.25,0.22083333134651184,1.386264443397522,1.406774640083313
12,0.265625,0.22083333134651184,1.3859189748764038,1.4072110652923584
13,0.30937498807907104,0.22083333134651184,1.3854365348815918,1.4073680639266968
14,0.2593750059604645,0.22083333134651184,1.3865587711334229,1.4068565368652344
15,0.2562499940395355,0.22083333134651184,1.3860880136489868,1.406276822090149
16,0.234375,0.22083333134651184,1.3869506120681763,1.4059096574783325
17,0.2593750059604645,0.22083333134651184,1.3863197565078735,1.4050337076187134
18,0.22499999403953552,0.22083333134651184,1.3870445489883423,1.4043165445327759
19,0.19374999403953552,0.22083333134651184,1.3872404098510742,1.404642939567566
20,0.24062499403953552,0.22083333134651184,1.3865021467208862,1.404237985610962
21,0.2593750059604645,0.22083333134651184,1.3862736225128174,1.4040597677230835
22,0.26249998807907104,0.22083333134651184,1.3863892555236816,1.4043409824371338
23,0.20624999701976776,0.22083333134651184,1.3869657516479492,1.4046180248260498
24,0.22812500596046448,0.22083333134651184,1.386696457862854,1.4048579931259155
25,0.23125000298023224,0.22083333134651184,1.3863697052001953,1.4054863452911377
26,0.23125000298023224,0.22083333134651184,1.3864532709121704,1.4056535959243774
27,0.234375,0.22083333134651184,1.3863270282745361,1.4062144756317139
28,0.25,0.22083333134651184,1.3863956928253174,1.4064122438430786
29,0.2593750059604645,0.22083333134651184,1.3864457607269287,1.4067738056182861
30,0.21562500298023224,0.22083333134651184,1.386589527130127,1.4062962532043457
31,0.23749999701976776,0.22083333134651184,1.3863712549209595,1.4059797525405884
32,0.22499999403953552,0.22083333134651184,1.38612699508667,1.4060200452804565
33,0.2562499940395355,0.22083333134651184,1.3863500356674194,1.4060051441192627
34,0.2593750059604645,0.22083333134651184,1.3862498998641968,1.406078815460205
35,0.2874999940395355,0.22083333134651184,1.3861627578735352,1.4055862426757812
36,0.24375000596046448,0.22083333134651184,1.3865219354629517,1.4052321910858154
37,0.24375000596046448,0.22083333134651184,1.3861771821975708,1.405773639678955
38,0.265625,0.22083333134651184,1.3859049081802368,1.406909704208374
39,0.25312501192092896,0.22083333134651184,1.3863964080810547,1.4073841571807861
40,0.20000000298023224,0.22083333134651184,1.387131929397583,1.4076370000839233
41,0.24375000596046448,0.22083333134651184,1.386378526687622,1.407658338546753
42,0.24062499403953552,0.22083333134651184,1.3862359523773193,1.40719735622406
43,0.28125,0.22083333134651184,1.385841965675354,1.4069615602493286
44,0.25312501192092896,0.22083333134651184,1.3859277963638306,1.4066978693008423
45,0.26875001192092896,0.22083333134651184,1.38688063621521,1.405705213546753
46,0.23125000298023224,0.22083333134651184,1.3860328197479248,1.4046095609664917
47,0.15937499701976776,0.22083333134651184,1.3879817724227905,1.4039548635482788
48,0.20937499403953552,0.22083333134651184,1.3875550031661987,1.4035561084747314
49,0.25312501192092896,0.22083333134651184,1.3857653141021729,1.403505563735962
50,0.24375000596046448,0.22083333134651184,1.3859742879867554,1.403362512588501
51,0.22187499701976776,0.22083333134651184,1.3860559463500977,1.4031425714492798
52,0.29374998807907104,0.22083333134651184,1.386060118675232,1.4028385877609253
53,0.265625,0.22083333134651184,1.386291742324829,1.403059482574463
54,0.22812500596046448,0.22083333134651184,1.386649489402771,1.4037810564041138
55,0.22499999403953552,0.22083333134651184,1.3861048221588135,1.4044525623321533
56,0.20000000298023224,0.22083333134651184,1.3873170614242554,1.4042528867721558
57,0.22499999403953552,0.22083333134651184,1.3870216608047485,1.404287338256836
58,0.22499999403953552,0.22083333134651184,1.3868658542633057,1.404625654220581
59,0.265625,0.22083333134651184,1.3865025043487549,1.4043896198272705
60,0.22499999403953552,0.22083333134651184,1.3864641189575195,1.4044140577316284
61,0.2562499940395355,0.22083333134651184,1.3861091136932373,1.4043464660644531
62,0.27812498807907104,0.22083333134651184,1.3858373165130615,1.4043923616409302
63,0.28437501192092896,0.22083333134651184,1.3863334655761719,1.4042212963104248
64,0.22499999403953552,0.22083333134651184,1.3861849308013916,1.4044967889785767
65,0.24375000596046448,0.22083333134651184,1.3866013288497925,1.404575228691101
66,0.3187499940395355,0.22083333134651184,1.3845323324203491,1.4045697450637817
67,0.22812500596046448,0.22083333134651184,1.3875443935394287,1.4041028022766113
68,0.22812500596046448,0.22083333134651184,1.387352466583252,1.403822660446167
69,0.2562499940395355,0.22083333134651184,1.3862857818603516,1.403423547744751
70,0.22812500596046448,0.22083333134651184,1.3866407871246338,1.4036400318145752
71,0.23749999701976776,0.22083333134651184,1.3864881992340088,1.4037724733352661
72,0.24375000596046448,0.22083333134651184,1.3865535259246826,1.4036052227020264
73,0.24375000596046448,0.22083333134651184,1.386453628540039,1.4031182527542114
74,0.23749999701976776,0.22083333134651184,1.3863955736160278,1.402764081954956
75,0.2750000059604645,0.22083333134651184,1.3863694667816162,1.4027035236358643
76,0.30000001192092896,0.22083333134651184,1.3861186504364014,1.4025993347167969
77,0.24687500298023224,0.22083333134651184,1.3864811658859253,1.403092622756958
78,0.24687500298023224,0.22083333134651184,1.386257529258728,1.4029167890548706
79,0.26249998807907104,0.22083333134651184,1.38616144657135,1.4029555320739746
80,0.23125000298023224,0.22083333134651184,1.3864988088607788,1.403173565864563
81,0.23125000298023224,0.22083333134651184,1.3865171670913696,1.4031702280044556
82,0.25,0.22083333134651184,1.3862680196762085,1.4028900861740112
83,0.24375000596046448,0.22083333134651184,1.3862940073013306,1.4026432037353516
84,0.24062499403953552,0.22083333134651184,1.3862216472625732,1.402282476425171
85,0.265625,0.22083333134651184,1.3866994380950928,1.4025509357452393
86,0.24375000596046448,0.22083333134651184,1.3863242864608765,1.4029487371444702
87,0.2593750059604645,0.22083333134651184,1.386634111404419,1.403193712234497
88,0.23749999701976776,0.22083333134651184,1.3866584300994873,1.4034322500228882
89,0.2874999940395355,0.22083333134651184,1.38583505153656,1.4038022756576538
90,0.23749999701976776,0.22083333134651184,1.3864247798919678,1.4037079811096191
91,0.26249998807907104,0.22083333134651184,1.3861095905303955,1.4036905765533447
92,0.23125000298023224,0.22083333134651184,1.3866643905639648,1.4035906791687012
93,0.2562499940395355,0.22083333134651184,1.3863781690597534,1.403332233428955
94,0.21875,0.22083333134651184,1.3863840103149414,1.4036800861358643
95,0.21562500298023224,0.22083333134651184,1.3868860006332397,1.4037765264511108
96,0.21562500298023224,0.22083333134651184,1.3867131471633911,1.4036279916763306
97,0.234375,0.22083333134651184,1.3866288661956787,1.403730034828186
98,0.30000001192092896,0.22083333134651184,1.3855899572372437,1.4040412902832031
99,0.24375000596046448,0.22083333134651184,1.3865249156951904,1.4040446281433105
100,0.29374998807907104,0.22083333134651184,1.3858128786087036,1.4040915966033936
BIN
venv/models/2026-03-15_14:40:09.291325/model.keras
Normal file
Binary file not shown.
BIN
venv/models/2026-03-15_14:40:09.291325/model.tflite
Normal file
Binary file not shown.
101
venv/models/2026-03-15_14:40:09.291325/training_history.csv
Normal file
@@ -0,0 +1,101 @@
epoch,accuracy,val_accuracy,loss,val_loss
1,0.7641927003860474,0.15000000596046448,0.6556282639503479,2.786958932876587
2,0.875781238079071,0.24583333730697632,0.3367018401622772,145.1642608642578
3,0.9261718988418579,0.24583333730697632,0.21222706139087677,542.9489135742188
4,0.9468749761581421,0.3499999940395355,0.15502126514911652,436.6821594238281
5,0.95703125,0.30000001192092896,0.12713894248008728,1734.4005126953125
6,0.9653645753860474,0.22083333134651184,0.10625138133764267,2078.35595703125
7,0.9670572876930237,0.22083333134651184,0.10060538351535797,4190.84716796875
8,0.9708333611488342,0.22083333134651184,0.08701884001493454,2175.69384765625
9,0.9759114384651184,0.22083333134651184,0.07224109768867493,1431.79736328125
10,0.9756510257720947,0.22499999403953552,0.0758061408996582,1257.38818359375
11,0.9799479246139526,0.3291666805744171,0.06562892347574234,679.346923828125
12,0.9819010496139526,0.22083333134651184,0.05864816904067993,1250.117431640625
13,0.9837239384651184,0.22083333134651184,0.05603867396712303,949.5216064453125
14,0.9856770634651184,0.22083333134651184,0.04392676800489426,2813.852783203125
15,0.9815104007720947,0.22083333134651184,0.05770495906472206,992.2079467773438
16,0.983593761920929,0.22083333134651184,0.04698636755347252,2555.2177734375
17,0.983203113079071,0.3166666626930237,0.05349167808890343,721.4380493164062
18,0.9864583611488342,0.22499999403953552,0.046624429523944855,1216.3863525390625
19,0.9897135496139526,0.22083333134651184,0.033429499715566635,2209.611572265625
20,0.9885416626930237,0.22083333134651184,0.037510477006435394,1644.256591796875
21,0.9889323115348816,0.2874999940395355,0.032061509788036346,599.4248657226562
22,0.9888020753860474,0.22083333134651184,0.03519482910633087,2405.77587890625
23,0.9846354126930237,0.4833333194255829,0.048758625984191895,374.9194030761719
24,0.9895833134651184,0.23333333432674408,0.031843990087509155,946.0235595703125
25,0.9877604246139526,0.24166665971279144,0.0385715514421463,1195.2344970703125
26,0.9901041388511658,0.22083333134651184,0.032194558531045914,806.0076904296875
27,0.9880208373069763,0.24583333730697632,0.03730996325612068,688.756103515625
28,0.9912760257720947,0.22083333134651184,0.03252696618437767,1622.8123779296875
29,0.9947916865348816,0.22083333134651184,0.01784966140985489,1913.7918701171875
30,0.9924479126930237,0.22083333134651184,0.026797903701663017,279.2188720703125
31,0.9907552003860474,0.22083333134651184,0.033388420939445496,1134.2767333984375
32,0.9908854365348816,0.3375000059604645,0.029145648702979088,95.03201293945312
33,0.9891927242279053,0.42500001192092896,0.03102271445095539,464.6019592285156
34,0.9932291507720947,0.22083333134651184,0.021412037312984467,986.3841552734375
35,0.9923177361488342,0.22083333134651184,0.02759469673037529,760.4578857421875
36,0.991406261920929,0.24583333730697632,0.028778191655874252,593.8187255859375
37,0.9945312738418579,0.32083332538604736,0.018624553456902504,663.8523559570312
38,0.9908854365348816,0.22499999403953552,0.02799573726952076,767.1515502929688
39,0.9962239861488342,0.4375,0.013362539932131767,313.8023986816406
40,0.9946614503860474,0.22083333134651184,0.01853085868060589,1301.3148193359375
41,0.986328125,0.32083332538604736,0.04215172678232193,640.0283813476562
42,0.9925781488418579,0.23333333432674408,0.02235046960413456,218.9736328125
43,0.9944010376930237,0.4208333194255829,0.017718089744448662,202.29286193847656
44,0.9915364384651184,0.22499999403953552,0.025970684364438057,1598.172119140625
45,0.99609375,0.44999998807907104,0.014352566562592983,487.48809814453125
46,0.9934895634651184,0.30000001192092896,0.01648867316544056,996.599365234375
47,0.9924479126930237,0.24583333730697632,0.02568558044731617,1811.96630859375
48,0.9936197996139526,0.22083333134651184,0.019128015264868736,896.1229858398438
49,0.9970052242279053,0.24583333730697632,0.009085672907531261,1030.3626708984375
50,0.9944010376930237,0.24583333730697632,0.017151959240436554,1934.4542236328125
51,0.9916666746139526,0.24583333730697632,0.028664644807577133,983.9193725585938
52,0.9962239861488342,0.3791666626930237,0.01063383650034666,484.26690673828125
53,0.995312511920929,0.42500001192092896,0.016774259507656097,1236.3868408203125
54,0.9947916865348816,0.4333333373069763,0.015777425840497017,207.80308532714844
55,0.9962239861488342,0.3333333432674408,0.012974864803254604,494.0450439453125
56,0.9957031011581421,0.32083332538604736,0.012591647915542126,430.7264709472656
57,0.9966145753860474,0.5083333253860474,0.012390038929879665,194.0229034423828
58,0.9868489503860474,0.2291666716337204,0.05284683033823967,711.9586181640625
59,0.996874988079071,0.3291666805744171,0.008380413986742496,296.1285705566406
60,0.9959635138511658,0.3541666567325592,0.013547107577323914,247.80809020996094
61,0.997265636920929,0.2541666626930237,0.008858543820679188,664.0957641601562
62,0.9977864623069763,0.4541666805744171,0.007046550512313843,381.4072265625
63,0.9976562261581421,0.24166665971279144,0.009287163615226746,667.2242431640625
64,0.9951822757720947,0.4166666567325592,0.015782717615365982,180.54684448242188
65,0.998046875,0.24583333730697632,0.005884839221835136,1132.2967529296875
66,0.996874988079071,0.24583333730697632,0.008365819230675697,423.9919128417969
67,0.9944010376930237,0.4375,0.0157401654869318,591.289794921875
68,0.996874988079071,0.42500001192092896,0.009277629666030407,368.8995666503906
69,0.996874988079071,0.25833332538604736,0.009817498736083508,268.4747619628906
70,0.9981771111488342,0.32083332538604736,0.006176070775836706,848.0325927734375
71,0.9954426884651184,0.32083332538604736,0.017365239560604095,921.9395751953125
72,0.9976562261581421,0.19166666269302368,0.006243720185011625,846.9988403320312
73,0.9979166388511658,0.32083332538604736,0.007621175609529018,770.1184692382812
74,0.998046875,0.3333333432674408,0.007831843569874763,1054.477783203125
75,0.9934895634651184,0.32499998807907104,0.02367531694471836,1973.513671875
76,0.994140625,0.32083332538604736,0.02045782469213009,1825.21875
77,0.9985677003860474,0.1458333283662796,0.006224052980542183,2563.483154296875
78,0.9990885257720947,0.2916666567325592,0.0029171621426939964,1612.9859619140625
79,0.9996093511581421,0.3083333373069763,0.0023937856312841177,1210.00048828125
80,0.9973958134651184,0.32083332538604736,0.00871509313583374,2131.439453125
81,0.996874988079071,0.32083332538604736,0.009170063771307468,1381.1668701171875
82,0.9984375238418579,0.3499999940395355,0.006568868178874254,1278.370361328125
83,0.9977864623069763,0.32083332538604736,0.006044364999979734,1442.549072265625
84,0.9977864623069763,0.36666667461395264,0.006215202622115612,1152.5875244140625
85,0.9954426884651184,0.3541666567325592,0.015190036036074162,1440.297607421875
86,0.9959635138511658,0.32083332538604736,0.013142507523298264,1981.55908203125
87,0.9973958134651184,0.32499998807907104,0.008220557123422623,753.7999267578125
88,0.998046875,0.3708333373069763,0.007407734636217356,1276.0592041015625
89,0.996874988079071,0.32083332538604736,0.011465544812381268,2005.687255859375
90,0.9979166388511658,0.3125,0.009250237606465816,1785.6741943359375
91,0.9977864623069763,0.3583333194255829,0.0066549344919621944,2299.27294921875
92,0.998046875,0.3291666805744171,0.007205411791801453,1235.2969970703125
93,0.9946614503860474,0.32083332538604736,0.017886042594909668,1514.164306640625
94,0.9993489384651184,0.3958333432674408,0.0025855659041553736,1014.2952880859375
95,0.9989583492279053,0.3375000059604645,0.004219961352646351,860.9890747070312
96,0.9975260496139526,0.375,0.009455768391489983,1195.864990234375
97,0.9984375238418579,0.32083332538604736,0.005296614952385426,1493.249267578125
98,0.9958333373069763,0.3708333373069763,0.01711571030318737,1151.3814697265625
99,0.9979166388511658,0.32083332538604736,0.007663541007786989,1383.2186279296875
100,0.9990885257720947,0.32083332538604736,0.0027740655932575464,1476.4110107421875
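The history above shows training accuracy approaching 1.0 while val_loss swings into the hundreds and thousands, which points to overfitting or a train/validation preprocessing mismatch. A minimal sketch of inspecting such a log with pandas, using a handful of rows copied from the CSV above (column names taken from its header):

```python
import pandas as pd

# Rows copied from the training_history.csv above
# (epoch, accuracy, val_accuracy, loss, val_loss); the full file has 100 epochs.
rows = [
    (1, 0.7642, 0.1500, 0.6556, 2.787),
    (23, 0.9846, 0.4833, 0.0488, 374.9),
    (57, 0.9966, 0.5083, 0.0124, 194.0),
    (100, 0.9991, 0.3208, 0.0028, 1476.4),
]
df = pd.DataFrame(rows, columns=["epoch", "accuracy", "val_accuracy", "loss", "val_loss"])

# The epoch with the best validation accuracy is the natural checkpoint choice.
best = df.loc[df["val_accuracy"].idxmax()]
print(int(best["epoch"]))  # → 57
```

In the full log as well, epoch 57 has the highest val_accuracy (0.5083), so an early-stopping or best-checkpoint callback would have stopped far short of 100 epochs.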
5
venv/pyvenv.cfg
Normal file
@@ -0,0 +1,5 @@
home = /usr/bin
include-system-site-packages = false
version = 3.12.3
executable = /usr/bin/python3.12
command = /usr/bin/python -m venv /home/jhodi/bit/Python/Grapevine_Pathology_Detection/venv
225
venv/share/man/man1/ttx.1
Normal file
@@ -0,0 +1,225 @@
.Dd May 18, 2004
.\" ttx is not specific to any OS, but contrary to what groff_mdoc(7)
.\" seems to imply, entirely omitting the .Os macro causes 'BSD' to
.\" be used, so I give a zero-width space as its argument.
.Os \&
.\" The "FontTools Manual" argument apparently has no effect in
.\" groff 1.18.1. I think it is a bug in the -mdoc groff package.
.Dt TTX 1 "FontTools Manual"
.Sh NAME
.Nm ttx
.Nd tool for manipulating TrueType and OpenType fonts
.Sh SYNOPSIS
.Nm
.Bk
.Op Ar option ...
.Ek
.Bk
.Ar file ...
.Ek
.Sh DESCRIPTION
.Nm
is a tool for manipulating TrueType and OpenType fonts. It can convert
TrueType and OpenType fonts to and from an
.Tn XML Ns -based format called
.Tn TTX .
.Tn TTX
files have a
.Ql .ttx
extension.
.Pp
For each
.Ar file
argument it is given,
.Nm
detects whether it is a
.Ql .ttf ,
.Ql .otf
or
.Ql .ttx
file and acts accordingly: if it is a
.Ql .ttf
or
.Ql .otf
file, it generates a
.Ql .ttx
file; if it is a
.Ql .ttx
file, it generates a
.Ql .ttf
or
.Ql .otf
file.
.Pp
By default, every output file is created in the same directory as the
corresponding input file and with the same name except for the
extension, which is substituted appropriately.
.Nm
never overwrites existing files; if necessary, it appends a suffix to
the output file name before the extension, as in
.Pa Arial#1.ttf .
.Ss "General options"
.Bl -tag -width ".Fl t Ar table"
.It Fl h
Display usage information.
.It Fl d Ar dir
Write the output files to directory
.Ar dir
instead of writing every output file to the same directory as the
corresponding input file.
.It Fl o Ar file
Write the output to
.Ar file
instead of writing it to the same directory as the
corresponding input file.
.It Fl v
Be verbose. Write more messages to the standard output describing what
is being done.
.It Fl a
Allow virtual glyphs ID's on compile or decompile.
.El
.Ss "Dump options"
The following options control the process of dumping font files
(TrueType or OpenType) to
.Tn TTX
files.
.Bl -tag -width ".Fl t Ar table"
.It Fl l
List table information. Instead of dumping the font to a
.Tn TTX
file, display minimal information about each table.
.It Fl t Ar table
Dump table
.Ar table .
This option may be given multiple times to dump several tables at
once. When not specified, all tables are dumped.
.It Fl x Ar table
Exclude table
.Ar table
from the list of tables to dump. This option may be given multiple
times to exclude several tables from the dump. The
.Fl t
and
.Fl x
options are mutually exclusive.
.It Fl s
Split tables. Dump each table to a separate
.Tn TTX
file and write (under the name that would have been used for the output
file if the
.Fl s
option had not been given) one small
.Tn TTX
file containing references to the individual table dump files. This
file can be used as input to
.Nm
as long as the referenced files can be found in the same directory.
.It Fl i
.\" XXX: I suppose OpenType programs (exist and) are also affected.
Don't disassemble TrueType instructions. When this option is specified,
all TrueType programs (glyph programs, the font program and the
pre-program) are written to the
.Tn TTX
file as hexadecimal data instead of
assembly. This saves some time and results in smaller
.Tn TTX
files.
.It Fl y Ar n
When decompiling a TrueType Collection (TTC) file,
decompile font number
.Ar n ,
starting from 0.
.El
.Ss "Compilation options"
The following options control the process of compiling
.Tn TTX
files into font files (TrueType or OpenType):
.Bl -tag -width ".Fl t Ar table"
.It Fl m Ar fontfile
Merge the input
.Tn TTX
file
.Ar file
with
.Ar fontfile .
No more than one
.Ar file
argument can be specified when this option is used.
.It Fl b
Don't recalculate glyph bounding boxes. Use the values in the
.Tn TTX
file as is.
.El
.Sh "THE TTX FILE FORMAT"
You can find some information about the
.Tn TTX
file format in
.Pa documentation.html .
In particular, you will find in that file the list of tables understood by
.Nm
and the relations between TrueType GlyphIDs and the glyph names used in
.Tn TTX
files.
.Sh EXAMPLES
In the following examples, all files are read from and written to the
current directory. Additionally, the name given for the output file
assumes in every case that it did not exist before
.Nm
was invoked.
.Pp
Dump the TrueType font contained in
.Pa FreeSans.ttf
to
.Pa FreeSans.ttx :
.Pp
.Dl ttx FreeSans.ttf
.Pp
Compile
.Pa MyFont.ttx
into a TrueType or OpenType font file:
.Pp
.Dl ttx MyFont.ttx
.Pp
List the tables in
.Pa FreeSans.ttf
along with some information:
.Pp
.Dl ttx -l FreeSans.ttf
.Pp
Dump the
.Sq cmap
table from
.Pa FreeSans.ttf
to
.Pa FreeSans.ttx :
.Pp
.Dl ttx -t cmap FreeSans.ttf
.Sh NOTES
On MS\-Windows and MacOS,
.Nm
is available as a graphical application to which files can be dropped.
.Sh SEE ALSO
.Pa documentation.html
.Pp
.Xr fontforge 1 ,
.Xr ftinfo 1 ,
.Xr gfontview 1 ,
.Xr xmbdfed 1 ,
.Xr Font::TTF 3pm
.Sh AUTHORS
.Nm
was written by
.An -nosplit
.An "Just van Rossum" Aq just@letterror.com .
.Pp
This manual page was written by
.An "Florent Rougon" Aq f.rougon@free.fr
for the Debian GNU/Linux system based on the existing FontTools
documentation. It may be freely used, modified and distributed without
restrictions.
.\" For Emacs:
.\" Local Variables:
.\" fill-column: 72
.\" sentence-end: "[.?!][]\"')}]*\\($\\| $\\| \\| \\)[ \n]*"
.\" sentence-end-double-space: t
.\" End:
BIN
venv/src/__pycache__/data_pretreat.cpython-312.pyc
Normal file
Binary file not shown.
BIN
venv/src/__pycache__/load_model.cpython-312.pyc
Normal file
Binary file not shown.
BIN
venv/src/__pycache__/model_train.cpython-312.pyc
Normal file
Binary file not shown.
59
venv/src/data_explore.py
Normal file
@@ -0,0 +1,59 @@
import os
import matplotlib.pyplot as plt
import numpy as np

# Configuration
data_dir = os.getcwd()[:-9] + "/data/datasplit/"
class_names = ['Black_Rot', 'ESCA', 'Healthy', 'Leaf_Blight']
subsets = ['train', 'val', 'test']

class_counts = {subset: {class_name: 0 for class_name in class_names} for subset in subsets}

for subset in subsets:
    subset_path = os.path.join(data_dir, subset)

    for class_name in class_names:
        class_path = os.path.join(subset_path, class_name)

        if os.path.isdir(class_path):
            images = [f for f in os.listdir(class_path)
                      if f.lower().endswith(('.jpg', '.jpeg', '.png'))]
            class_counts[subset][class_name] = len(images)

print("=" * 60)
print("NB OF CLASSES PER SUBSET")
print("=" * 60)

total_lst = []
for idx, subset in enumerate(subsets):
    print(f"\n{subset.upper()}:")
    total_lst.append(sum(class_counts[subset].values()))
    for class_name in class_names:
        count = class_counts[subset][class_name]
        percentage = (count / total_lst[idx] * 100) if total_lst[idx] > 0 else 0
        print(f" {class_name}: {count} images ({percentage:.1f}%)")
    print(f" Total: {total_lst[idx]} images")

fig, axes = plt.subplots(1, 3, figsize=(15, 5))

for idx, subset in enumerate(subsets):
    counts = [class_counts[subset][class_name] for class_name in class_names]
    colors = ['#FF6B6B', '#4ECDC4', '#45B7D1', '#FFA07A']

    bars = axes[idx].bar(class_names, counts, color=colors, edgecolor='black', linewidth=1.5)

    for bar in bars:
        height = bar.get_height()
        axes[idx].text(bar.get_x() + bar.get_width()/2., height,
                       f'{int(height)}',
                       ha='center', va='bottom', fontweight='bold')

    axes[idx].set_title(subset.upper() + " tot: " + str(total_lst[idx]), fontsize=12, fontweight='bold')
    axes[idx].set_ylabel("Number of images", fontsize=10)
    axes[idx].set_xlabel('Classes', fontsize=10)
    axes[idx].tick_params(axis='x', rotation=45)
    axes[idx].grid(axis='y', alpha=0.3, linestyle='--')

plt.tight_layout()
plt.show()
80
venv/src/data_pretreat.py
Normal file
@@ -0,0 +1,80 @@
import os
import matplotlib.pyplot as plt
import PIL
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential

current_dir = os.getcwd()

batch_size = 32
img_height = 224
img_width = 224
channels = 3
epochs = 100

data_dir = current_dir[:-9] + "/data/datasplit/"

train_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir + "train/",
    validation_split=0.2,
    subset="training",
    seed=123,
    image_size=(img_height, img_width),
    batch_size=batch_size)

val_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir + "val/",
    validation_split=0.2,
    subset="validation",
    seed=123,
    image_size=(img_height, img_width),
    batch_size=batch_size)

test_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir + "test/",
    seed=123,
    image_size=(img_height, img_width),
    batch_size=batch_size)

class_names = train_ds.class_names
print(class_names)

# Visualize data

# plt.figure(figsize=(10, 10))
# for images, labels in train_ds.take(1):
#     for i in range(9):
#         ax = plt.subplot(3, 3, i + 1)
#         plt.imshow(images[i].numpy().astype("uint8"))
#         plt.title(class_names[labels[i]])
#         plt.axis("off")
# plt.show()

# Data augmentation
data_augmentation = keras.Sequential(
    [
        layers.RandomFlip("horizontal_and_vertical",
                          input_shape=(img_height,
                                       img_width,
                                       3)),
        layers.RandomRotation(0.2),
        layers.RandomZoom(0.1),
        layers.Rescaling(1./255)
    ]
)

# Configure for performance
AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().shuffle(1000).prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)

# Pretreatment
normalization_layer = layers.Rescaling(1./255)
normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
image_batch, labels_batch = next(iter(normalized_ds))
first_image = image_batch[0]

print("\n DONE !")
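One detail worth flagging in the script above: `Rescaling(1./255)` appears both inside `data_augmentation` and again as `normalization_layer`, so any image routed through both would be scaled twice. A plain-numpy sketch of the effect (illustrative pixel values, not project data):

```python
import numpy as np

# Rescaling(1./255) maps uint8 pixel values [0, 255] into [0.0, 1.0].
# Applying it twice shrinks the range to [0, 1/255], which would starve
# the network of signal.
batch = np.array([[0, 128, 255]], dtype=np.uint8)
once = batch / 255.0    # one rescaling pass
twice = once / 255.0    # accidental second pass
print(once.max(), twice.max())
```

Keeping the rescaling in exactly one place (either the augmentation pipeline or the normalization layer) avoids this.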
10
venv/src/data_split.py
Normal file
@@ -0,0 +1,10 @@
"""
|
||||||
|
@autour: Jhodi Avizara
|
||||||
|
"""
|
||||||
|
import os
|
||||||
|
import splitfolders
|
||||||
|
|
||||||
|
current_dir = os.getcwd()
|
||||||
|
data_dir = current_dir[:-9]+"/data/raw/"
|
||||||
|
splitfolders.ratio(data_dir, output=current_dir[:-9]+"/data/datasplit/", seed=1337, ratio=(.8, 0.1,0.1))
|
||||||
|
|
||||||
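`splitfolders.ratio` is a one-liner here; under the assumption that it shuffles each class folder with the given seed and cuts at the ratio boundaries (its exact shuffling may differ), its effect can be sketched in pure Python with illustrative file names:

```python
import random

# Illustrative file list standing in for one class folder under data/raw/.
files = [f"img_{i}.jpg" for i in range(100)]
random.Random(1337).shuffle(files)  # same seed as the script above

# Cut at 80% and 90%, mirroring ratio=(.8, 0.1, 0.1).
n = len(files)
train = files[:int(n * 0.8)]
val = files[int(n * 0.8):int(n * 0.9)]
test = files[int(n * 0.9):]
print(len(train), len(val), len(test))  # → 80 10 10
```

Splitting per class folder keeps the class balance identical across train, val, and test.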
69
venv/src/evaluate_model.py
Normal file
@@ -0,0 +1,69 @@
import os
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import tensorflow as tf
from tensorflow.keras.models import load_model
from sklearn.metrics import confusion_matrix, classification_report
import seaborn as sns

from load_model import *
from data_pretreat import *  # src/ functions

model, model_dir = select_model()

# Load dataframe
df = pd.read_csv(model_dir + "/training_history.csv")

# Visualize training history
epochs_range = df.index
acc = df['accuracy']
val_acc = df['val_accuracy']
loss = df['loss']
val_loss = df['val_loss']

# Model testing
y_pred = model.predict(test_ds)
y_ = np.argmax(y_pred, axis=1)

y_test_raw = np.concatenate([y for x, y in test_ds], axis=0)

y_test_classes = y_test_raw

cm = confusion_matrix(y_test_classes, y_)

class_names = ['Black_Rot', 'ESCA', 'Healthy', 'Leaf_Blight']

plt.figure(figsize=(16, 5))

# Subplot 1: training accuracy
plt.subplot(1, 3, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')

# Subplot 2: training loss
plt.subplot(1, 3, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')

# Subplot 3: confusion matrix
plt.subplot(1, 3, 3)
sns.heatmap(cm, annot=True, fmt='d', cmap='Blues',
            xticklabels=class_names,
            yticklabels=class_names,
            cbar=False)
plt.title('Confusion Matrix')
plt.ylabel('True labels')
plt.xlabel('Predictions')

plt.tight_layout()
plt.show()
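One caveat about the evaluation above: concatenating labels out of `test_ds` only stays aligned with `model.predict(test_ds)` if the dataset is not reshuffled between the two passes (`tf.keras.utils.image_dataset_from_directory` shuffles by default, so `shuffle=False` is safer for the test split). The argmax-then-confusion-matrix step itself can be sketched with plain numpy (toy softmax outputs, not real model predictions):

```python
import numpy as np

# Toy softmax outputs for 6 images over the 4 classes; illustrative only.
y_pred = np.array([
    [0.90, 0.05, 0.03, 0.02],
    [0.10, 0.70, 0.10, 0.10],
    [0.20, 0.20, 0.50, 0.10],
    [0.05, 0.05, 0.10, 0.80],
    [0.60, 0.30, 0.05, 0.05],
    [0.10, 0.10, 0.70, 0.10],
])
y_true = np.array([0, 1, 2, 3, 1, 2])
y_hat = np.argmax(y_pred, axis=1)  # predicted class per image

# 4x4 confusion matrix: rows = true class, columns = predicted class.
cm = np.zeros((4, 4), dtype=int)
for t, p in zip(y_true, y_hat):
    cm[t, p] += 1
print(cm.trace(), "/", len(y_true))  # correctly classified / total → 5 / 6
```

The diagonal of `cm` counts correct predictions per class, which is what the seaborn heatmap in the script visualizes.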
0
venv/src/file
Normal file
200
venv/src/gradient.py
Normal file
@@ -0,0 +1,200 @@
import matplotlib.pylab as plt
|
||||||
|
import numpy as np
|
||||||
|
import tensorflow as tf
|
||||||
|
import os
|
||||||
|
import math
|
||||||
|
from tensorflow.keras.models import load_model
|
||||||
|
|
||||||
|
from load_model import *
|
||||||
|
from data_pretreat import * # src/ function
|
||||||
|
|
||||||
|
model, model_dir = select_model()
|
||||||
|
|
||||||
|
def read_image(file_name):
|
||||||
|
image = tf.io.read_file(file_name)
|
||||||
|
image = tf.io.decode_jpeg(image, channels=channels)
|
||||||
|
image = tf.image.convert_image_dtype(image, tf.float32)
|
||||||
|
image = tf.image.resize_with_pad(image, target_height=img_height, target_width=img_width)
|
||||||
|
return image
|
||||||
|
|
||||||
|
def top_k_predictions(img, k=2):
|
||||||
|
image_batch = tf.expand_dims(img, 0)
|
||||||
|
predictions = model(image_batch)
|
||||||
|
probs = tf.nn.softmax(predictions, axis=-1)
|
||||||
|
top_probs, top_idxs = tf.math.top_k(input=probs, k=k)
|
||||||
|
|
||||||
|
top_labels = [class_names[idx.numpy()] for idx in top_idxs[0]]
|
||||||
|
|
||||||
|
return top_labels, top_probs[0]
|
||||||
|
|
||||||
|
# Load img
|
||||||
|
img_name_tensors = {}
|
||||||
|
|
||||||
|
for images, labels in test_ds:
|
||||||
|
for i, class_name in enumerate(class_names):
|
||||||
|
class_idx = class_names.index(class_name)
|
||||||
|
mask = labels == class_idx
|
||||||
|
|
||||||
|
if tf.reduce_any(mask):
|
||||||
|
img_name_tensors[class_name] = images[mask][0] / 255.0
|
||||||
|
|
||||||
|
|
||||||
|
# Show img with prediction
|
||||||
|
plt.figure(figsize=(14, 12))
|
||||||
|
num_images = len(img_name_tensors)
|
||||||
|
cols = 2
|
||||||
|
rows = math.ceil(num_images / cols)
|
||||||
|
|
||||||
|
for n, (name, img_tensor) in enumerate(img_name_tensors.items()):
|
||||||
|
ax = plt.subplot(rows, cols, n+1)
|
||||||
|
ax.imshow(img_tensor)
|
||||||
|
|
||||||
|
pred_labels, pred_probs = top_k_predictions(img_tensor, k=4)
|
||||||
|
|
||||||
|
pred_text = f"Real classe: {name}\n\nPredictions:\n"
|
||||||
|
for label, prob in zip(pred_labels, pred_probs):
|
||||||
|
pred_text += f"{label}: {prob.numpy():0.1%}\n"
|
||||||
|
|
||||||
|
ax.axis('off')
|
||||||
|
ax.text(-0.5, 0.95, pred_text, ha='left', va='top', transform=ax.transAxes)
|
||||||
|
|
||||||
|
plt.tight_layout()
|
||||||
|
plt.show()
|
||||||
|
|
||||||
|
# Calculate Integrated Gradients
def f(x):
    # A simplified model function.
    return tf.where(x < 0.8, x, 0.8)

def interpolated_path(x):
    # A straight line path.
    return tf.zeros_like(x)

x = tf.linspace(start=0.0, stop=1.0, num=6)
y = f(x)

# Establish a baseline
baseline = tf.zeros(shape=(224, 224, 3))

m_steps = 50
alphas = tf.linspace(start=0.0, stop=1.0, num=m_steps + 1)  # Generate m_steps intervals for integral_approximation() below.

def interpolate_images(baseline, image, alphas):
    alphas_x = alphas[:, tf.newaxis, tf.newaxis, tf.newaxis]
    baseline_x = tf.expand_dims(baseline, axis=0)
    input_x = tf.expand_dims(image, axis=0)
    delta = input_x - baseline_x
    images = baseline_x + alphas_x * delta
    return images
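`interpolate_images` relies on broadcasting: `alphas` of shape `(m_steps+1,)` gets three trailing axes so it can scale a `(1, H, W, C)` delta into a whole stack of interpolated images. A NumPy sketch on a tiny 2×2 RGB "image" (hypothetical data) shows the shapes involved:

```python
import numpy as np

baseline = np.zeros((2, 2, 3))
image = np.ones((2, 2, 3))
alphas = np.linspace(0.0, 1.0, num=5)           # 5 interpolation steps

alphas_x = alphas[:, None, None, None]          # shape (5, 1, 1, 1)
delta = image[None, ...] - baseline[None, ...]  # shape (1, 2, 2, 3)
interpolated = baseline[None, ...] + alphas_x * delta  # broadcasts to (5, 2, 2, 3)

print(interpolated.shape)                # (5, 2, 2, 3)
print(float(interpolated[2, 0, 0, 0]))  # 0.5 -- halfway along the path
```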
def compute_gradients(images, target_class_idx):
    with tf.GradientTape() as tape:
        tape.watch(images)
        logits = model(images)
        probs = tf.nn.softmax(logits, axis=-1)[:, target_class_idx]
    return tape.gradient(probs, images)

def integral_approximation(gradients):
    # riemann_trapezoidal
    grads = (gradients[:-1] + gradients[1:]) / tf.constant(2.0)
    integrated_gradients = tf.math.reduce_mean(grads, axis=0)
    return integrated_gradients
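`integral_approximation` is the Riemann trapezoidal rule: average adjacent gradient samples, then take the mean along the path. On a gradient with a known closed-form integral the approximation can be checked directly; a NumPy sketch, assuming gradients g(α) = 2α whose integral over [0, 1] is exactly 1:

```python
import numpy as np

alphas = np.linspace(0.0, 1.0, num=51)  # m_steps + 1 sample points
grads = 2.0 * alphas                    # pretend gradients: g(alpha) = 2 * alpha

# Same computation as integral_approximation():
trapezoid = (grads[:-1] + grads[1:]) / 2.0
approx_integral = trapezoid.mean()

print(approx_integral)  # 1.0 (exact here, since g is linear in alpha)
```

The trapezoidal rule is exact for linear integrands, which is why the toy result matches the true integral; for the CNN's gradients it is only an approximation that improves with `m_steps`.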
# Putting it all together
def integrated_gradients(baseline,
                         image,
                         target_class_idx,
                         m_steps=50,
                         batch_size=32):
    # Generate alphas.
    alphas = tf.linspace(start=0.0, stop=1.0, num=m_steps + 1)

    # Collect gradients.
    gradient_batches = []

    # Iterate over the alphas range and batch the computation for speed,
    # memory efficiency, and scaling to larger m_steps.
    for alpha in tf.range(0, len(alphas), batch_size):
        from_ = alpha
        to = tf.minimum(from_ + batch_size, len(alphas))
        alpha_batch = alphas[from_:to]

        gradient_batch = one_batch(baseline, image, alpha_batch, target_class_idx)
        gradient_batches.append(gradient_batch)

    # Concatenate path gradients together row-wise into a single tensor.
    total_gradients = tf.concat(gradient_batches, axis=0)

    # Integral approximation through averaging gradients.
    avg_gradients = integral_approximation(gradients=total_gradients)

    # Scale integrated gradients with respect to the input.
    integrated_gradients = (image - baseline) * avg_gradients

    return integrated_gradients

@tf.function
def one_batch(baseline, image, alpha_batch, target_class_idx):
    # Generate interpolated inputs between baseline and input.
    interpolated_path_input_batch = interpolate_images(baseline=baseline,
                                                       image=image,
                                                       alphas=alpha_batch)

    # Compute gradients between model outputs and interpolated inputs.
    gradient_batch = compute_gradients(images=interpolated_path_input_batch,
                                       target_class_idx=target_class_idx)
    return gradient_batch
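One way to sanity-check this pipeline is the completeness property of Integrated Gradients: attributions summed over all features should approximate `f(image) - f(baseline)`. A self-contained NumPy sketch on a toy quadratic function (standing in for the CNN, which is too heavy to run here):

```python
import numpy as np

def f(x):       # toy scalar "model": f(x) = sum(x^2)
    return np.sum(x ** 2)

def grad_f(x):  # its analytic gradient: 2x
    return 2.0 * x

baseline = np.zeros(4)
image = np.array([0.1, 0.4, 0.7, 1.0])
m_steps = 200
alphas = np.linspace(0.0, 1.0, num=m_steps + 1)

# Gradients along the straight-line path, then trapezoidal averaging,
# mirroring interpolate_images() + compute_gradients() + integral_approximation().
path_grads = np.stack([grad_f(baseline + a * (image - baseline)) for a in alphas])
avg_grads = ((path_grads[:-1] + path_grads[1:]) / 2.0).mean(axis=0)
attributions = (image - baseline) * avg_grads

print(attributions.sum())      # ≈ 1.66
print(f(image) - f(baseline))  # ≈ 1.66 -- completeness holds
```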
# Visualize attributions
def plot_img_attributions(baseline,
                          image,
                          target_class_idx,
                          m_steps=50,
                          cmap=None,
                          overlay_alpha=0.4):

    attributions = integrated_gradients(baseline=baseline,
                                        image=image,
                                        target_class_idx=target_class_idx,
                                        m_steps=m_steps)

    # Sum the attributions across color channels for visualization.
    # The attribution mask is a grayscale image with the same height and
    # width as the original image.
    attribution_mask = tf.reduce_sum(tf.math.abs(attributions), axis=-1)

    fig, axs = plt.subplots(nrows=2, ncols=2, squeeze=False, figsize=(8, 8))

    axs[0, 0].set_title('Baseline image')
    axs[0, 0].imshow(baseline)
    axs[0, 0].axis('off')

    axs[0, 1].set_title('Original image')
    axs[0, 1].imshow(image)
    axs[0, 1].axis('off')

    axs[1, 0].set_title('Attribution mask')
    axs[1, 0].imshow(attribution_mask, cmap=cmap)
    axs[1, 0].axis('off')

    axs[1, 1].set_title('Overlay')
    axs[1, 1].imshow(attribution_mask, cmap=cmap)
    axs[1, 1].imshow(image, alpha=overlay_alpha)
    axs[1, 1].axis('off')

    plt.tight_layout()
    return fig

_ = plot_img_attributions(image=img_name_tensors['Leaf_Blight'],
                          baseline=baseline,
                          target_class_idx=3,
                          m_steps=240,
                          cmap=plt.cm.inferno,
                          overlay_alpha=0.4)
plt.show()


"""
@ref:
https://www.tensorflow.org/tutorials/interpretability/integrated_gradients?hl=en
"""
263
venv/src/gradient.py.bak
Normal file

@@ -0,0 +1,263 @@
import matplotlib.pylab as plt
import numpy as np
import tensorflow as tf
import os
import math
from tensorflow.keras.models import load_model

from data_pretreat import *  # src/ function

#model = load_model('/home/jhodi/bit/Python/Grapevine_Pathology_Detection/venv/models/2026-03-14_21:21:01.562003/model.keras')
model_dir = input("Model dir : ")
model = load_model(model_dir + "/model.keras")

model.build([None, 224, 224, 3])
model.summary()

# img_height = 224
# img_width = 224
# batch_size = 32
#
# test_dir = os.getcwd()[:-9]+"/data/datasplit/test"
# class_names = ['Black_Rot', 'ESCA', 'Healthy', 'Leaf_Blight']

def read_image(file_name):
    image = tf.io.read_file(file_name)
    image = tf.io.decode_jpeg(image, channels=channels)
    image = tf.image.convert_image_dtype(image, tf.float32)
    image = tf.image.resize_with_pad(image, target_height=img_height, target_width=img_width)
    return image

def top_k_predictions(img, k=2):
    image_batch = tf.expand_dims(img, 0)
    predictions = model(image_batch)
    probs = tf.nn.softmax(predictions, axis=-1)
    top_probs, top_idxs = tf.math.top_k(input=probs, k=k)

    top_labels = [class_names[idx.numpy()] for idx in top_idxs[0]]

    return top_labels, top_probs[0]

# Load img
img_name_tensors = {}

for images, labels in test_ds:
    for i, class_name in enumerate(class_names):
        class_idx = class_names.index(class_name)
        mask = labels == class_idx

        if tf.reduce_any(mask):
            img_name_tensors[class_name] = images[mask][0] / 255.0

# Show img with prediction
# plt.figure(figsize=(14, 12))
# num_images = len(img_name_tensors)
# cols = 2
# rows = math.ceil(num_images / cols)
#
# for n, (name, img_tensor) in enumerate(img_name_tensors.items()):
#     ax = plt.subplot(rows, cols, n + 1)
#     ax.imshow(img_tensor)
#
#     pred_labels, pred_probs = top_k_predictions(img_tensor, k=2)
#
#     pred_text = f"Real class: {name}\n\nPredictions:\n"
#     for label, prob in zip(pred_labels, pred_probs):
#         pred_text += f"{label}: {prob.numpy():0.1%}\n"
#
#     ax.set_title(pred_text, fontsize=10, fontweight='bold')
#     ax.axis('off')
#
# plt.tight_layout()
# plt.show()

# Calculate Integrated Gradients

def f(x):
    # A simplified model function.
    return tf.where(x < 0.8, x, 0.8)

def interpolated_path(x):
    # A straight line path.
    return tf.zeros_like(x)

x = tf.linspace(start=0.0, stop=1.0, num=6)
y = f(x)

# Establish a baseline
baseline = tf.zeros(shape=(224, 224, 3))
# plt.imshow(baseline)
# plt.title("Baseline")
# plt.axis('off')
# plt.show()

m_steps = 50
alphas = tf.linspace(start=0.0, stop=1.0, num=m_steps + 1)  # Generate m_steps intervals for integral_approximation() below.

def interpolate_images(baseline, image, alphas):
    alphas_x = alphas[:, tf.newaxis, tf.newaxis, tf.newaxis]
    baseline_x = tf.expand_dims(baseline, axis=0)
    input_x = tf.expand_dims(image, axis=0)
    delta = input_x - baseline_x
    images = baseline_x + alphas_x * delta
    return images

interpolated_images = interpolate_images(
    baseline=baseline,
    image=img_name_tensors['Leaf_Blight'],  # class index: 3
    alphas=alphas)

# fig = plt.figure(figsize=(20, 20))
#
# i = 0
# for alpha, image in zip(alphas[0::10], interpolated_images[0::10]):
#     i += 1
#     plt.subplot(1, len(alphas[0::10]), i)
#     plt.title(f'alpha: {alpha:.1f}')
#     plt.imshow(image)
#     plt.axis('off')
#
# plt.tight_layout();

def compute_gradients(images, target_class_idx):
    with tf.GradientTape() as tape:
        tape.watch(images)
        logits = model(images)
        probs = tf.nn.softmax(logits, axis=-1)[:, target_class_idx]
    return tape.gradient(probs, images)

path_gradients = compute_gradients(
    images=interpolated_images,
    target_class_idx=3)

print(path_gradients.shape)

pred = model(interpolated_images)
pred_proba = tf.nn.softmax(pred, axis=-1)[:, 3]

def integral_approximation(gradients):
    # riemann_trapezoidal
    grads = (gradients[:-1] + gradients[1:]) / tf.constant(2.0)
    integrated_gradients = tf.math.reduce_mean(grads, axis=0)
    return integrated_gradients

ig = integral_approximation(
    gradients=path_gradients)

print(ig.shape)

# Putting it all together
def integrated_gradients(baseline,
                         image,
                         target_class_idx,
                         m_steps=50,
                         batch_size=32):
    # Generate alphas.
    alphas = tf.linspace(start=0.0, stop=1.0, num=m_steps + 1)

    # Collect gradients.
    gradient_batches = []

    # Iterate over the alphas range and batch the computation for speed,
    # memory efficiency, and scaling to larger m_steps.
    for alpha in tf.range(0, len(alphas), batch_size):
        from_ = alpha
        to = tf.minimum(from_ + batch_size, len(alphas))
        alpha_batch = alphas[from_:to]

        gradient_batch = one_batch(baseline, image, alpha_batch, target_class_idx)
        gradient_batches.append(gradient_batch)

    # Concatenate path gradients together row-wise into a single tensor.
    total_gradients = tf.concat(gradient_batches, axis=0)

    # Integral approximation through averaging gradients.
    avg_gradients = integral_approximation(gradients=total_gradients)

    # Scale integrated gradients with respect to the input.
    integrated_gradients = (image - baseline) * avg_gradients

    return integrated_gradients

@tf.function
def one_batch(baseline, image, alpha_batch, target_class_idx):
    # Generate interpolated inputs between baseline and input.
    interpolated_path_input_batch = interpolate_images(baseline=baseline,
                                                       image=image,
                                                       alphas=alpha_batch)

    # Compute gradients between model outputs and interpolated inputs.
    gradient_batch = compute_gradients(images=interpolated_path_input_batch,
                                       target_class_idx=target_class_idx)
    return gradient_batch

ig_attributions = integrated_gradients(baseline=baseline,
                                       image=img_name_tensors['Leaf_Blight'],
                                       target_class_idx=3,
                                       m_steps=240)

print(ig_attributions.shape)

# Visualize attributions
def plot_img_attributions(baseline,
                          image,
                          target_class_idx,
                          m_steps=50,
                          cmap=None,
                          overlay_alpha=0.4):

    attributions = integrated_gradients(baseline=baseline,
                                        image=image,
                                        target_class_idx=target_class_idx,
                                        m_steps=m_steps)

    # Sum the attributions across color channels for visualization.
    # The attribution mask is a grayscale image with the same height and
    # width as the original image.
    attribution_mask = tf.reduce_sum(tf.math.abs(attributions), axis=-1)

    fig, axs = plt.subplots(nrows=2, ncols=2, squeeze=False, figsize=(8, 8))

    axs[0, 0].set_title('Baseline image')
    axs[0, 0].imshow(baseline)
    axs[0, 0].axis('off')

    axs[0, 1].set_title('Original image')
    axs[0, 1].imshow(image)
    axs[0, 1].axis('off')

    axs[1, 0].set_title('Attribution mask')
    axs[1, 0].imshow(attribution_mask, cmap=cmap)
    axs[1, 0].axis('off')

    axs[1, 1].set_title('Overlay')
    axs[1, 1].imshow(attribution_mask, cmap=cmap)
    axs[1, 1].imshow(image, alpha=overlay_alpha)
    axs[1, 1].axis('off')

    plt.tight_layout()
    return fig

_ = plot_img_attributions(image=img_name_tensors['Leaf_Blight'],
                          baseline=baseline,
                          target_class_idx=3,
                          m_steps=240,
                          cmap=plt.cm.inferno,
                          overlay_alpha=0.4)
plt.show()

_ = plot_img_attributions(image=img_name_tensors['ESCA'],
                          baseline=baseline,
                          target_class_idx=1,
                          m_steps=55,
                          cmap=plt.cm.viridis,
                          overlay_alpha=0.5)
plt.show()

"""
@ref:
https://www.tensorflow.org/tutorials/interpretability/integrated_gradients?hl=en
"""
46
venv/src/load_model.py
Normal file

@@ -0,0 +1,46 @@
import os
from tensorflow.keras.models import load_model
from data_pretreat import img_height, img_width, channels
import sys

def select_model():
    # all_model_dir = "/home/jhodi/bit/Python/Grapevine_Pathology_Detection/venv/models"
    # Verify that a model is present in all_model_dir
    while True:
        try:
            all_model_dir = input("Model dir : ")
            model_found = 0
            for foldername, subfolders, filenames in os.walk(all_model_dir):
                for filename in filenames:
                    if filename.endswith(".keras"):
                        model_found += 1
            if model_found == 0:
                print("No model found!")
            else:
                print("Models found.")
                break
        except Exception as e:
            print(f"Something went wrong! {str(e)}")

    subdirectories = [name for name in os.listdir(all_model_dir) if os.path.isdir(os.path.join(all_model_dir, name))]
    print("Select a model:")
    for idx, dir_ in enumerate(subdirectories):
        print(f"({idx})\t{dir_}")

    while True:
        try:
            selected_model = int(input("-> "))
            if 0 <= selected_model < len(subdirectories):
                break
            else:
                print("Invalid choice. Please choose a valid number.")
        except ValueError:
            print("That's not a valid number!")

    model_dir = os.path.join(all_model_dir, subdirectories[selected_model])

    model = load_model(os.path.join(model_dir, "model.keras"))
    model.build([None, img_height, img_width, channels])
    model.summary()

    return model, model_dir
163
venv/src/main.py.bak
Executable file

@@ -0,0 +1,163 @@
"""
|
||||||
|
@autour: Jhodi Avizara
|
||||||
|
"""
|
||||||
|
|
||||||
|
import os
|
||||||
|
import datetime
|
||||||
|
import matplotlib.pyplot as plt
|
||||||
|
import numpy as np
|
||||||
|
import PIL
|
||||||
|
import tensorflow as tf
|
||||||
|
from tensorflow import keras
|
||||||
|
from tensorflow.keras import layers
|
||||||
|
from tensorflow.keras.models import Sequential
|
||||||
|
|
||||||
|
|
||||||
|
print("\nNum GPUs Available: ", len(tf.config.list_physical_devices('GPU')), "\n")
|
||||||
|
#os.environ['CUDA_VISIBLE_DEVICES'] = '-1' # bypass GPU issues
|
||||||
|
current_dir = os.getcwd()
|
||||||
|
|
||||||
|
batch_size = 32
|
||||||
|
img_height = 224
|
||||||
|
img_width = 224
|
||||||
|
epochs=100
|
||||||
|
|
||||||
|
data_dir = current_dir[:-9]+"/data/"
|
||||||
|
|
||||||
|
train_ds = tf.keras.utils.image_dataset_from_directory(
|
||||||
|
data_dir,
|
||||||
|
validation_split=0.2,
|
||||||
|
subset="training",
|
||||||
|
seed=123,
|
||||||
|
image_size=(img_height, img_width),
|
||||||
|
batch_size=batch_size)
|
||||||
|
|
||||||
|
val_ds = tf.keras.utils.image_dataset_from_directory(
|
||||||
|
data_dir,
|
||||||
|
validation_split=0.2,
|
||||||
|
subset="validation",
|
||||||
|
seed=123,
|
||||||
|
image_size=(img_height, img_width),
|
||||||
|
batch_size=batch_size)
|
||||||
|
|
||||||
|
class_names = train_ds.class_names
|
||||||
|
print(class_names)
|
||||||
|
|
||||||
|
# Visualize data
|
||||||
|
|
||||||
|
#plt.figure(figsize=(10, 10))
|
||||||
|
#for images, labels in train_ds.take(1):
|
||||||
|
# for i in range(9):
|
||||||
|
# ax = plt.subplot(3, 3, i + 1)
|
||||||
|
# plt.imshow(images[i].numpy().astype("uint8"))
|
||||||
|
# plt.title(class_names[labels[i]])
|
||||||
|
# plt.axis("off")
|
||||||
|
#plt.show()
|
||||||
|
|
||||||
|
#Data augmentation
|
||||||
|
data_augmentation = keras.Sequential(
|
||||||
|
[
|
||||||
|
layers.RandomFlip("horizontal_and_vertical",
|
||||||
|
input_shape=(img_height,
|
||||||
|
img_width,
|
||||||
|
3)),
|
||||||
|
layers.RandomRotation(0.2),
|
||||||
|
layers.RandomZoom(0.1),
|
||||||
|
layers.Rescaling(1./255)
|
||||||
|
]
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# Configure for Performance
|
||||||
|
AUTOTUNE = tf.data.AUTOTUNE
|
||||||
|
train_ds = train_ds.cache().shuffle(1000).prefetch(buffer_size=AUTOTUNE)
|
||||||
|
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)
|
||||||
|
|
||||||
|
# Pretreatment
|
||||||
|
normalization_layer = layers.Rescaling(1./255)
|
||||||
|
normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
|
||||||
|
image_batch, labels_batch = next(iter(normalized_ds))
|
||||||
|
first_image = image_batch[0]
|
||||||
|
|
||||||
|
# Create a model
|
||||||
|
num_classes = len(class_names)
|
||||||
|
|
||||||
|
model = Sequential([
|
||||||
|
data_augmentation,
|
||||||
|
layers.Rescaling(1./255, input_shape=(img_height, img_width, 3)),
|
||||||
|
layers.Conv2D(16, 3, padding='same', activation='relu'),
|
||||||
|
layers.MaxPooling2D(),
|
||||||
|
layers.Conv2D(32, 3, padding='same', activation='relu'),
|
||||||
|
layers.MaxPooling2D(),
|
||||||
|
layers.Conv2D(64, 3, padding='same', activation='relu'),
|
||||||
|
layers.MaxPooling2D(),
|
||||||
|
layers.Flatten(),
|
||||||
|
layers.Dense(128, activation='relu'),
|
||||||
|
layers.Dense(num_classes)
|
||||||
|
])
|
||||||
|
|
||||||
|
model.compile(optimizer='adam',
|
||||||
|
loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
|
||||||
|
metrics=['accuracy'])
|
||||||
|
|
||||||
|
|
||||||
|
model.summary()
|
||||||
|
|
||||||
|
# Training
|
||||||
|
time = datetime.datetime.now() # Time checkpoint
|
||||||
|
|
||||||
|
epochs=epochs
|
||||||
|
history = model.fit(
|
||||||
|
normalized_ds,
|
||||||
|
validation_data= val_ds,
|
||||||
|
epochs= epochs,
|
||||||
|
steps_per_epoch= 10
|
||||||
|
)
|
||||||
|
|
||||||
|
# Visualize results
|
||||||
|
acc = history.history['accuracy']
|
||||||
|
val_acc = history.history['val_accuracy']
|
||||||
|
|
||||||
|
loss = history.history['loss']
|
||||||
|
val_loss = history.history['val_loss']
|
||||||
|
|
||||||
|
epochs_range = range(epochs)
|
||||||
|
|
||||||
|
"""
|
||||||
|
## Validation Acc
|
||||||
|
plt.figure(figsize=(8, 8))
|
||||||
|
plt.subplot(1, 2, 1)
|
||||||
|
plt.plot(epochs_range, acc, label='Training Accuracy')
|
||||||
|
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
|
||||||
|
plt.legend(loc='lower right')
|
||||||
|
plt.title('Training and Validation Accuracy')
|
||||||
|
|
||||||
|
## Validation Loss
|
||||||
|
plt.subplot(1, 2, 2)
|
||||||
|
plt.plot(epochs_range, loss, label='Training Loss')
|
||||||
|
plt.plot(epochs_range, val_loss, label='Validation Loss')
|
||||||
|
plt.legend(loc='upper right')
|
||||||
|
plt.title('Training and Validation Loss')
|
||||||
|
plt.show()
|
||||||
|
"""
|
||||||
|
|
||||||
|
# save model
|
||||||
|
model.save(current_dir[:-4]+"/models/"+str(time)+".keras")
|
||||||
|
|
||||||
|
|
||||||
|
# Convert the model.
|
||||||
|
converter = tf.lite.TFLiteConverter.from_keras_model(model)
|
||||||
|
tflite_model = converter.convert()
|
||||||
|
|
||||||
|
with open(current_dir[:-4]+"/models/model"+str(time)+".tflite", 'wb') as f:
|
||||||
|
f.write(tflite_model) # Save the model.
|
||||||
|
|
||||||
|
# Attribution Mask
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
"""
|
||||||
|
@references:
|
||||||
|
https://www.tensorflow.org/tutorials/images/classification?hl=fr
|
||||||
|
https://www.tensorflow.org/lite/convert?hl=fr
|
||||||
|
"""
|
||||||
94
venv/src/model_train.py
Normal file

@@ -0,0 +1,94 @@
from data_pretreat import *
import datetime
import os
import pandas as pd

# Create a model
num_classes = len(class_names)

model = Sequential([
    data_augmentation,

    # Block 1
    layers.Conv2D(32, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.Conv2D(32, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.MaxPooling2D(pool_size=2),
    layers.Dropout(0.25),

    # Block 2
    layers.Conv2D(64, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.Conv2D(64, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.MaxPooling2D(pool_size=2),
    layers.Dropout(0.25),

    # Block 3
    layers.Conv2D(128, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.Conv2D(128, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.MaxPooling2D(pool_size=2),
    layers.Dropout(0.25),

    # Block 4
    layers.Conv2D(256, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.Conv2D(256, kernel_size=3, padding='same', activation='relu'),
    layers.BatchNormalization(),
    layers.MaxPooling2D(pool_size=2),
    layers.Dropout(0.25),

    # Classification head
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation='relu'),
    layers.BatchNormalization(),
    layers.Dropout(0.5),
    layers.Dense(128, activation='relu'),
    layers.BatchNormalization(),
    layers.Dropout(0.5),
    layers.Dense(num_classes)
])

optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)

model.compile(optimizer=optimizer,
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])
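Note that the final `Dense(num_classes)` layer has no activation, so the model emits raw logits; `SparseCategoricalCrossentropy(from_logits=True)` applies the softmax internally before taking the negative log-probability of the true class. A NumPy sketch of what the loss computes for a single example (toy logit values, hypothetical):

```python
import numpy as np

logits = np.array([2.0, 0.5, -1.0, 0.0])  # raw outputs of Dense(num_classes)
true_class = 0                            # sparse integer label

# Softmax applied internally when from_logits=True (shifted for numerical stability):
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Sparse categorical cross-entropy: negative log-probability of the true class.
loss = -np.log(probs[true_class])

print(probs.argmax(), float(loss))
```

Keeping the softmax inside the loss (rather than as a model layer) is the numerically stabler arrangement Keras recommends for training.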
model.summary()

# Training
time = datetime.datetime.now()  # Time checkpoint
time = str(time).replace(" ", "_")

epochs = epochs
history = model.fit(
    normalized_ds,
    validation_data=val_ds,
    epochs=epochs  # ,steps_per_epoch=10
)

# Export history as csv
new_path = current_dir[:-4] + "/models/" + str(time)
os.makedirs(new_path)

df = pd.DataFrame({
    'epoch': range(1, epochs + 1),
    'accuracy': history.history['accuracy'],
    'val_accuracy': history.history['val_accuracy'],
    'loss': history.history['loss'],
    'val_loss': history.history['val_loss']
})
df.to_csv(new_path + "/training_history.csv", index=False)

# Save model
model.save(new_path + "/model.keras")

# Convert the model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open(new_path + "/model.tflite", 'wb') as f:
    f.write(tflite_model)