I am training a convolutional neural network, but the confusion matrix always comes out the same way: all the predictions pile up in a single column. It is not always the first column (it can be any of them), but there is always one column that absorbs all the data.
The classes are imbalanced, so I have tried several class-weighting methods with the fit generator:
import numpy as np
from collections import Counter
from sklearn.utils import class_weight

# Manual weights: 1 minus each class's relative frequency
weight_dict = {
    0: 1 - 0.73636,
    1: 1 - 0.069197,
    2: 1 - 0.15062,
    3: 1 - 0.02437,
    4: 1 - 0.01943,
}

# sklearn's "balanced" weights (computed from the training labels,
# not the validation ones)
class_weights1 = class_weight.compute_class_weight(
    class_weight='balanced',
    classes=np.unique(train_generator.classes),
    y=train_generator.classes,
)

# Ratio of the largest class count to each class's count
counter = Counter(train_generator.classes)
max_val = float(max(counter.values()))
class_weights2 = {class_id: max_val / num_images
                  for class_id, num_images in counter.items()}
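To see how the last two schemes differ, here is a minimal, self-contained sketch on a made-up label array (standing in for `train_generator.classes`); the "balanced" formula below is the same one sklearn's `compute_class_weight('balanced', ...)` uses, n_samples / (n_classes * count):

```python
import numpy as np
from collections import Counter

# Hypothetical labels: 80 samples of class 0, 15 of class 1, 5 of class 2
labels = np.array([0] * 80 + [1] * 15 + [2] * 5)

# "balanced" weights: n_samples / (n_classes * count_per_class)
counts = np.bincount(labels)
n_classes = len(counts)
balanced = {c: len(labels) / (n_classes * counts[c]) for c in range(n_classes)}

# Ratio weights: largest class count divided by each class's count,
# so the majority class gets 1.0 and rarer classes get larger weights
counter = Counter(labels.tolist())
max_val = float(max(counter.values()))
ratio_weights = {c: max_val / n for c, n in counter.items()}

print(balanced)       # minority classes get weights well above 1
print(ratio_weights)  # {0: 1.0, ...}; class 2 is weighted 80/5 = 16.0
```

Either dictionary can then be passed as the `class_weight` argument of Keras's `fit` / `fit_generator`, which scales each sample's contribution to the loss by its class weight.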
I think this is precisely related to the class imbalance. Has something like this ever happened to you? Do you know what the reason could be?
What is happening is that your neural network predicts only one class, so all the data in the confusion matrix ends up in a single column. In your example, the network assigns every sample to class 1.
This happens when the network fails to learn patterns that differentiate the classes from one another.
The possible causes are very diverse, and a complete, longer analysis of the problem would be needed to pin one down. Some common ones are:
- class imbalance: the majority class dominates the loss, so predicting it everywhere is a cheap, low-loss solution;
- a learning rate that is too high, which can drive training into a trivial constant prediction;
- a model without enough capacity, or input features that do not actually separate the classes;
- bugs in the data pipeline, e.g. labels misaligned with the images the generator yields.
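As an illustration (with made-up labels, not your data), a collapsed classifier's confusion matrix has exactly one nonzero column, which is a quick check you can automate:

```python
import numpy as np

# Hypothetical ground truth; a collapsed model predicts class 1 for everything
y_true = np.array([0, 1, 2, 1, 0, 2, 1])
y_pred = np.full_like(y_true, 1)

# Build the confusion matrix by hand (rows: true class, cols: predicted class)
n = 3
cm = np.zeros((n, n), dtype=int)
for t, p in zip(y_true, y_pred):
    cm[t, p] += 1

print(cm)

# A single nonzero column is the signature of a collapsed classifier
collapsed = np.count_nonzero(cm.sum(axis=0)) == 1
print(collapsed)  # True
```

If this flag turns `True` during training, it is worth checking the class weights, the learning rate, and the label pipeline before anything else.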