
Confusion Matrix weird results? #1665

@a-esp-1

Description


❔Question

Hi,

I've seen you've updated the repo (#1474) and added the Confusion Matrix. When I run it with my own dataset I get this confusion matrix:

[image: confusion_matrix]

My mAP is 0.84 for this dataset, and for example for the Airplane class, AP = 0.95.

[image: ap]

So I can't understand how it's possible that in the Confusion Matrix [row=Airplane, column=Airplane] = 0.26, [row=Airplane, column=FN] = 0 and [row=FP, column=Airplane] = 0.73.

As I understand it, this means that only 26% of Airplane detections are correct (TP), but that can't be right, because AP(Airplane) = 0.95 and when I run inference, airplanes are detected well.
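To illustrate the apparent contradiction, here is a minimal sketch (with made-up counts, not taken from the actual run) of how a column-normalized confusion matrix can show a low TP cell for a class that still has high AP: if the matrix is built from all detections above a low confidence threshold, many low-confidence background false positives can dominate a column, even though the high-confidence detections that drive AP are mostly correct. The function name and counts below are hypothetical.

```python
def normalize_column(tp_count, fp_count):
    """Normalize one predicted-class column of a confusion matrix
    so its cells sum to 1, as in a column-normalized plot."""
    total = tp_count + fp_count
    return tp_count / total, fp_count / total

# Assumed counts: 26 true positives and 74 background false positives
# survive the confidence threshold for the "Airplane" column.
tp_frac, fp_frac = normalize_column(tp_count=26, fp_count=74)
print(f"TP fraction: {tp_frac:.2f}, FP fraction: {fp_frac:.2f}")
```

With these assumed counts the normalized column reads 0.26 / 0.74, close to the values in the screenshot, while AP (which ranks detections by confidence) could still be near 0.95 if the false positives sit at low confidence.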

Additional context

I'm getting this weird result not only with this dataset but also with two others.

Metadata

Labels: Stale (stale and scheduled for closing soon), question (further information is requested)
