❔Question
Hi,
I've seen that you updated the repo (#1474) and added the Confusion Matrix. When I run it on my own dataset, I get this confusion matrix:
My mAP is 0.84 for this dataset, and, for example, for the Airplane class AP = 0.95.
So I can't understand how it is possible that in the confusion matrix [row=Airplane, column=Airplane] = 0.26, [row=Airplane, column=FN] = 0, and [row=FP, column=Airplane] = 0.73.
As far as I understand, this means that only 26% of the detections are correct (TP), but that can't be right, because AP(Airplane) = 0.95, and when I run detection the airplanes are detected well.
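If I understand correctly, the plotted values are the raw counts normalized per column, so each true-class column sums to 1 and a large background/FP count in a column pushes the diagonal fraction down. Here is a minimal sketch (not the repo's actual code) with purely hypothetical counts that reproduces the Airplane column I'm seeing:

```python
import numpy as np

# Rows = predicted class, columns = true class; the last row stands for the
# background (FP) entries, mirroring the plot layout. All counts are
# hypothetical, chosen only to reproduce the reported Airplane column.
classes = ["Airplane", "Car", "background"]
counts = np.array([
    [26.0,  3.0, 40.0],  # predicted Airplane
    [ 1.0, 55.0, 10.0],  # predicted Car
    [73.0, 12.0,  0.0],  # background (FP) row
])

# Divide each column by its sum so every true-class column adds up to 1.
normalized = counts / counts.sum(axis=0, keepdims=True)
print(normalized[:, 0])  # Airplane column -> [0.26 0.01 0.73]
```

So with counts like these, 26 correct matches plus 73 background entries in the same column would give exactly the 0.26 / 0.73 split, even if recall and AP are high. Is that the intended reading of the matrix?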
Additional context
I'm getting this weird result not only with this dataset, but also with 2 others.