
About not being able to customize the anchors #7015

@jayer95

Description


Question

I'm training a model for license plate character detection. This is the training command I use:

python train.py \
  --img 160 \
  --batch 256 \
  --epochs 300 \
  --data data/number_plate_00_02_08_09_18_19_all_mixing.yaml \
  --weights '' \
  --cfg models/yolov5n-number_plate_00_02_08_09_18_19_all_mixing_160.yaml \
  --hyp data/hyps/hyp.scratch-low-number_plate_00_02_08_09_18_19_all_mixing.yaml

When training finished, I found that the model still uses the default anchors; they were not recalculated automatically.

import torch
from models.experimental import attempt_load

# load the trained checkpoint on CPU and grab the Detect() head
model = attempt_load('./yolov5n-number_plate_00_02_08_09_18_19_all_mixing_best_0317_160x160.pt', map_location=torch.device('cpu'))
m = model.module.model[-1] if hasattr(model, 'module') else model.model[-1]
print(m.anchor_grid)  # pixel-space anchors per detection layer
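
It may also help to print the stride-normalized anchors buffer next to anchor_grid, since the two are stored at different scales. A minimal sketch, assuming a standard Detect() head where m.anchors is kept divided by the per-level strides:

import torch
from models.experimental import attempt_load

model = attempt_load('./yolov5n-number_plate_00_02_08_09_18_19_all_mixing_best_0317_160x160.pt',
                     map_location=torch.device('cpu'))
m = model.module.model[-1] if hasattr(model, 'module') else model.model[-1]  # Detect() head

print(m.stride)                             # per-level strides, e.g. tensor([ 8., 16., 32.])
print(m.anchors)                            # stride-normalized (nl, na, 2) anchors buffer
print(m.anchors * m.stride.view(-1, 1, 1))  # rescaled back to pixel space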

anchors:

  • [10,13, 16,30, 33,23] # P3/8
  • [30,61, 62,45, 59,119] # P4/16
  • [116,90, 156,198, 373,326] # P5/32

I don't think this is reasonable; the model is very bad at detecting small objects.

Since AutoAnchor did not run during training, I recalculated the anchors on my dataset:

import utils.autoanchor as autoAC

# 9 anchors, img_size=160, thr=5.0, 5000 generations of evolution, verbose output
new_anchors = autoAC.kmean_anchors('./data/number_plate_00_02_08_09_18_19_all_mixing.yaml', 9, 160, 5.0, 5000, True)
print(new_anchors)

This gives the following new anchors:

  • [17,21, 19,35, 18,40] # P3/8
  • [22,36, 21,40, 25,38] # P4/16
  • [24,42, 28,40, 29,50] # P5/32
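
To turn those nine (width, height) pairs into the three anchors rows expected by the model YAML, a small helper sketch; it reuses the kmean_anchors call above and assumes the result comes back sorted from small to large, as in utils/autoanchor.py:

import numpy as np
import utils.autoanchor as autoAC

new_anchors = autoAC.kmean_anchors('./data/number_plate_00_02_08_09_18_19_all_mixing.yaml',
                                   n=9, img_size=160, thr=5.0, gen=5000, verbose=True)

# kmean_anchors returns a (9, 2) array of (width, height) pairs sorted from
# small to large; group them into the three rows used for P3/8, P4/16, P5/32
rows = np.asarray(new_anchors).round().astype(int).reshape(3, 6).tolist()
for name, row in zip(['P3/8', 'P4/16', 'P5/32'], rows):
    print(f'  - {row}  # {name}')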

I then go to yolov5/models/yolov5n.yaml,

anchors:

  • [10,13, 16,30, 33,23] # P3/8
  • [30,61, 62,45, 59,119] # P4/16
  • [116,90, 156,198, 373,326] # P5/32

and replace them with

  • [17,21, 19,35, 18,40] # P3/8
  • [22,36, 21,40, 25,38] # P4/16
  • [24,42, 28,40, 29,50] # P5/32

then retrain:

python train.py \
  --img 160 \
  --batch 256 \
  --epochs 300 \
  --data data/number_plate_00_02_08_09_18_19_all_mixing.yaml \
  --weights '' \
  --cfg models/yolov5n-number_plate_00_02_08_09_18_19_all_mixing_160.yaml \
  --hyp data/hyps/hyp.scratch-low-number_plate_00_02_08_09_18_19_all_mixing.yaml \
  --noautoanchor
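
One way to confirm that the anchors in the cfg are actually picked up at model-build time is to construct the model directly from the YAML and print the Detect() anchors. A rough sketch, assuming the standard models.yolo.Model constructor (which reads nc and anchors from the cfg):

from models.yolo import Model

# build the network straight from the custom cfg, no checkpoint involved
model = Model('models/yolov5n-number_plate_00_02_08_09_18_19_all_mixing_160.yaml')
m = model.model[-1]  # Detect() head

# after build, anchors are stored divided by the per-level strides;
# rescaling them should reproduce the pixel values written in the YAML
print(m.stride)
print(m.anchors * m.stride.view(-1, 1, 1))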

After training completed, I re-checked best.pt/last.pt and got:

  • [6,11, 7,10, 7,13] # P3/8
  • [22,36, 21,40, 25,38] # P4/16
  • [68,84, 76,140, 72,160] # P5/32

I used --noautoanchor during training, but the model does not appear to have been built with the anchors I specified. Did it recompute a new set of anchors anyway?
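
For completeness, the anchors can also be read straight from the saved checkpoint dict, which bypasses any post-processing done by attempt_load. A rough sketch, assuming the default runs/train/exp/weights output path (adjust to the actual run directory):

import torch

# hypothetical default output path; point this at the actual run directory
ckpt = torch.load('runs/train/exp/weights/best.pt', map_location='cpu')
m = ckpt['model'].float().model[-1]          # Detect() head exactly as saved
print(m.anchors * m.stride.view(-1, 1, 1))   # pixel-space anchors in the checkpoint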

For reference, I also looked at #6966.
