This repository was archived by the owner on Nov 17, 2023. It is now read-only.

gluon.SymbolBlock cannot import resnet trained with dtype="float16" #11849

@woozch

Description

Cannot load a resnet101 fine-tuned with dtype="float16" (incubator-mxnet/example/image-classification/symbols/resnet.py) using the gluon.SymbolBlock.imports method.

Error Message:

AssertionError: Failed loading Parameter 'stage3_unit2_conv2_weight' from saved params: dtype incompatible expected <type 'numpy.float32'> vs saved <type 'numpy.float16'>

Minimum reproducible example

net = gluon.SymbolBlock.imports('resnet-101-symbol.json', ['data', 'softmax_label'], 'resnet-101-0007.params')  # This line raises the error.
net = gluon.SymbolBlock.imports('resnet-101-symbol.json', ['data', 'softmax_label'])  # Importing without the params file succeeds.
print(net.collect_params())

My Questions

In incubator-mxnet/example/image-classification/symbols/resnet.py, there is an mx.sym.Cast op for type conversion.
I fine-tuned resnet101 with dtype="float16", and I need to load this model as a HybridBlock. However, gluon.SymbolBlock.imports creates every parameter in the network as float32, so the trained float16 parameters cannot be loaded into it.

Here, resnet-101-0007.params was trained with the argument dtype='float16'.
In the resnet-101-symbol.json file, there is a Cast op:
{
  "op": "Cast",
  "name": "cast0",
  "attrs": {"dtype": "float16"},
  "inputs": [[7, 0, 0]]
},
It seems that gluon.SymbolBlock.imports does not take the type-conversion operator into account.

For now, I think I need to load all parameters manually, change their types, and save them again.
Is there any other way to solve this problem?
