
Can't Convert tensorflow saved_model to frozen inference graph  #8966

@Deep-learning-practitioner

Description

System information (version)

Using Google colab
TF version = 2.2

I have fine-tuned an SSD MobileNet V2 model with the TensorFlow Object Detection API 2. Now I'm trying to convert the saved_model to a frozen inference graph so I can use it in the OpenCV DNN module, but I'm really confused about how to proceed.

I have been following this tutorial, but how do I modify it for a MobileNet V2 model trained with the TFOD API v2?

The saved model I'm trying to convert can be downloaded from the model zoo with this code (you can run it in Colab):

!wget 'http://download.tensorflow.org/models/object_detection/tf2/20200711/ssd_mobilenet_v2_320x320_coco17_tpu-8.tar.gz'
!tar -xf ssd_mobilenet_v2_320x320_coco17_tpu-8.tar.gz
model_path = 'ssd_mobilenet_v2_320x320_coco17_tpu-8/saved_model/saved_model.pb'


import tensorflow as tf
loaded = tf.saved_model.load('ssd_mobilenet_v2_320x320_coco17_tpu-8/saved_model') 

Now how can I convert this model to frozen inference graph?

While following the above tutorial and running this code:

full_model = tf.function(lambda x: loaded(x))

full_model = full_model.get_concrete_function(tf.TensorSpec(loaded.inputs[0].shape, loaded.inputs[0].dtype))

I'm getting this error:

      1 full_model = tf.function(lambda x: loaded(x))
----> 2 full_model = full_model.get_concrete_function(tf.TensorSpec(loaded.inputs[0].shape, loaded.inputs[0].dtype))

AttributeError: '_UserObject' object has no attribute 'inputs'

Thank you; any help would be appreciated.
