
Squeeze-11's output shape is mistakenly inferred when the input has dynamic axes and the squeeze axes are not specified #6313

@Yosshi999

Description

Bug Report

Is the issue related to model conversion?

No.

Describe the bug

When Squeeze-11 does not have the axes attribute, shape inference should give up and leave the output shape unknown: a dynamic input dimension may or may not equal 1 at runtime, so the output rank cannot be determined statically. However, the output shape of Squeeze appears to be inferred as a scalar.

import onnx
from onnx import TensorProto
from onnx.helper import (
    make_model, make_node, make_graph, make_attribute,
    make_tensor_value_info)
from onnx.checker import check_model
import onnxruntime

print(onnx.__version__, onnxruntime.__version__)

def create_SqUsq_model(target_shape) -> bytes:
    X = make_tensor_value_info("X", TensorProto.FLOAT, target_shape)
    Y = make_tensor_value_info("Y", TensorProto.FLOAT, target_shape)
    node1 = make_node("Squeeze", ["X"], ["A"])  # axes attribute intentionally omitted
    node2 = make_node("Unsqueeze", ["A"], ["Y"])
    node2.attribute.append(make_attribute("axes", [1]))
    graph = make_graph([node1, node2], "sample", [X], [Y])
    model = make_model(graph, opset_imports=[onnx.helper.make_operatorsetid("", 12)])
    check_model(model)
    return model.SerializeToString()

model = create_SqUsq_model([5, 1])
sess = onnxruntime.InferenceSession(model, providers=["CPUExecutionProvider"]) # loads successfully

model = create_SqUsq_model([5, None])
sess = onnxruntime.InferenceSession(model, providers=["CPUExecutionProvider"]) # raises the error below

# outputs:
# 1.16.2 1.19.0
# Traceback (most recent call last):
#   File "C:\Users\yoshi\Desktop\vvinfer27\gen.py", line 23, in <module>
#     sess = onnxruntime.InferenceSession(model, providers=["CPUExecutionProvider"])
#            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#   File "C:\Users\yoshi\miniconda3\envs\onnx\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
#     self._create_inference_session(providers, provider_options, disabled_optimizers)
#   File "C:\Users\yoshi\miniconda3\envs\onnx\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 482, in _create_inference_session
#     sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
#            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
# onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node () Op (Unsqueeze) [ShapeInferenceError] values in 'axes' are beyond the bounds of the computed output shape
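
The mis-inferred shape can also be observed with onnx.shape_inference alone, without involving onnxruntime. A minimal sketch reusing the imports above (the helper name create_squeeze_only_model is mine, not from the original repro):

from onnx.shape_inference import infer_shapes

def create_squeeze_only_model(target_shape):
    # A single Squeeze node with no axes attribute; the output shape is left
    # unspecified so that shape inference has to fill it in.
    X = make_tensor_value_info("X", TensorProto.FLOAT, target_shape)
    Y = make_tensor_value_info("Y", TensorProto.FLOAT, None)
    node = make_node("Squeeze", ["X"], ["Y"])
    graph = make_graph([node], "squeeze_only", [X], [Y])
    return make_model(graph, opset_imports=[onnx.helper.make_operatorsetid("", 12)])

inferred = infer_shapes(create_squeeze_only_model([5, None]))
# The second input dimension is dynamic, so the output rank cannot be
# determined statically and no shape should be inferred here. Per this
# report, the output instead comes back with an empty (i.e. scalar) shape.
print(inferred.graph.output[0])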

System information

  • Windows 11 Pro
  • ONNX 1.16.2
  • ONNXRuntime 1.19.0

Expected behavior

No error should be raised: when Squeeze-11 has no axes attribute and the input contains dynamic dimensions, shape inference should leave the output shape unknown rather than inferring a scalar.

Notes

According to a colleague's investigation, this bug has been reproducible since #5967 .

Possibly https://github.com/onnx/onnx/blob/v1.16.2/onnx/defs/tensor/old.cc#L2661 behaves incorrectly.
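
As a possible workaround (my own suggestion, not from the report above, and only applicable when the dimension being squeezed is known to be 1 at runtime), giving Squeeze an explicit axes attribute keeps the output rank statically known, so the Unsqueeze shape inference no longer fails. A sketch reusing the imports above; create_SqUsq_model_explicit is a hypothetical name:

def create_SqUsq_model_explicit(target_shape) -> bytes:
    # Identical to create_SqUsq_model, except Squeeze names the axis to
    # squeeze, so its output rank is known even with a dynamic input dim.
    X = make_tensor_value_info("X", TensorProto.FLOAT, target_shape)
    Y = make_tensor_value_info("Y", TensorProto.FLOAT, target_shape)
    node1 = make_node("Squeeze", ["X"], ["A"], axes=[1])
    node2 = make_node("Unsqueeze", ["A"], ["Y"], axes=[1])
    graph = make_graph([node1, node2], "sample", [X], [Y])
    model = make_model(graph, opset_imports=[onnx.helper.make_operatorsetid("", 12)])
    check_model(model)
    return model.SerializeToString()

model = create_SqUsq_model_explicit([5, None])
sess = onnxruntime.InferenceSession(model, providers=["CPUExecutionProvider"])  # should load without the error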
