Add version converter softmax 13 -> 12 #6608

Merged

Conversation

seungwoo-ji-03

Description

This PR introduces a new version adapter Softmax_13_12 to handle the conversion of the Softmax operator from opset 13 to opset 12 in the ONNX model conversion framework.

Key changes include:

  1. Correct handling of the axis attribute (a minimal sketch of the axis arithmetic follows this description):
  • Converts the default axis=-1 in opset 13 to the appropriate positive axis for opset 12.
  • Converts negative axes to their positive equivalents as required.
  2. Removal of the additional Flatten and Reshape nodes added in opset 13:
  • Ensures these nodes are removed to align with opset 12 behavior.
  • Rewires the graph appropriately after node removal.
  3. Proper graph manipulation:
  • Cleans up unused nodes using destroy().

This adapter ensures accurate conversion between opset versions while maintaining expected Softmax behavior.
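
The axis handling above boils down to mapping a possibly negative opset-13 axis onto the non-negative axis that opset 12 expects for its 2-D coercion. The Python sketch below is illustrative only: the real adapter is implemented in the C++ version-converter adapters, and the function name here is made up for the example.

def normalize_softmax_axis(axis: int, rank: int) -> int:
    # Map an opset-13 axis (default -1, negatives allowed) onto the
    # non-negative axis expected by opset 12.
    return axis + rank if axis < 0 else axis

# Example: for a rank-2 input, the opset-13 default axis=-1 becomes axis=1.
assert normalize_softmax_axis(-1, 2) == 1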

Motivation and Context

This change is required to improve the ONNX version converter’s support for models with Softmax operators. It addresses the discrepancy in axis handling and tensor coercion behavior between opset 13 and opset 12.

The PR resolves Issue #6340, which requested support for converting Softmax from opset 13 to 12.

By adding this adapter, users can now convert models containing Softmax operators from opset 13 to 12 without manual adjustments, enabling smoother workflows and improved backward compatibility.
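
To make the semantic gap concrete, the numpy sketch below (illustration only, not part of the PR) contrasts the opset-13 single-axis Softmax with the opset-12 flatten-to-2-D coercion. The two definitions coincide along the last axis, which is why the default axis=-1 maps cleanly, but they generally differ for interior axes.

import numpy as np

def softmax_along_axis(x, axis):
    # Opset-13 semantics: softmax independently along one axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def softmax_with_coercion(x, axis):
    # Opset-12 semantics: flatten to 2-D [prod(dims[:axis]), prod(dims[axis:])],
    # apply softmax per row, then restore the original shape.
    shape = x.shape
    flat = x.reshape(int(np.prod(shape[:axis], dtype=np.int64)), -1)
    return softmax_along_axis(flat, axis=1).reshape(shape)

x = np.random.rand(2, 3, 4).astype(np.float32)
# The two definitions agree along the last axis...
assert np.allclose(softmax_along_axis(x, 2), softmax_with_coercion(x, 2))
# ...but generally disagree along an interior axis.
print(np.allclose(softmax_along_axis(x, 1), softmax_with_coercion(x, 1)))  # typically False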

Testing

The implementation was thoroughly tested using Python and ONNX Runtime. The following test script was used to validate the adapter’s correctness:

import onnx
from onnx import helper, TensorProto, version_converter
import numpy as np
import onnxruntime as ort

def create_softmax_model_opset13():
    """
    Create a simple ONNX model with a Softmax node in opset 13.
    """
    # Create inputs and outputs
    input_tensor = helper.make_tensor_value_info('input', TensorProto.FLOAT, [3, 4])
    output_tensor = helper.make_tensor_value_info('output', TensorProto.FLOAT, [3, 4])

    # Create a Softmax node
    softmax_node = helper.make_node(
        'Softmax',
        inputs=['input'],
        outputs=['output'],
        axis=-1
    )

    # Create the graph
    graph = helper.make_graph(
        nodes=[softmax_node],
        name='SoftmaxGraph',
        inputs=[input_tensor],
        outputs=[output_tensor]
    )

    # Create the model (opset 13)
    model = helper.make_model(
        graph,
        opset_imports=[helper.make_operatorsetid("", 13)],
        producer_name='test_softmax_opset13'
    )

    # Explicitly set IR version to 10
    model.ir_version = 10

    return model

def run_model_with_ort(model_path, input_data):
    """
    Run the ONNX model with ONNX Runtime and return the outputs.
    """
    # Create an ONNX Runtime session
    session = ort.InferenceSession(model_path)

    # Run the session
    input_name = session.get_inputs()[0].name
    outputs = session.run(None, {input_name: input_data})

    return outputs

def test_softmax_conversion_with_runtime():
    """
    Test the conversion of a Softmax model from opset 13 to opset 12
    and validate the results using ONNX Runtime.
    """
    # Step 1: Create the model
    model_opset13 = create_softmax_model_opset13()

    # Step 2: Save the model for inspection
    onnx.save(model_opset13, "softmax_opset13.onnx")

    # Step 3: Convert the model from opset 13 to opset 12
    converted_model = version_converter.convert_version(model_opset13, 12)

    # Step 4: Save the converted model for inspection
    onnx.save(converted_model, "softmax_opset12.onnx")

    # Step 5: Prepare dummy input
    input_data = np.random.rand(3, 4).astype(np.float32)

    # Step 6: Run the original model with ONNX Runtime
    original_outputs = run_model_with_ort("softmax_opset13.onnx", input_data)

    # Step 7: Run the converted model with ONNX Runtime
    converted_outputs = run_model_with_ort("softmax_opset12.onnx", input_data)

    # Step 8: Verify the outputs are identical
    np.testing.assert_allclose(
        original_outputs[0],
        converted_outputs[0],
        rtol=1e-5,
        atol=1e-6,
        err_msg="The outputs of the original and converted models do not match!"
    )

    # Step 9: Additional verification
    # Check that the converted model has the correct opset version
    assert converted_model.opset_import[0].version == 12, "Opset version is not 12!"

    print("Test passed: Softmax conversion from opset 13 to 12 is correct, and outputs match.")

if __name__ == "__main__":
    test_softmax_conversion_with_runtime()

The script verifies:

  • The correctness of the conversion by comparing outputs of the original (opset 13) and converted (opset 12) models using ONNX Runtime.
  • Proper handling of the axis attribute and removal of the additional Flatten and Reshape nodes.
  • The final converted model’s compliance with opset 12 specifications.

@seungwoo-ji-03 seungwoo-ji-03 requested a review from a team as a code owner January 1, 2025 09:19
@seungwoo-ji-03 seungwoo-ji-03 changed the title Feat/add version converter softmax 13 12 Add version converter softmax 13 -> 12 Jan 1, 2025
@seungwoo-ji-03 seungwoo-ji-03 force-pushed the feat/add_version_converter_softmax_13_12 branch 2 times, most recently from e5f11ad to da5bab3 on January 1, 2025 09:25
Copy link

codecov bot commented Jan 3, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 57.51%. Comparing base (0425433) to head (4401564).
Report is 161 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #6608      +/-   ##
==========================================
+ Coverage   57.50%   57.51%   +0.01%     
==========================================
  Files         507      507              
  Lines       31636    31645       +9     
  Branches     3048     3048              
==========================================
+ Hits        18191    18200       +9     
  Misses      12618    12618              
  Partials      827      827              


@andife andife added the run release CIs label Jan 3, 2025
@andife andife temporarily deployed to testpypi_onnxweekly with GitHub Actions January 3, 2025 11:23
@seungwoo-ji-03
Author

@andife @justinchuby
Are there any updates or additional feedback needed to move this forward?

@justinchuby
Member

Could you add a test here https://github.com/onnx/onnx/blob/main/onnx/test/version_converter_test.py
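
For reference, a test in that file might look roughly like the sketch below. It uses only public onnx APIs; the class and test names are illustrative and not necessarily what was committed.

import unittest

import onnx
from onnx import TensorProto, helper, version_converter

class SoftmaxDowngradeTest(unittest.TestCase):
    def test_softmax_13_to_12(self):
        # Build a one-node Softmax graph at opset 13 with the default axis=-1.
        node = helper.make_node("Softmax", ["x"], ["y"], axis=-1)
        graph = helper.make_graph(
            [node],
            "softmax_downgrade_test",
            [helper.make_tensor_value_info("x", TensorProto.FLOAT, [3, 4])],
            [helper.make_tensor_value_info("y", TensorProto.FLOAT, [3, 4])],
        )
        model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])

        converted = version_converter.convert_version(model, 12)

        # The converted model should target opset 12 and still pass the checker.
        self.assertEqual(converted.opset_import[0].version, 12)
        onnx.checker.check_model(converted)

if __name__ == "__main__":
    unittest.main()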

@seungwoo-ji-03 seungwoo-ji-03 force-pushed the feat/add_version_converter_softmax_13_12 branch from 671997c to 06d245a on February 17, 2025 17:05
@seungwoo-ji-03 seungwoo-ji-03 requested a review from a team as a code owner February 17, 2025 17:05
seungwoo-ji-03 added 2 commits February 18, 2025 02:11
Signed-off-by: seungwoo-ji-03 <seungwoo.ji@nuvilab.com>
Signed-off-by: seungwoo-ji <seungwoo.ji@nuvilab.com>
Signed-off-by: seungwoo-ji-03 <seungwoo.ji@nuvilab.com>
Signed-off-by: seungwoo-ji <seungwoo.ji@nuvilab.com>
@seungwoo-ji-03 seungwoo-ji-03 force-pushed the feat/add_version_converter_softmax_13_12 branch from 06d245a to d79ad56 on February 17, 2025 17:13
seungwoo-ji added 2 commits February 18, 2025 02:14

@justinchuby justinchuby left a comment

Thanks!

@justinchuby justinchuby added this pull request to the merge queue Feb 20, 2025
Merged via the queue into onnx:main with commit 8312e1f Feb 20, 2025
35 checks passed
@ramkrishna2910 ramkrishna2910 added the module: version converter label May 12, 2025

Successfully merging this pull request may close these issues.

Can't reduce the version of Softmax from 13 to 12