
Perform a similarity check + inference test after converting to onnx #2585

@joeyballentine

Description

The main con of this would be that ONNX inference would then be required for conversion. Maybe it could just be an optional step that runs only if onnx is installed.

Why is this necessary? According to musl:

[Check] Whether the converted ONNX model's outputs are relatively close to the outputs of PyTorch inference. This is not strictly necessary, but the documentation recommends it (np.testing.assert_allclose). It obviously won't change the model, but I found some situations where the converted ONNX gave around a 20% difference between PyTorch and ONNX outputs, which should not be acceptable. So if this is not tested with assert_allclose, the outputs could be bad while the model still exports just fine.
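A minimal sketch of the comparison step described above, assuming the PyTorch and ONNX Runtime outputs have already been gathered as NumPy arrays (the function name `outputs_match` and the tolerance values are illustrative, not from the issue):

```python
import numpy as np

def outputs_match(torch_out: np.ndarray, onnx_out: np.ndarray,
                  rtol: float = 1e-3, atol: float = 1e-5) -> bool:
    """Return True if the ONNX output is numerically close to the
    PyTorch output, using np.testing.assert_allclose semantics."""
    try:
        np.testing.assert_allclose(torch_out, onnx_out, rtol=rtol, atol=atol)
        return True
    except AssertionError:
        return False

# Hypothetical outputs: near-identical results pass the check...
ref = np.linspace(0.0, 1.0, 16, dtype=np.float32)
print(outputs_match(ref, ref + 1e-6))   # close enough: passes

# ...while a 20% deviation, as described above, fails it.
print(outputs_match(ref, ref * 1.2))    # too far off: fails
```

Since assert_allclose raises rather than returns, wrapping it like this would let the converter report a warning (or skip the check when onnxruntime is not installed) instead of crashing the export.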
