This repository was archived by the owner on Nov 17, 2023. It is now read-only.

_backward_softsign activation is incorrect  #10868

@eric-haibin-lin

Description


The following test case fails. It modifies the softsign entry of test_activation in https://github.com/apache/incubator-mxnet/blob/master/tests/python/unittest/test_operator.py#L5867 so that the backward pass is checked against the analytic gradient:

def test_activation():
    shape = (9, 10)
    dtype_l = [np.float64, np.float32, np.float16]
    rtol_l = [1e-7, 1e-6, 1e-2]
    atol_l = [1e-7, 1e-6, 1e-2]
    rtol_fd = 1e-5
    atol_fd = 1e-6
    num_eps = 1e-6
    # Each entry: [symbol builder, forward reference, gradient reference, low, high],
    # where (low, high) is the range test inputs are sampled from.
    unary_ops = {
        'softsign': [lambda x: mx.sym.Activation(x, act_type='softsign'),
                     lambda x: x / (1. + np.abs(x)),
                     lambda x: 1. / np.square(1. + np.abs(x)),
                     -3.0, 3.0],   # sampling bounds; the exact values are assumed here
    }
    # Loop over operators
    for name, op in unary_ops.items():
        # Loop over dtype's
        for ind in range(len(dtype_l)):
            dtype = dtype_l[ind]
            rtol = rtol_l[ind]
            atol = atol_l[ind]
            compare_forw_backw_unary_op(
                name, op[0], op[1], op[2], shape, op[3], op[4], rtol, atol,
                dtype)
        # Finite difference testing
        finite_diff_unary_op(
            name, op[0], shape, op[3], op[4], rtol_fd, atol_fd, num_eps)
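
For a quicker reproduction without the test harness, here is a minimal autograd sketch (assuming an MXNet build affected by this issue; the expected values come from the analytic derivative, not from MXNet):

import numpy as np
import mxnet as mx
from mxnet import autograd

x = mx.nd.array([-2.0, -0.5, 0.5, 2.0])
x.attach_grad()
with autograd.record():
    y = mx.nd.Activation(x, act_type='softsign')
y.backward(mx.nd.ones_like(y))

# Gradient computed through the Activation backward path.
print(x.grad.asnumpy())
# Analytic softsign gradient, d/dx [x / (1 + |x|)] = 1 / (1 + |x|)^2.
print(1. / np.square(1. + np.abs(x.asnumpy())))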

Reason: for y = softsign(x), the inputs to _backward_softsign and _backward_Activation differ:
_backward_softsign takes (dy, x) as input, while _backward_Activation takes (dy, y). Handing the output y to a gradient kernel that expects the input x evaluates the derivative at the wrong point, producing an incorrect gradient.
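
To see the mismatch numerically, here is a standalone NumPy sketch; softsign_grad below is just a local name for the analytic derivative, not the MXNet kernel:

import numpy as np

def softsign(x):
    return x / (1. + np.abs(x))

def softsign_grad(x):
    # Analytic d softsign / dx; note it expects the *input* x.
    return 1. / np.square(1. + np.abs(x))

x = np.array([-2.0, -0.5, 0.5, 2.0])
y = softsign(x)

# Correct backward: kernel applied to the input x (0.1111 at x = 2.0).
print(softsign_grad(x))
# What a kernel fed y instead of x computes (0.36 at x = 2.0) -- clearly wrong.
print(softsign_grad(y))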
