This repository was archived by the owner on Nov 19, 2020. It is now read-only.

AugmentedLagrangian fails on standard convex quadratic problem #431

@allmenaremortal

Description

Example 16.4 in the book "Numerical Optimization" by Nocedal & Wright gives a convex quadratic problem in two dimensions with five linear inequality constraints. The solution is (1.4, 1.7), as stated in the book and as recovered by the Goldfarb & Idnani algorithm (see the sketch after the unit test below). This kind of problem should be well within the range of problems the AugmentedLagrangian solver handles correctly. However, the solver reports a solution of (0.45, 0.63), which indicates an issue with the class.

Here is a C# unit test demonstrating the issue:

    [TestMethod]
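    // With the current AugmentedLagrangian behaviour the solver returns roughly (0.45, 0.63),
    // so the assertions at the end of this test fail and the test is marked as expecting
    // an AssertFailedException.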
    [ExpectedException(typeof(AssertFailedException))]
    public void AugmentedLagrangianSolverTest02()
    {

        // Ensure that the Accord.NET random generator starts from a particular fixed seed.
        Accord.Math.Random.Generator.Seed = 0;

        // The minimization problem is to minimize the function $(x_0 - 1)^2 + (x_1 - 2.5)^2$ subject
        // to the five constraints $x_0 - 2x_1 + 2 \ge 0$, $-x_0 - 2x_1 + 6 \ge 0$, $-x_0 + 2x_1 + 2 \ge 0$,
        // $x_0 \ge 0$ and $x_1 \ge 0$.

        var f = new NonlinearObjectiveFunction(2,
            function: (x) => (x[0] - 1.0) * (x[0] - 1.0) + (x[1] - 2.5) * (x[1] - 2.5),
            gradient: (x) => new[] { 2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.5) } );

        var constraints = new List<NonlinearConstraint>();

        // Add the constraint $x_0 - 2x_1 + 2 \ge 0$.
        constraints.Add(new NonlinearConstraint(f,
            function: (x) => x[0] - 2.0 * x[1] + 2.0,
            gradient: (x) => new[] { 1.0, -2.0 },
            shouldBe: ConstraintType.GreaterThanOrEqualTo, value: 0 ));

        // Add the constraint $-x_0 - 2x_1 + 6 \ge 0$.
        constraints.Add(new NonlinearConstraint(f,
            function: (x) => - x[0] - 2.0 * x[1] + 6.0,
            gradient: (x) => new[] { -1.0, -2.0 },
            shouldBe: ConstraintType.GreaterThanOrEqualTo, value: 0));

        // Add the constraint $-x_0 + 2x_1 + 2 \ge 0$.
        constraints.Add(new NonlinearConstraint(f,
            function: (x) => -x[0] + 2.0 * x[1] + 2.0,
            gradient: (x) => new[] { -1.0, 2.0 },
            shouldBe: ConstraintType.GreaterThanOrEqualTo, value: 0));

        // Add the constraint $x_0  \ge 0$.
        constraints.Add(new NonlinearConstraint(f,
            function: (x) => x[0],
            gradient: (x) => new[] { 1.0, 0.0 },
            shouldBe: ConstraintType.GreaterThanOrEqualTo, value: 0));

        // Add the constraint $x_1  \ge 0$.
        constraints.Add(new NonlinearConstraint(f,
            function: (x) => x[1],
            gradient: (x) => new[] { 0.0, 1.0 },
            shouldBe: ConstraintType.GreaterThanOrEqualTo, value: 0));

        var solver = new AugmentedLagrangian(f, constraints);

        Assert.IsTrue(solver.Minimize());
        double minValue = solver.Value;

        Assert.IsFalse(Double.IsNaN(minValue));

        // According to the example, the solution is $(1.4, 1.7)$.
        Assert.AreEqual(1.4, solver.Solution[0], 1e-5);
        Assert.AreEqual(1.7, solver.Solution[1], 1e-5);
    }
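
For comparison, here is a minimal sketch of the same problem posed directly to the Goldfarb & Idnani solver, which does recover the book's solution (1.4, 1.7). This assumes the GoldfarbIdnani constructor that takes a QuadraticObjectiveFunction together with a constraint matrix A and vector b encoding A x >= b; the exact overload may differ between Accord.NET versions, so treat it as an illustration rather than the definitive call.

    [TestMethod]
    public void GoldfarbIdnaniReferenceTest()
    {
        // f(x) = (x_0 - 1)^2 + (x_1 - 2.5)^2 = 1/2 x'Qx + d'x + constant,
        // with Q = 2*I and d = (-2, -5); the constant does not affect the minimizer.
        double[,] Q = { { 2.0, 0.0 }, { 0.0, 2.0 } };
        double[] d = { -2.0, -5.0 };

        // The five inequality constraints written in matrix form as A x >= b.
        double[,] A =
        {
            {  1.0, -2.0 },  //  x_0 - 2 x_1 >= -2
            { -1.0, -2.0 },  // -x_0 - 2 x_1 >= -6
            { -1.0,  2.0 },  // -x_0 + 2 x_1 >= -2
            {  1.0,  0.0 },  //  x_0         >=  0
            {  0.0,  1.0 },  //  x_1         >=  0
        };
        double[] b = { -2.0, -6.0, -2.0, 0.0, 0.0 };

        var solver = new GoldfarbIdnani(new QuadraticObjectiveFunction(Q, d), A, b);

        Assert.IsTrue(solver.Minimize());

        // The Goldfarb & Idnani solver finds the expected minimizer (1.4, 1.7).
        Assert.AreEqual(1.4, solver.Solution[0], 1e-5);
        Assert.AreEqual(1.7, solver.Solution[1], 1e-5);
    }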
