How to use the optimization class

(Maamar Dliouah) #1

Could somebody please show me an example of how to use the optimization classes, especially the GoldenSectionMinimizer?


@maamardli you can find a simple example in the tests:

(Peter Vanderwaart) #3

The indicated link is dead. Is this example available somewhere else?

I need an example of how to create and use a function with the IObjectiveFunction interface, e.g. for ConjugateGradientMinimizer.FindMinimum. I’m not familiar with delegates, so just the code for the function itself is no help to me.

Edit: I got the program to run, but it’s not working correctly: it does the same thing over and over. I believe it has to do with the Gradient, for which I could find no documentation or examples at all.

It’s hard to believe that it’s necessary to write out all the details of IObjectiveFunction as I did. There must be a simpler way.
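The factory methods on ObjectiveFunction (used in the replies below) are that simpler way: they build the interface from plain delegates. A minimal sketch, assuming Math.NET Numerics with the MathNet.Numerics.Optimization and MathNet.Numerics.LinearAlgebra namespaces; the function and starting point here are made up for illustration:

```csharp
using System;
using MathNet.Numerics.LinearAlgebra;
using MathNet.Numerics.LinearAlgebra.Double;
using MathNet.Numerics.Optimization;

class Program
{
    static void Main()
    {
        // Build an IObjectiveFunction from plain delegates instead of
        // implementing the interface by hand.
        Func<Vector<double>, double> f = v => v[0] * v[0] + v[1] * v[1];
        Func<Vector<double>, Vector<double>> g =
            v => DenseVector.OfArray(new[] { 2.0 * v[0], 2.0 * v[1] });

        var obj = ObjectiveFunction.Gradient(f, g);
        var solver = new ConjugateGradientMinimizer(1e-5, 1000);
        var result = solver.FindMinimum(obj, DenseVector.OfArray(new[] { 3.0, -4.0 }));

        Console.WriteLine(result.MinimizingPoint); // should be near (0, 0)
    }
}
```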

(Christoph Rüegg) #4


In case the file moves again:

    public void Test_Works()
    {
        var algorithm = new GoldenSectionMinimizer(1e-5, 1000);
        var f1 = new Func<double, double>(x => (x - 3)*(x - 3));
        var obj = ObjectiveFunction.ScalarValue(f1);
        var r1 = algorithm.FindMinimum(obj, -100, 100);

        Assert.That(Math.Abs(r1.MinimizingPoint - 3.0), Is.LessThan(1e-4));
    }

(Peter Vanderwaart) #5

I want to find the minimum of a function of three parameters. Based on the examples you linked to (thank you!), I’d like to mimic this example:

            var obj = ObjectiveFunction.Gradient(RosenbrockFunction.Value, RosenbrockFunction.Gradient);
            var solver = new ConjugateGradientMinimizer(1e-5, 1000);
            var result = solver.FindMinimum(obj, new DenseVector(new[]{1.2,1.2}));

The problem I’m having is that I don’t know how to write my function so I can substitute it here for the RosenbrockFunction.

(Christoph Rüegg) #6

This minimizer requires not only the function but also its gradient, i.e. the partial derivatives with respect to all three parameters. Do you know the gradient? If not, it may be possible to approximate it numerically (e.g. using ForwardDifferenceGradientObjectiveFunction or the Differentiate class).
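For intuition, the forward-difference idea behind that approach can be sketched without the library. The function, point, and step size below are illustrative, not part of the Math.NET API:

```csharp
using System;

public static class GradientDemo
{
    // Forward-difference approximation of the gradient of f at point x:
    // grad_i ≈ (f(x + h*e_i) - f(x)) / h for each coordinate i.
    public static double[] ApproximateGradient(
        Func<double[], double> f, double[] x, double h = 1e-7)
    {
        var grad = new double[x.Length];
        double f0 = f(x);
        for (int i = 0; i < x.Length; i++)
        {
            double saved = x[i];
            x[i] = saved + h;            // perturb one coordinate
            grad[i] = (f(x) - f0) / h;   // one-sided difference quotient
            x[i] = saved;                // restore the point
        }
        return grad;
    }

    public static void Main()
    {
        // f(v) = v0^2 + 3*v1^2; the exact gradient at (1, 2) is (2, 12).
        Func<double[], double> f = v => v[0] * v[0] + 3 * v[1] * v[1];
        var g = ApproximateGradient(f, new[] { 1.0, 2.0 });
        Console.WriteLine($"{g[0]:F4} {g[1]:F4}"); // approximately 2.0000 12.0000
    }
}
```

The approximation error shrinks with h but floating-point cancellation grows, which is why a moderate step like 1e-7 is typical for doubles.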

(Peter Vanderwaart) #7

I think I got the NelderMeadSimplex (no gradient) to work. It found the right answer when the first guess was good, but failed with a bad one. I got the ConjugateGradient method to run (code fragment below), but it didn’t detect a stopping point and sometimes went off into the stratosphere. Maybe it’s a hard function.

I think I see how to compute the gradient properly. That’s something to try tomorrow.

            // create optimizer
            var f1 = new Func<Vector<double>, double>(z => LogEval(z));
            var obj = ObjectiveFunction.Value(f1);
            var fdgof = new ForwardDifferenceGradientObjectiveFunction(obj, new DenseVector(new[] { -10.0,-10.0,-10.0 }), new DenseVector(new[] { 10.0,10.0,10.0 }));
            var solver = new ConjugateGradientMinimizer(1e-3, 1000);
            var result = solver.FindMinimum(fdgof, new DenseVector(new[] { 6.0,0.4,0.006 })); 

The error function is a sum of squares:

    private double LogEval(Vector<double> v)
    {
        double err = 0;
        for (int i = 0; i < 100; i++)
        {
            double y_val = v[0] + v[1] * Math.Exp(v[2] * x[i]);
            err += Math.Pow(y_val - y[i], 2);
        }
        return err;
    }
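
For this model (y = a + b·exp(c·x)) the analytic gradient of the sum of squares can be written down directly with the chain rule. A self-contained sketch; the data arrays and true parameters below are placeholders for illustration, not the poster's data:

```csharp
using System;

public static class LogEvalGradient
{
    // Model: model(x) = a + b*exp(c*x); err = sum_i (model(x_i) - y_i)^2.
    // Returns the partial derivatives of err w.r.t. a, b and c.
    public static double[] Gradient(double a, double b, double c, double[] x, double[] y)
    {
        double ga = 0, gb = 0, gc = 0;
        for (int i = 0; i < x.Length; i++)
        {
            double e = Math.Exp(c * x[i]);
            double r = a + b * e - y[i];  // residual
            ga += 2 * r;                  // d(r^2)/da = 2r
            gb += 2 * r * e;              // d(r^2)/db = 2r * exp(c*x)
            gc += 2 * r * b * x[i] * e;   // d(r^2)/dc = 2r * b*x*exp(c*x)
        }
        return new[] { ga, gb, gc };
    }

    public static void Main()
    {
        // Placeholder data generated from a = 1, b = 2, c = 0.5.
        var x = new[] { 0.0, 1.0, 2.0, 3.0 };
        var y = new double[4];
        for (int i = 0; i < 4; i++) y[i] = 1 + 2 * Math.Exp(0.5 * x[i]);

        // At the true parameters every residual vanishes, so the gradient is zero.
        var g = Gradient(1, 2, 0.5, x, y);
        Console.WriteLine($"{g[0]:F6} {g[1]:F6} {g[2]:F6}"); // 0.000000 0.000000 0.000000
    }
}
```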

(Peter Vanderwaart) #8

// Example with explicit objective function and gradient
// Function f = function to be minimized evaluated at a point (v)
// Gradient g = vector of partial derivatives evaluated at a point (v)

        var f = new Func<Vector<double>, double>(v => Math.Pow(v[0], 2) + Math.Pow(v[1], 4) + Math.Pow(v[2], 6));              // define function
        var g = new Func<Vector<double>, Vector<double>>(v => new DenseVector(new[] { 2.0 * v[0], 4.0 * Math.Pow(v[1], 3), 6.0 * Math.Pow(v[2], 5) }));  // define gradient
        var obj = ObjectiveFunction.Gradient(f, g);
        var solver = new BfgsMinimizer(1e-5, 1e-5, 1e-5, 1000);
        var result = solver.FindMinimum(obj, new DenseVector(new[] { 15.0, 15.0, 15.0 })); // initial estimate = (15,15,15)

        // output = 8

        //DenseVector 3 - Double
        //  4.99304E-07
        //- 1.43819E-06
        //- 1.21743E-06

        //DenseVector 3 - Double
        //  9.98607E-07
        //- 5.75278E-06
        //- 7.30457E-06

        // note: failed for initial points too far from zero, i.e. much larger than (20,20,20)