Hello, I’m completely new to Math.Net and hoping for some guidance on how to solve a particular problem I’m having.

Essentially, what I want to do is recreate Excel's Solver functionality in C#. Excel's Solver uses the Generalized Reduced Gradient (GRG2) algorithm to minimize an error value by adjusting several dependent variables.

More specifically, the error value is calculated as follows:

error = √( Σ(calculated_value - measured_value)^2 / number_of_measurements )
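In code, that error (a root-mean-square error over all measurements) would be something like this — the arrays here are just made-up sample data:

```csharp
using System;
using System.Linq;

class RmseExample
{
    public static double Rmse(double[] calculated, double[] measured)
    {
        // Mean of the squared residuals, then the square root.
        double meanSq = calculated
            .Zip(measured, (c, m) => (c - m) * (c - m))
            .Average();
        return Math.Sqrt(meanSq);
    }

    static void Main()
    {
        double[] calculated = { 1.0, 2.0, 3.0 };
        double[] measured   = { 1.5, 2.0, 2.5 };
        Console.WriteLine(Rmse(calculated, measured)); // ≈ 0.408
    }
}
```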

I want to determine values for the dependent variables to minimize the error value.

The function that computes `calculated_value` for each measurement is very complex, so I don't know whether it's practical to derive the gradients and partial derivatives that many nonlinear least-squares algorithms seem to require.

I’m essentially looking for a function that can find values for the dependent variables by minimizing the calculated overall error as described above.
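To make the question concrete, here's a sketch of the kind of thing I'm imagining, using the derivative-free `NelderMeadSimplex` optimizer from Math.NET Numerics' `Optimization` namespace (which I gather needs only function values, not gradients). `ComputeValue` and the `measured` array below are placeholders standing in for my actual complex model and real measurements:

```csharp
using System;
using MathNet.Numerics.LinearAlgebra;
using MathNet.Numerics.Optimization;

class SolverSketch
{
    // Placeholder for my actual (complex) model: given the dependent
    // variables, produce the calculated value for one measurement.
    // Here it's just a toy linear model: value = a * index + b.
    static double ComputeValue(Vector<double> vars, int measurementIndex)
    {
        return vars[0] * measurementIndex + vars[1];
    }

    static void Main()
    {
        double[] measured = { 1.0, 3.1, 4.9, 7.2 }; // placeholder data

        // Overall RMS error over all measurements, as a function
        // of the dependent variables.
        double Rmse(Vector<double> vars)
        {
            double sumSq = 0.0;
            for (int i = 0; i < measured.Length; i++)
            {
                double diff = ComputeValue(vars, i) - measured[i];
                sumSq += diff * diff;
            }
            return Math.Sqrt(sumSq / measured.Length);
        }

        // Nelder-Mead needs only function evaluations, no derivatives.
        var objective = ObjectiveFunction.Value(Rmse);
        var initialGuess = Vector<double>.Build.Dense(new[] { 1.0, 0.0 });

        var result = NelderMeadSimplex.Minimum(objective, initialGuess,
            convergenceTolerance: 1e-8, maximumIterations: 1000);

        Console.WriteLine(result.MinimizingPoint);      // fitted variables
        Console.WriteLine(Rmse(result.MinimizingPoint)); // remaining error
    }
}
```

Is this roughly the right approach, or is there a better-suited minimizer in the library for this kind of problem?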

Hopefully I’ve given enough information. It’s probably obvious that I don’t have a lot of technical knowledge in this domain right now.