I would like to fit the function y = A*cos^2(B+x) to a few points (i.e. I'm searching for the values of A and B). I know that the frequency is 1 and there is no bias (offset); A is between 0 and 1 and B is in [-pi, +pi]. Two further constraints: I want to take as few sample points as possible while still fitting the curve well, and the samples tend to be noisy at lower y values (under 0.1, and the lower the noisier).

Previously I used Math.NET Numerics to fit y = A*sin(B+x)+C to the samples with linear regression (I built the design matrix and used QR.Solve on it) and then computed the cos^2 parameters from that via trigonometric identities. In that case I took a sample at a random x and then every 30 degrees after that, covering the range from x to x+480 degrees. This didn't yield significantly worse results than sampling every 5 degrees over the same range.
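To make the linear-regression step concrete: A*sin(x+B)+C expands to C + (A cos B)*sin x + (A sin B)*cos x, so the model is linear in the three unknowns c0 = C, c1 = A cos B, c2 = A sin B, and A and B are recovered from the fitted coefficients afterwards. A small pure-Python check of that identity and recovery (the specific parameter values here are made up for illustration):

```python
import math

# Assumed "true" parameters, chosen arbitrarily for the check.
A_true, B_true, C_true = 0.7, 0.5, 0.2

# Linear coefficients implied by the identity
#   A*sin(x+B) + C = C + (A cos B)*sin x + (A sin B)*cos x
c1 = A_true * math.cos(B_true)
c2 = A_true * math.sin(B_true)

# Verify the identity at a few sample points.
for x in (0.0, 0.3, 1.7, -2.1):
    lhs = A_true * math.sin(x + B_true) + C_true
    rhs = C_true + c1 * math.sin(x) + c2 * math.cos(x)
    assert abs(lhs - rhs) < 1e-12

# Recover the original parameters from the linear coefficients.
A_rec = math.hypot(c1, c2)     # A = sqrt(c1^2 + c2^2)
B_rec = math.atan2(c2, c1)     # B = atan2(c2, c1)
```

This is exactly the structure a QR-based least-squares solve exploits: the design matrix has columns [1, sin x, cos x], and the nonlinearity is pushed into the post-processing step.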

I think (guess) that fitting the cos^2 directly would give better results than going through the sine. Can anyone point me in the right direction? I read about the new and shiny Fit.Curve function, but I have no idea how to utilise it in this case.
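For what it's worth, the cos^2 model also linearizes via the double-angle identity A*cos^2(x+B) = A/2 + (A/2)cos(2B)cos(2x) - (A/2)sin(2B)sin(2x), so the same linear least-squares machinery applies without needing a nonlinear fitter. A rough self-contained sketch in pure Python (in practice Math.NET's QR solve or NumPy would replace the hand-rolled 3x3 solver; function names and test values here are my own, for illustration only):

```python
import math

def solve3(M, v):
    # Gaussian elimination with partial pivoting for a 3x3 system M a = v.
    n = 3
    aug = [M[i][:] + [v[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(col + 1, n):
            f = aug[r][col] / aug[col][col]
            for c in range(col, n + 1):
                aug[r][c] -= f * aug[col][c]
    a = [0.0] * n
    for i in reversed(range(n)):
        a[i] = (aug[i][n] - sum(aug[i][j] * a[j] for j in range(i + 1, n))) / aug[i][i]
    return a

def fit_cos2(xs, ys):
    # Linearize y = A*cos^2(x+B) via the double-angle identity:
    #   y = a0 + a1*cos(2x) + a2*sin(2x)
    # with a0 = A/2, a1 = (A/2)cos(2B), a2 = -(A/2)sin(2B).
    # Accumulate the normal equations M a = v and solve the 3x3 system.
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y in zip(xs, ys):
        row = (1.0, math.cos(2 * x), math.sin(2 * x))
        for i in range(3):
            v[i] += row[i] * y
            for j in range(3):
                M[i][j] += row[i] * row[j]
    a = solve3(M, v)
    A = 2.0 * a[0]                        # no-bias model: the offset is A/2
    B = 0.5 * math.atan2(-a[2], a[1])     # phase, only determined modulo pi
    return A, B

# Synthetic noiseless check with made-up parameters A=0.7, B=0.4:
xs = [i * math.pi / 12 for i in range(9)]          # every 15 deg over 120 deg
ys = [0.7 * math.cos(x + 0.4) ** 2 for x in xs]
A, B = fit_cos2(xs, ys)
```

One caveat: since cos^2 has period pi, B is only identifiable modulo pi this way, and the atan2 recovery returns a value in (-pi/2, pi/2]. The low-y noise issue could then be handled by weighting the rows of the normal equations, though that's beyond this sketch.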