This repository was archived by the owner on Nov 19, 2020. It is now read-only.
I have been working with the following code snippet and getting erratic results. I suspect there may be something wrong with the PolynomialLeastSquares() class, or I just don't fully understand what it does.
If I run this:
int Period = 14;
double[] inputs = new double[Period];
double[] outputs = new double[Period];
var ols = new PolynomialLeastSquares();
ols.Degree = 3;
for (int i = 0; i < Period; i++)
{
    inputs[i] = Convert.ToDouble(i);
    outputs[i] = inputs[i]; // outputs are a copy of inputs, so y = x exactly
}
// Use OLS to learn the regression
var reg = ols.Learn(inputs, outputs);
double result = reg.Transform(-1.0);
Print(ols.Degree + " " + reg.ToString() + " Int " + reg.Intercept);
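For comparison (this is NumPy, not Accord.NET), here is the same degree-3 fit on the same data. Since the points lie exactly on y = x, the cubic and quadratic coefficients and the intercept should all come out near zero, and evaluating at -1.0 should give -1.0:

```python
import numpy as np

# Same data as the snippet above: 14 points with y = x
x = np.arange(14, dtype=float)
y = x.copy()

# Degree-3 least-squares fit (coefficients from highest power down)
coeffs = np.polyfit(x, y, 3)

# Evaluate the fitted polynomial at -1.0, analogous to reg.Transform(-1.0)
result = np.polyval(coeffs, -1.0)
print(coeffs, result)
```

This recovers the line almost exactly (slope ≈ 1, everything else ≈ 0), which is why an intercept like 2.777 in the C# output above looks suspicious to me.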
This is what is returned:
3 y(x) = 2.37812678233955E-17x^2 + 1x^1 + 2.77718345656259x^0 Int -2.77718345656258
Some things look funny to me in how the X matrix is built. I don't see the 1's for the x^0 column (maybe you handle that elsewhere), and the Pow() call never seems to include the x^Degree factor, only up to x^(Degree-1). Not sure if this helps, but if you could advise when you get time, I would appreciate it.
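To pin down what I would expect: a textbook polynomial least-squares design matrix has Degree + 1 columns, from x^0 (a column of ones) up to x^Degree. A minimal NumPy sketch of that construction (my own illustration, not Accord.NET's internals):

```python
import numpy as np

degree = 3
x = np.arange(14, dtype=float)
y = x.copy()

# Design matrix with columns x^0, x^1, x^2, x^3
# (note the leading column of ones for the intercept)
X = np.vander(x, degree + 1, increasing=True)

# Ordinary least squares via lstsq on the design matrix
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)  # approximately [0, 1, 0, 0] for y = x
```

With both the ones column and the x^Degree column present, the fit for y = x is exact, so any deviation in the class would suggest a column is missing or the powers are off by one.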