I'm starting to use Solver Foundation in a WPF/C# app, which should replace an Excel sheet that solves a linear problem, something simple like this:
How much do I need of each mix to get as close as possible to (15% A + 70% B + 10% C + 5% D)?
Pretty simple, even for Excel. So I build this model as an OML string and solve it with Solver Foundation, but the results are not the same as the ones I get in Excel, and in every case the quadratic error is larger with the Solver Foundation results (I checked by plugging them back into the Excel sheet).
Is there any way I can configure the solver to get the same result as in Excel? If you need to see the OML, please ask and I'll update the question.
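For reference, the objective both solvers should be minimizing is the sum of squared deviations between the resulting blend and the target percentages. Here is a minimal sketch of that objective in Python (the mix compositions below are made up, since the actual data and OML aren't shown):

```python
# Target blend percentages for components A, B, C, D.
TARGET = [15.0, 70.0, 10.0, 5.0]

# Hypothetical compositions: each row is one available mix,
# giving its percentages of A, B, C, D. Replace with real data.
MIXES = [
    [20.0, 60.0, 15.0, 5.0],
    [10.0, 80.0,  5.0, 5.0],
    [15.0, 65.0, 15.0, 5.0],
]

def blend(weights):
    """Component percentages of the blend produced by the given
    mix weights (weights are assumed to sum to 1)."""
    return [sum(w * mix[c] for w, mix in zip(weights, MIXES))
            for c in range(len(TARGET))]

def squared_error(weights):
    """Sum of squared deviations from the target blend -- the
    quantity the solver should be minimizing."""
    return sum((b - t) ** 2 for b, t in zip(blend(weights), TARGET))

# Example: an even split across the three hypothetical mixes.
print(squared_error([1/3, 1/3, 1/3]))
```

If the value Excel reports for its solution is smaller than this function's value for the Solver Foundation solution, the two solvers are genuinely disagreeing on the same objective; if not, they may be optimizing different things.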
Are you sure you are minimizing the same quantity in both places?
Maybe the two methods are using different error measures.
For instance, you seem to be using R^2 to judge your solution; is that what your C# code is actually using as its measure of distance from a perfect fit?