When setting a large number as the value for a Range (cell), the number that is written to the spreadsheet is different from the original number. For example, if I set 42300000000, the number in Excel becomes -649672960. This doesn't happen with smaller numbers.
I tested it with the basic project sample from Visual Studio; I just replaced the original loadSampleData function with:
function loadSampleData() {
    var values = [
        [4230, 42300, 423000],
        [4230000, 42300000, 423000000],
        [4230000000, 42300000000, 423000000000]
    ];
    // Run a batch operation against the Excel object model
    Excel.run(function (ctx) {
        // Create a proxy object for the active sheet
        var sheet = ctx.workbook.worksheets.getActiveWorksheet();
        // Queue a command to write the sample data to the worksheet
        sheet.getRange("B3:D5").values = values;
        // Run the queued-up commands, and return a promise to indicate task completion
        return ctx.sync();
    })
    .catch(errorHandler);
}
When I run the add-in, I get this in Excel:
4230        42300        423000
4230000     42300000     423000000
-64967296   -649672960   2093204992
Is this some kind of overflow? Am I doing something wrong?
Thanks!
Thanks for reporting this bug.
The problem seems to be in the JSON parser that we are using to deserialize the incoming request. It incorrectly assumes that any integer number fits in int32_t. The correct behavior would be to parse such large values as double, even though they are integers.
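For reference, the numbers you got back are exactly what a 32-bit truncation of the originals produces. A quick sketch in plain JavaScript (outside Office.js) reproduces them by applying the ToInt32 conversion that the | 0 operator performs:

// Applying JavaScript's ToInt32 conversion (what `n | 0` does) to the original
// values yields exactly the numbers that appeared in the worksheet.
[4230000000, 42300000000, 423000000000].forEach(function (n) {
    console.log(n + " -> " + (n | 0));
});
// 4230000000   -> -64967296
// 42300000000  -> -649672960
// 423000000000 -> 2093204992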
Since this isn't Excel code, a fix may take a long time.
Unfortunately, appending .0 or E0 to these literals doesn't drive the parser to treat them as double. As Charles Williams pointed out, enclosing the literals in single or double quotes serves your purpose [for some unknown reason]; a sketch of that workaround applied to your sample is below. I haven't been able to find a more deterministic workaround. Other suggestions will be welcome.
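To illustrate the quoted-literal workaround on your sample, only the last row needs to change. This is just a sketch of the suggestion above, not an official fix; per that suggestion, Excel still ends up with numeric cell values:

function loadSampleData() {
    // Same sample as above, but the values that exceed the int32 range are
    // passed as quoted strings so the serializer never truncates them.
    var values = [
        [4230, 42300, 423000],
        [4230000, 42300000, 423000000],
        ["4230000000", "42300000000", "423000000000"]
    ];
    Excel.run(function (ctx) {
        var sheet = ctx.workbook.worksheets.getActiveWorksheet();
        sheet.getRange("B3:D5").values = values;
        return ctx.sync();
    })
    .catch(errorHandler);
}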
Zlatko