My code creates over 40 different arrays across many subroutines, each one 1-dimensional and declared with the most memory-efficient data type. For example:
Dim N As Long
N = 40000
Dim my_array() As Double
ReDim my_array(1 To N)
Dim i As Long
For i = 1 To N
    my_array(i) = Rnd
Next i
In most cases the process runs at around 40,000 rows, and I can write to worksheets fine using Transpose:
rng.Resize(N, 1).Value = Application.WorksheetFunction.Transpose(my_array)
But now that I have scenarios over ~65k rows, up to 200,000 in some cases, I've been noticing #N/A errors when the data is printed to the sheet (as opposed to the documented Transpose type-mismatch error in certain Excel versions).
I would like to get the arrays onto the sheet efficiently without losing information.
Ideas I have thought about:
Do I re-loop all the arrays into 2-D Variant arrays so I can write directly without Transpose? This may drastically increase my memory usage.
Do I break my transposes programmatically into sectioned loops that fit within the ~65k size limit? This seems cumbersome but interesting.
And then should I write a function that does either, and apply it to each array individually, or wrap all the arrays into the process at once within the general subroutine?
Any thoughts on what I should do, or code examples that have worked for you on this issue? Thanks
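For reference, my idea 2 above would look roughly like this (a sketch only, untested at scale; the sub name and CHUNK size are my own choices):

' Hypothetical helper: write a 1-D Double array to a column in chunks
' small enough for Transpose to handle (kept safely under ~65k rows).
Sub WriteArrayInChunks(rng As Range, arr() As Double)
    Const CHUNK As Long = 60000
    Dim lo As Long, hi As Long, i As Long
    Dim part() As Double
    lo = LBound(arr)
    Do While lo <= UBound(arr)
        hi = lo + CHUNK - 1
        If hi > UBound(arr) Then hi = UBound(arr)
        ' Copy the current slice into a temporary 1-D array
        ReDim part(1 To hi - lo + 1)
        For i = lo To hi
            part(i - lo + 1) = arr(i)
        Next i
        ' Transpose only the slice, then write it below the previous one
        rng.Offset(lo - LBound(arr), 0).Resize(hi - lo + 1, 1).Value = _
            Application.WorksheetFunction.Transpose(part)
        lo = hi + 1
    Loop
End Sub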
To demonstrate the point Tim and I made: you should work only with 2-D arrays and do away with the 1-D arrays altogether. No need for any conversion at all.
You'll find this faster (no 1-D-to-2-D conversion), and the difference in memory use is insignificant.
Also, you can use whichever data types you want: it's Transpose that requires a Variant, not the write to the sheet. And Transpose fails for row counts above ~65,000.
Your test code could be:
Sub testing_T()
    Dim N As Long
    N = 100000
    Dim dbl_arr() As Double
    ReDim dbl_arr(1 To N, 1 To 1)
    Dim i As Long
    For i = 1 To N
        dbl_arr(i, 1) = Rnd
    Next i
    ' A 2-D array writes directly to the range; no Transpose needed
    Cells(2, 1).Resize(N, 1).Value = dbl_arr
End Sub