I currently have five columns in Excel which I need to read and store in MATLAB variables. I use the following code:
TE=xlsread('../input/input.xlsx','A:A');
AF=xlsread('../input/input.xlsx','B:B');
TAHE=xlsread('../input/input.xlsx','C:C');
HD=xlsread('../input/input.xlsx','D:D');
TCW=xlsread('../input/input.xlsx','E:E');
This takes 11 seconds when input.xlsx contains 14 rows. With 8760 rows (which will be the number of rows in my final input.xlsx), the time consumed is about the same.
The bottleneck seems to be opening the Excel file. Am I right? How can I minimize the time consumption?
To me, it seems like MATLAB opens the Excel file five times, when opening it once should be enough. How can I improve my code?
EDIT: By using the following code, the time consumption was reduced by about 2 seconds (still rather slow):
temp=xlsread('../input/input.xlsx','A:E');
TE=temp(:,1);
AF=temp(:,2);
TAHE=temp(:,3);
HD=temp(:,4);
TCW=temp(:,5);
From the xlsread documentation:

num = xlsread(filename,sheet,xlRange,'basic') reads data from the spreadsheet in basic mode, the default on systems without Excel for Windows. If you do not specify all the arguments, use empty strings as placeholders, for example, num = xlsread(filename,'','','basic').
My understanding of this is that on Windows machines with Excel installed, MATLAB actually calls Excel and lets it read the data and pass it back to MATLAB, whereas otherwise (without Excel, without Windows, or with explicit 'basic' mode) the file is read by a native MATLAB implementation. The latter may be faster, because the Excel startup alone can take some time.
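If the overhead really is Excel startup, you could try forcing basic mode on your combined read. This is only a sketch based on the documented signature above (the path and column split are taken from your question), and whether it is actually faster on your machine would need to be measured:

% Read columns A:E in one call, in basic mode, so Excel is never started.
% The empty string is a placeholder for the sheet argument.
temp = xlsread('../input/input.xlsx', '', 'A:E', 'basic');
TE   = temp(:,1);
AF   = temp(:,2);
TAHE = temp(:,3);
HD   = temp(:,4);
TCW  = temp(:,5);

Note that basic mode has some restrictions (for example, it only reads the saved cell values and does not re-evaluate formulas), so check that the imported numbers still look right.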