I have a page that uploads a file and processes it with apex_data_parser when the user clicks a button.
I tested with the same file reduced to 750K rows and 40 MB (I deleted a column containing some description text to stay below 50 MB) and it took 15 minutes.
Does anybody have any idea why the time is so different when the file is above 50 MB?
Reading the API documentation, the limits are 2 GB and 4 GB for the file.
I took the SQL from the sample upload application in the App Gallery:
FOR r_temp IN (SELECT line_number, col001, col002, col003, col004,
                      col005, col006, col007, col008
                 FROM apex_application_temp_files f,
                      TABLE( apex_data_parser.parse(
                                 p_content   => f.blob_content,
                                 p_file_type => 2,
                                 p_skip_rows => 1,
                                 p_file_name => f.filename ) ) p
                WHERE f.name = p_name)
LOOP
...
Column Position   Column Name     Data Type       Format Mask
1                 LINE            NUMBER          -
2                 ACCOUNT         NUMBER          -
3                 DATETIME_CALL   DATE            YYYY"-"MM"-"DD" "HH24":"MI":"SS
4                 TYPE_CALL       NUMBER          -
5                 CALL            NUMBER          -
6                 DURATION        NUMBER          -
7                 UNIT            VARCHAR2(50)    -
8                 PRICE           NUMBER          -
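To show how the parser's col00x columns line up with the profile above, here is a minimal sketch of a loop body; the target table "calls_staging" and its column names are placeholders, not my real ones:

-- inside the LOOP; table and column names are hypothetical
INSERT INTO calls_staging
       (line_no, account_no, datetime_call, type_call, call_no, duration, unit, price)
VALUES (to_number(r_temp.col001),
        to_number(r_temp.col002),
        to_date(r_temp.col003, 'YYYY-MM-DD HH24:MI:SS'),
        r_temp.col004,
        to_number(r_temp.col005),
        to_number(r_temp.col006),
        r_temp.col007,
        to_number(r_temp.col008));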
To simplify the problem, I changed the SQL statement to a simple count(*).
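That is, something like this, with the same parser arguments and bind as above:

SELECT count(*)
  FROM apex_application_temp_files f,
       TABLE( apex_data_parser.parse(
                  p_content   => f.blob_content,
                  p_file_type => 2,
                  p_skip_rows => 1,
                  p_file_name => f.filename ) ) p
 WHERE f.name = p_name;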
I created a demo account on Oracle Cloud and started an Autonomous Transaction Processing instance to test with the same file and the same application. The results: the file greater than 50 MB took 6 hours to execute the SQL count statement (see the attachment below); the file with 48 MB took 3 minutes to execute the same SQL count statement. Maybe an apex_data_parser limit?
The chart below is interesting: in my tests, User I/O goes up a lot only with files above 50 MB. I took the 50 MB file that had processed OK in 3 minutes and copied some rows to grow it to 70 MB (so the file is not corrupt).
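For anyone who wants to reproduce the size effect, an oversized but well-formed test file can also be generated directly in PL/SQL instead of copying rows by hand. This is only a sketch and assumes a plain CSV layout; the header and row values below are made up, only the resulting size matters:

-- generate a ~70 MB test CSV as a temporary BLOB (synthetic content)
DECLARE
    l_csv blob;
    l_hdr raw(200) := utl_raw.cast_to_raw(
        'LINE,ACCOUNT,DATETIME_CALL,TYPE_CALL,CALL,DURATION,UNIT,PRICE' || chr(10));
    l_row raw(200) := utl_raw.cast_to_raw(
        '1,123456,2019-01-01 10:00:00,LOCAL,987654,60,SEC,0.10' || chr(10));
BEGIN
    dbms_lob.createtemporary(l_csv, true);
    -- one header line, then the same synthetic row repeated until ~70 MB
    dbms_lob.writeappend(l_csv, utl_raw.length(l_hdr), l_hdr);
    WHILE dbms_lob.getlength(l_csv) < 70 * 1024 * 1024 LOOP
        dbms_lob.writeappend(l_csv, utl_raw.length(l_row), l_row);
    END LOOP;
    dbms_output.put_line('generated ' || dbms_lob.getlength(l_csv) || ' bytes');
END;
/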
I believe the answer can be found in this question: Oracle APEX apex_data_parser execution time explodes for big files