Hello All! I'm loading a huge Excel file (35k rows) with api_datasource.importFromFile. What can I do? Are there any parallel execution features, or some Oracle tricks, to speed up the process? Or is there perhaps a maximum number of rows that importFromFile can process? Regards, Dmitry.
Hi Dmitry, There is no feature to speed up the current process, and there is no predefined limit on the number of rows that can be processed. Are you experiencing any problem during the import? Regards,
Hello, sorry for the late reply. I finally managed to load a file using importFromFile. A 6 MB Excel file with 35k rows loaded in about 10 minutes, which is acceptable. But another issue is that most browsers (except Firefox) seem to have some kind of timeout. I have some further steps after the end of the load (for example, a message that everything has loaded), but they never fire. If I load a smaller file, everything goes fine. So the question is whether we can commit after some number of rows have loaded, and/or whether there is any workaround for the "browser not responding" issue? Best regards, Dmitry
The Excel file has a simple structure: 11 columns (the 1st a number, the 2nd a date, and 9 strings), 35k rows. The table for the import has 11 fields of varchar2(256). Code:

declare
  l_ColumnNames api_datasource.tt_datasourceColumnNames;
  l_blob        blob;
begin
  ................
  -- map each target datasource column, in Excel column order
  l_ColumnNames(1)  := 'FIELD1';
  l_ColumnNames(2)  := 'FIELD2';
  l_ColumnNames(3)  := 'FIELD3';
  l_ColumnNames(4)  := 'FIELD4';
  l_ColumnNames(5)  := 'FIELD5';
  l_ColumnNames(6)  := 'FIELD6';
  l_ColumnNames(7)  := 'FIELD7';
  l_ColumnNames(8)  := 'FIELD8';
  l_ColumnNames(9)  := 'FIELD9';
  l_ColumnNames(10) := 'FIELD10';
  l_ColumnNames(11) := 'FIELD11';

  -- import the uploaded .xls file into the datasource
  api_datasource.importFromFile(
    in_datasourceName_tx       => 'ds_excel_data',
    in_file_bl                 => l_blob,
    in_fileFormat_cd           => api_datasource.FILE_FORMAT_XLS,
    in_datasourceColumnNames_t => l_ColumnNames);
  ..................
end;
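As for the "browser not responding" question above, one possible workaround (only a sketch of an idea, not a built-in Formspider feature, and whether api_datasource.importFromFile can run outside the request session is something to verify) is to hand the long-running import off to a background DBMS_SCHEDULER job so the browser request returns immediately, and let the page poll for completion. STAGED_FILES and LOAD_EXCEL_FILE below are hypothetical names used for illustration:

declare
  l_blob blob;  -- the uploaded file, e.g. from a fileUpload component
begin
  -- stage the blob in a table so the background job can read it;
  -- STAGED_FILES is a hypothetical staging table
  insert into staged_files (id, file_bl) values (1, l_blob);
  commit;

  -- run the heavy import asynchronously; LOAD_EXCEL_FILE is a
  -- hypothetical wrapper around the importFromFile code above
  dbms_scheduler.create_job(
    job_name   => 'EXCEL_IMPORT_JOB',
    job_type   => 'PLSQL_BLOCK',
    job_action => 'begin load_excel_file(in_fileId_nr => 1); end;',
    enabled    => true,
    auto_drop  => true);
end;

The UI would then poll (or otherwise be notified of) a completion flag written by the job, instead of holding the HTTP request open until the import finishes.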
Hello Team, are there any updates regarding the "browser not responding" issue? (Please see the original question from Dmitry.) Best regards, Anatoly

Hi Anatoly, That seems to be a Google Chrome bug. We are looking for a workaround. We'll update this thread when we have one. Kind Regards,
(16 Dec '13, 09:48)
Yalim Gerger ♦♦
Sorry for resurrecting this thread, but... I have an .xls file which is 3 MB in size and has 22k rows with 10 fields. The import appears to work, but takes a really long time: 24 minutes. Is this about normal? Simon

Hi Simon, 24 minutes for this seems odd to me. Does it include the upload time, if you're using a fileUpload component? If so, is it possible to check what the upload time is by commenting out the api_datasource.importFromFile API call from your procedure? Kind regards,
(13 Jun '16, 08:49)
Serdar ♦♦
see info below, couldn't format the code in this comment :)
(13 Jun '16, 09:50)
apacheuk
Added some debug to the code to get some timings; results are below, as well as the relevant code...
[debug output not preserved]
As you can see, it's the actual importFromFile call that takes the time; the upload itself takes no time at all.
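In case it helps anyone reproduce these timings, here is a minimal sketch of that kind of debug (using only Oracle's dbms_utility.get_time; the importFromFile call itself is the one from the code posted earlier):

declare
  l_start       pls_integer;
  l_ColumnNames api_datasource.tt_datasourceColumnNames;
  l_blob        blob;
begin
  -- ... populate l_blob and l_ColumnNames as in the earlier post ...

  l_start := dbms_utility.get_time;  -- ticks in hundredths of a second

  api_datasource.importFromFile(
    in_datasourceName_tx       => 'ds_excel_data',
    in_file_bl                 => l_blob,
    in_fileFormat_cd           => api_datasource.FILE_FORMAT_XLS,
    in_datasourceColumnNames_t => l_ColumnNames);

  dbms_output.put_line('importFromFile took '
    || (dbms_utility.get_time - l_start) / 100 || ' seconds');
end;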
Any update on this performance issue?

Hi Simon, We're on this issue. I'll let you know as soon as we find something. By the way, can you share your Excel file via e-mail, if possible? Kind Regards, Serdar
(23 Jun '16, 06:46)
Serdar ♦♦
There is no way for me to do that due to UK security restrictions, but I can probably tell you what the column data types are, how many columns we have, and the total number of rows, if that would help?
(23 Jun '16, 07:49)
apacheuk
Still have an issue with this, and while looking into it again I noticed we are using api_datasource.FILE_FORMAT_XLS as the file format. There is also an api_datasource.FILE_FORMAT_XLSX in the API; it could just be a red herring, but I figured I'd try it. Whenever I set the format to api_datasource.FILE_FORMAT_XLSX (and save the file in the matching format) and try to import, I get the following error: [error message not preserved]
Does the api_datasource.FILE_FORMAT_XLSX option not work?
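For reference, the only change relative to the working .xls call posted earlier would be the format constant (the datasource name and column mapping below are carried over from that post):

declare
  l_ColumnNames api_datasource.tt_datasourceColumnNames;
  l_blob        blob;
begin
  -- ... populate l_blob and l_ColumnNames as before ...

  api_datasource.importFromFile(
    in_datasourceName_tx       => 'ds_excel_data',
    in_file_bl                 => l_blob,
    in_fileFormat_cd           => api_datasource.FILE_FORMAT_XLSX,  -- .xlsx instead of .xls
    in_datasourceColumnNames_t => l_ColumnNames);
end;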
Been trying to get to the bottom of this performance issue, and in case it helps, here is the output from a trace for the query that I think is causing the performance issue; I believe this is where it is parsing the XML. [trace output not preserved]
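For anyone who wants to capture a comparable trace, one standard (non-Formspider-specific) way is Oracle's DBMS_MONITOR package:

-- enable SQL trace with wait events and bind values for the current session
begin
  dbms_monitor.session_trace_enable(waits => true, binds => true);
end;

-- ... run the import here ...

begin
  dbms_monitor.session_trace_disable;
end;

The resulting trace file in the database's trace directory can then be formatted with tkprof to show where the time goes.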