Good day Steven. On 10g and higher, we still have to convert to BULK COLLECT (with LIMIT) if we are retrieving a high volume of records, because of PGA memory issues, right? (I'm referring to the section 'When to convert to BULK COLLECT'.) Or is that taken care of by the optimizer?
That's correct. If you do a SELECT BULK COLLECT INTO without LIMIT, there is no way for PL/SQL to do "pagination". You are filling up a collection once and then iterating through that collection.
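A minimal sketch of that pattern, assuming an illustrative employees table and fetching 100 rows per batch (all names here are hypothetical, not from the article):

   DECLARE
      CURSOR emp_cur IS
         SELECT last_name FROM employees;  -- illustrative query

      TYPE name_tab_t IS TABLE OF employees.last_name%TYPE;
      l_names name_tab_t;
   BEGIN
      OPEN emp_cur;
      LOOP
         FETCH emp_cur BULK COLLECT INTO l_names LIMIT 100;

         -- Process this batch; at most 100 rows sit in PGA memory at a time.
         FOR i IN 1 .. l_names.COUNT LOOP
            NULL;  -- replace with per-row processing
         END LOOP;

         -- Test COUNT, not %NOTFOUND: with LIMIT, %NOTFOUND becomes TRUE
         -- as soon as a fetch returns fewer than LIMIT rows, so testing it
         -- before processing would silently skip the final partial batch.
         EXIT WHEN l_names.COUNT < 100;
      END LOOP;
      CLOSE emp_cur;
   END;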
Does data flush from memory when using the LIMIT clause? I have the following code. Will it work for 1 billion records? My question is: will all the data remain in memory after completion of the loop?

   OPEN CUR_CQ;
   LOOP
      FETCH CUR_CQ BULK COLLECT INTO CQ_TABLE$ LIMIT 100;
      EXIT WHEN CUR_CQ%NOTFOUND;
   END LOOP;
   CLOSE CUR_CQ;
Thanks Steve, as always...
Mind-blowing explanation
thank you sir
excellent, thank you!
Got it. Thanks, Steve
Do you have any opportunity for a PL/SQL developer with 3+ years of experience?