Fast way to dump huge Oracle table to file? - VisualCron - Forum


I have Oracle stored procedures that "return" ref cursors. I then use a PowerShell task in VC to load them into a DataSet and write them to a file, but that chokes on a 20-million-row table.

How do you folks write huge Oracle tables to a file via VC? I don't want a method that reads all the data into an object first, as I'm afraid of memory issues. Ideally, a method that uses a PowerShell or .NET task to read from the stored procedure's ref cursor and write directly to a file would be perfect, but I'm open to suggestions! Looking for best practices here, I guess.

James Hagist
Have you thought about breaking it up into smaller amounts of data? Something along the lines of pagination in a REST API.

So if you have 20 million rows, break it up into four files of 5 million each and loop through them.

Hopefully your table has a key that you can use; if not, maybe the DB owner could build you a view that includes a row ID.

Then it is as simple as a count and divide.
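The count-and-divide idea above might look something like this in a VC PowerShell task. This is a rough sketch, assuming a view v_big_table that exposes an RN column built with ROW_NUMBER() over the key; the view name, variable names, and row counts are made up for illustration:

```powershell
# Sketch of count-and-divide chunking (all names/values illustrative).
# $totalRows would come from a SELECT COUNT(*) against the table or view.
$totalRows = 20000000
$chunkSize = 5000000
$chunks    = [math]::Ceiling($totalRows / $chunkSize)

for ($i = 0; $i -lt $chunks; $i++) {
    $lo  = $i * $chunkSize + 1
    $hi  = ($i + 1) * $chunkSize
    $sql = "SELECT * FROM v_big_table WHERE rn BETWEEN $lo AND $hi"
    # ... run $sql here and write chunk $i to its own file ...
    Write-Host "Chunk $i -> rows $lo..$hi"
}
```

Each iteration only ever touches one chunk's worth of rows, so memory stays bounded regardless of the table size.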
My last attempt, a PowerShell script that reads 1-million-row chunks at a time into a DataSet and then writes them out with Export-Csv, takes about 15 minutes per million rows. That won't do. My next try is to use streams, but I think I need to eliminate the read-into-a-DataSet step.

If Toad can dump it out in minutes I know there's a faster way!

Thanks for your reply,
I used to work with Oracle, but I'm pretty rusty. Anyway, according to Tom Kyte, the order performance-wise (slowest to fastest) seems to be the following. AskTom

1) UTL_FILE, i.e. writing to a file directly from an Oracle procedure. Assumes you have access to the Oracle DB server's file system. This is the slowest, which surprised me a bit.

2) Using SQL*Plus; an example is here: Stackoverflow . My guess is that this is what SQL Developer or Toad does behind the scenes.

3) Pure C code. But pure C is too hard for me.
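Option 2 can be driven from a VC PowerShell task by generating a spool script and handing it to sqlplus. A minimal sketch, assuming sqlplus is on the PATH; the connection string, file paths, table, and column names are all placeholders, not from this thread:

```powershell
# Sketch: spool a table to CSV via SQL*Plus (all names/paths illustrative).
# The here-string becomes a throwaway SQL*Plus script.
$script = @"
SET TERMOUT OFF
SET FEEDBACK OFF
SET HEADING OFF
SET PAGESIZE 0
SET LINESIZE 32767
SET TRIMSPOOL ON
SPOOL C:\exports\big_table.csv
SELECT col1 || ',' || col2 || ',' || col3 FROM big_table;
SPOOL OFF
EXIT
"@
Set-Content -Path C:\exports\dump.sql -Value $script

# -s runs SQL*Plus silently; the @ prefix tells it to execute the script file.
sqlplus -s user/password@tnsalias "@C:\exports\dump.sql"
```

SQL*Plus writes the spool file as it fetches, so nothing accumulates in memory on the client side.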

I would try SQL*Plus, even though it is not super-elegant. I might also try C# with a DataReader: not the versions where you fill a DataSet, but one where you open a DataReader, declare a fetch size, and loop over the rows. I would think that should be reasonably memory-friendly; for performance you have to test a bit. See the example here: Oracle docs . A DataReader can be used with ref cursors as well.
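The DataReader-with-a-fetch-size idea might be sketched like this in PowerShell with ODP.NET (Oracle.ManagedDataAccess). The DLL path, connection string, procedure name, parameter name, and output path below are assumptions for illustration, not details from this thread:

```powershell
# Sketch: stream a ref cursor straight to a CSV file, one row in memory at a time.
# All names/paths/credentials are placeholders.
Add-Type -Path "C:\oracle\Oracle.ManagedDataAccess.dll"

$conn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection(
    "User Id=scott;Password=tiger;Data Source=orcl")
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = "pkg_export.get_big_table"   # assumed stored procedure
$cmd.CommandType = [System.Data.CommandType]::StoredProcedure
$p = New-Object Oracle.ManagedDataAccess.Client.OracleParameter(
    "p_cursor", [Oracle.ManagedDataAccess.Client.OracleDbType]::RefCursor)
$p.Direction = [System.Data.ParameterDirection]::Output
[void]$cmd.Parameters.Add($p)

$reader = $cmd.ExecuteReader()
# Fetch roughly 10,000 rows per round trip instead of the default buffer size.
$reader.FetchSize = $reader.RowSize * 10000

$writer = New-Object System.IO.StreamWriter(
    "C:\exports\big_table.csv", $false, [System.Text.Encoding]::UTF8, 1MB)
try {
    $values = New-Object object[] ($reader.FieldCount)
    while ($reader.Read()) {
        [void]$reader.GetValues($values)        # copy current row into the array
        $writer.WriteLine(($values -join ','))  # no DataSet: write and move on
    }
}
finally {
    $writer.Dispose(); $reader.Dispose(); $conn.Dispose()
}
```

Tuning FetchSize is where the time usually goes: too small means many round trips, too large wastes memory, so it is worth benchmarking a few values. Note the join above does no CSV quoting, so fields containing commas would need escaping in a real export.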
Thanks Thomas, my next try was going to be PowerShell using a DataReader and a stream writer. I'll post back with results, but I have some other pressing things right now, so in the meantime I'll have to export manually via Toad. I'll check out the links you provided.