Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Cannot Reproduce
- Affects Version/s: 1.8.6
- Fix Version/s: None
- Component/s: None
- Environment: Groovy script on a Solaris server, Oracle 10g server on the same subnet.
Description
I did some performance tests with two SQL insert files, both containing the same 10k records. In the first file I manually added a COMMIT statement after every 1000 records. I then executed it with "time sqlplus user/pass@db @1st_inserts.sql > /dev/null"; sqlplus finished loading all the records in 13 seconds.
The second file I ran through my Groovy script like this:

c = 1
ticks = System.currentTimeMillis()
to_db.withBatch(1000) { stmt ->
    new File("2nd_inserts.sql").eachLine { line ->
        stmt.addBatch(line)
        c++
    }
}
ticks = System.currentTimeMillis() - ticks
rec_count = c - 1
out << "\n$rec_count rows imported, elapsed ${ticks}ms\n"
Using my Groovy script to load the same 10k records took 28 seconds. What do I need to do to improve the batch insert performance? I am trying to avoid calling sqlplus from within my Groovy script.
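For reference, one commonly suggested alternative is to batch a single parameterized INSERT instead of adding 10k distinct SQL strings, so the driver can reuse one prepared statement. A minimal sketch, assuming the data is available as delimited values rather than full INSERT statements; the connection string, table name, columns, and file name here are placeholders, not taken from the original report:

```groovy
import groovy.sql.Sql

// Placeholder connection details -- adjust for the real environment.
def to_db = Sql.newInstance('jdbc:oracle:thin:@dbhost:1521:orcl',
                            'user', 'pass', 'oracle.jdbc.OracleDriver')

def ticks = System.currentTimeMillis()
def count = 0
// withBatch(size, sql) hands the closure a wrapper around a single
// PreparedStatement, so each addBatch() only binds parameters instead
// of parsing a new statement per row.
to_db.withBatch(1000, 'INSERT INTO my_table (id, name) VALUES (?, ?)') { ps ->
    new File('2nd_data.csv').eachLine { line ->
        def (id, name) = line.split(',')   // assumes comma-separated rows
        ps.addBatch([id as Integer, name])
        count++
    }
}
ticks = System.currentTimeMillis() - ticks
println "$count rows imported, elapsed ${ticks}ms"
```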
Thanks,
Edmond