Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Cannot Reproduce
    • Affects Version/s: 1.8.6
    • Fix Version/s: None
    • Component/s: SQL processing
    • Labels:
      None
    • Environment:
      Groovy script on a Solaris server, Oracle 10g server on the same subnet.

      Description

      I did a performance test with two SQL insert files; both contain the same 10k records. In the first file I manually put a COMMIT statement after every 1000 records. I then ran it with "time sqlplus user/pass@db @1st_inserts.sql > /dev/null", and sqlplus finished loading all the records in 13 seconds.

      The second file I load with my Groovy script, like:

      c = 1
      ticks = System.currentTimeMillis()

      to_db.withBatch(1000) { stmt ->
        new File("2nd_inserts.sql").eachLine { line ->
          stmt.addBatch(line)
          c++
        }
      }

      ticks = System.currentTimeMillis() - ticks

      rec_count = c - 1

      out << "\n$rec_count rows imported, elapsed ${ticks}ms\n"
      

      Loading the same 10k records with my Groovy script took 28 seconds. What do I need to do to improve the batch insert performance? I am trying to avoid calling sqlplus from within my Groovy script.

      Thanks,

      Edmond
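One likely contributor to the gap is that `stmt.addBatch(line)` sends each line as a distinct literal SQL statement, so the server has to parse every insert individually; a prepared statement with bind variables (for example, Groovy's `Sql.withBatch(int, String)` overload, which yields a prepared-statement wrapper) lets the server parse once and bind many times, and a single commit at the end avoids per-row transaction overhead. As a rough, self-contained illustration of that principle (not the reporter's Oracle setup), here is a Python `sqlite3` sketch with a hypothetical table `t` and synthetic rows, comparing literal per-row autocommitted inserts against one parameterized batch in a single transaction:

```python
import sqlite3
import time

# In-memory database standing in for the Oracle target in the report.
conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # autocommit: each statement is its own transaction
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")

# 10k synthetic rows (hypothetical data, mirroring the 10k-record test files).
rows = [(i, "name%d" % i) for i in range(10000)]

# Slow pattern: one literal SQL string per row, parsed and committed each time,
# analogous to addBatch()ing raw INSERT statements read from a file.
t0 = time.perf_counter()
for i, name in rows:
    conn.execute("INSERT INTO t VALUES (%d, '%s')" % (i, name))
slow = time.perf_counter() - t0

conn.execute("DELETE FROM t")

# Fast pattern: one prepared statement with bind parameters,
# all rows in a single explicit transaction.
t0 = time.perf_counter()
conn.execute("BEGIN")
conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
conn.execute("COMMIT")
fast = time.perf_counter() - t0

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # → 10000
```

The batched, parameterized variant is typically markedly faster, though the exact ratio depends on the database and driver; against Oracle over JDBC, the parse-once/bind-many effect is usually the dominant factor.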

        Activity

        No work has yet been logged on this issue.

          People

          • Assignee:
            Pascal Schumacher
          • Reporter:
            Edmond Wong
          • Votes:
            0
          • Watchers:
            2
