Apache Drill / DRILL-8511

Overflow appeared when the batch reached the rows limit


Details

    • Type: Bug
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.21.2
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

    Description

       

      Drill fails to read a JSON file with the exception: java.lang.IllegalStateException: Unexpected state: FULL_BATCH:

      Caused by: java.lang.IllegalStateException: Unexpected state: FULL_BATCH
              at org.apache.drill.exec.physical.resultSet.impl.ResultSetLoaderImpl.overflowed(ResultSetLoaderImpl.java:639)
              at org.apache.drill.exec.physical.resultSet.impl.ColumnState$PrimitiveColumnState.overflowed(ColumnState.java:73)
              at org.apache.drill.exec.vector.accessor.writer.BaseScalarWriter.overflowed(BaseScalarWriter.java:214)
              at org.apache.drill.exec.vector.accessor.writer.AbstractFixedWidthWriter.resize(AbstractFixedWidthWriter.java:249)
              at org.apache.drill.exec.vector.accessor.writer.BitColumnWriter.prepareWrite(BitColumnWriter.java:77)
              at org.apache.drill.exec.vector.accessor.writer.BitColumnWriter.setValueCount(BitColumnWriter.java:87)
              at org.apache.drill.exec.vector.accessor.writer.AbstractFixedWidthWriter.endWrite(AbstractFixedWidthWriter.java:299)
              at org.apache.drill.exec.vector.accessor.writer.NullableScalarWriter.endWrite(NullableScalarWriter.java:298)
              at org.apache.drill.exec.vector.accessor.writer.AbstractTupleWriter.endWrite(AbstractTupleWriter.java:366)
              at org.apache.drill.exec.physical.resultSet.impl.RowSetLoaderImpl.endBatch(RowSetLoaderImpl.java:101)
              at org.apache.drill.exec.physical.resultSet.impl.ResultSetLoaderImpl.harvestNormalBatch(ResultSetLoaderImpl.java:730)
              at org.apache.drill.exec.physical.resultSet.impl.ResultSetLoaderImpl.harvest(ResultSetLoaderImpl.java:700)
              at org.apache.drill.exec.physical.impl.scan.project.ReaderSchemaOrchestrator.endBatch(ReaderSchemaOrchestrator.java:137)
              at org.apache.drill.exec.physical.impl.scan.framework.ShimBatchReader.next(ShimBatchReader.java:148)
              at org.apache.drill.exec.physical.impl.scan.ReaderState.readBatch(ReaderState.java:400)
              at org.apache.drill.exec.physical.impl.scan.ReaderState.next(ReaderState.java:361)
              at org.apache.drill.exec.physical.impl.scan.ScanOperatorExec.nextAction(ScanOperatorExec.java:270)
              at org.apache.drill.exec.physical.impl.scan.ScanOperatorExec.next(ScanOperatorExec.java:242)
              at org.apache.drill.exec.physical.impl.protocol.OperatorDriver.doNext(OperatorDriver.java:201)
              at org.apache.drill.exec.physical.impl.protocol.OperatorDriver.start(OperatorDriver.java:179)
              at org.apache.drill.exec.physical.impl.protocol.OperatorDriver.next(OperatorDriver.java:129)
              at org.apache.drill.exec.physical.impl.protocol.OperatorRecordBatch.next(OperatorRecordBatch.java:149)
              at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:109)
              at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:101)
              at org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:59)
              at org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:93)
              at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:161)
              at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:103)
              at org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:81)
              at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:93)
              at org.apache.drill.exec.work.fragment.FragmentExecutor.lambda$run$0(FragmentExecutor.java:324)
              at .......(:0)
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:2012)
              at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:313)
              at org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
              at .......(:0) 

      The overflow occurred when the batch reached the rows limit in the JSON reader.
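
      For context, the message is thrown by ResultSetLoaderImpl.overflowed(), which treats an overflow request in the FULL_BATCH state as illegal. The sketch below is a minimal, hypothetical model of that state check (the class, enum, and method names are illustrative assumptions, not Drill's actual code; only the "Unexpected state" message mirrors ResultSetLoaderImpl). It shows why a column writer that asks for more space while the loader is already harvesting a full batch produces exactly this error:

      // Hypothetical model of the loader state machine behind the error above.
      // Names (MiniLoader, State, requestOverflow, startHarvest) are assumptions
      // for illustration only; the real logic lives in ResultSetLoaderImpl.
      public class MiniLoader {

        enum State { ACTIVE, OVERFLOW, FULL_BATCH }

        private State state = State.ACTIVE;

        /** Called by a column writer when its value vector runs out of space. */
        void requestOverflow() {
          if (state != State.ACTIVE) {
            // Mirrors the failure in the stack trace: the batch has already hit
            // the row limit and is being harvested, so overflow is unexpected.
            throw new IllegalStateException("Unexpected state: " + state);
          }
          state = State.OVERFLOW;
        }

        /** Called when the batch reaches the row limit and harvesting begins. */
        void startHarvest() {
          state = State.FULL_BATCH;
        }

        public static void main(String[] args) {
          MiniLoader loader = new MiniLoader();
          loader.startHarvest();     // batch reached the row limit
          loader.requestOverflow();  // a writer still asks to grow -> IllegalStateException
        }
      }

      In the trace above, the overflow request comes from BitColumnWriter.setValueCount() during harvestNormalBatch(), i.e. after the loader has already switched to FULL_BATCH, which matches the failing check.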

      To reproduce the issue, execute the following query against the attached file:

       

      SELECT id, 
               gbyi, 
               gbyt, 
               fl, 
               nul, 
               bool, 
               str, 
               sia, 
               sfa, 
               soa, 
               ooa, 
               oooi, 
               ooof, 
               ooos, 
               oooa 
        FROM   dfs.tmp.`complex.json` 
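
      For anyone scripting the reproduction, a minimal JDBC harness along the following lines should surface the failure. The connection URL, having the drill-jdbc-all driver on the classpath, and placing complex.json (extracted from the attached complex.zip) under /tmp so that dfs.tmp resolves it are all assumptions about the local setup, not part of this report:

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.ResultSet;
      import java.sql.Statement;

      // Assumed setup: a Drillbit running locally and complex.json available in the
      // dfs.tmp workspace. Adjust the URL and file location for your environment.
      public class Drill8511Repro {
        public static void main(String[] args) throws Exception {
          try (Connection conn = DriverManager.getConnection("jdbc:drill:drillbit=localhost");
               Statement stmt = conn.createStatement();
               ResultSet rs = stmt.executeQuery(
                   "SELECT id, gbyi, gbyt, fl, nul, bool, str, sia, sfa, soa, ooa, "
                       + "oooi, ooof, ooos, oooa FROM dfs.tmp.`complex.json`")) {
            long rows = 0;
            // The server-side IllegalStateException surfaces here as a SQLException
            // while result batches are being fetched.
            while (rs.next()) {
              rows++;
            }
            System.out.println("Read " + rows + " rows without error");
          }
        }
      }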

       

       

       

      Attachments

        1. complex.zip (10.75 MB, uploaded by Maksym Rymar)


          People

            Assignee: Maksym Rymar (rymarm)
            Reporter: Maksym Rymar (rymarm)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated: