Spark / SPARK-16624

Generated SpecificColumnarIterator code can exceed JVM size limit for cached DataFrames

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.6.2
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None
    • Environment: ./bin/spark-sql --master=yarn --queue queue3 --driver-memory 4g --driver-java-options -XX:MaxPermSize=1g --num-executors 8 --executor-memory 4g --conf spark.yarn.am.memory=4096m --hiveconf hive.cli.print.header=false -S -e "large sql command"

    Description

      java.lang.Exception: failed to compile: org.codehaus.janino.JaninoRuntimeException: Code of method "(Lorg/apache/spark/sql/catalyst/expressions/GeneratedClass$SpecificUnsafeProjection;Lorg/apache/spark/sql/catalyst/InternalRow;)V" of class "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection" grows beyond 64 KB

      SQL: the query text is about 12 KB.
      Data: more than 300 million rows.
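For context, this class of failure typically appears when a query touches very many columns or aggregates: Spark's code generation emits the projection as a single Java method, and the JVM rejects any method whose bytecode exceeds 64 KB. A minimal sketch (column and table names are hypothetical) of how a wide query quickly reaches the ~12 KB SQL size reported above, with each aggregate adding more generated code:

```python
# Sketch: construct a wide aggregate query of the kind that can push
# Spark's generated SpecificUnsafeProjection past the 64 KB method limit.
# Column names c0..c399, alias names s0..s399, and the table name are
# hypothetical placeholders, not taken from the original report.
cols = [f"SUM(c{i}) AS s{i}" for i in range(400)]
sql = "SELECT " + ", ".join(cols) + " FROM wide_table GROUP BY grp"

# The SQL text alone is already over 7 KB; the generated Java for it
# is far larger, since every column contributes evaluation code to
# one method body.
print(len(sql))
```

The resolution "Duplicate" suggests the underlying limitation was tracked and fixed elsewhere; later Spark releases split oversized generated methods rather than emitting one monolithic body.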

Attachments

Issue Links

Activity

    There are no comments yet on this issue.

People

    Assignee: Unassigned
    Reporter: wuzhilon88 (吴志龙)
    Votes: 0
    Watchers: 2

Dates

    Created:
    Updated:
    Resolved: