Spark / SPARK-22510 Exceptions caused by 64KB JVM bytecode or 64K constant pool entry limit / SPARK-22523

Janino throws StackOverflowError on nested structs with many fields


Details

    • Type: Sub-task
    • Status: Closed
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: Spark Core, SQL
    • Labels: None
    • Environment:
      • Linux
      • Scala: 2.11.8
      • Spark: 2.2.0

    Description

      When running the application below, Janino throws a StackOverflowError while compiling the generated code:

      Exception in thread "main" java.lang.StackOverflowError
      	at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:370)
      	at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
      	at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
      	at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
      	at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
      

      Problematic code:

      Example.scala
      import org.apache.spark.sql._
      
      case class Foo(
        f1: Int = 0,
        f2: Int = 0,
        f3: Int = 0,
        f4: Int = 0,
        f5: Int = 0,
        f6: Int = 0,
        f7: Int = 0,
        f8: Int = 0,
        f9: Int = 0,
        f10: Int = 0,
        f11: Int = 0,
        f12: Int = 0,
        f13: Int = 0,
        f14: Int = 0,
        f15: Int = 0,
        f16: Int = 0,
        f17: Int = 0,
        f18: Int = 0,
        f19: Int = 0,
        f20: Int = 0,
        f21: Int = 0,
        f22: Int = 0,
        f23: Int = 0,
        f24: Int = 0
      )
      
      case class Nest[T](
        a: T,
        b: T
      )
      
      object Nest {
        def apply[T](t: T): Nest[T] = new Nest(t, t)
      }
      
      object Main {
        def main(args: Array[String]): Unit = {
          val spark: SparkSession = SparkSession.builder().appName("test").master("local[*]").getOrCreate()
          import spark.implicits._
      
          val foo = Foo(0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
      
          // This line triggers the StackOverflowError reported above:
          Seq.fill(10)(Nest(Nest(foo))).toDS().groupByKey(identity).count().map(s => s).collect()
        }
      }
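
      As a sketch of possible workarounds (these are standard Spark/JVM options, but they have not been verified against this exact reproduction; `example.jar` is a placeholder name for a jar containing the `Main` class above):

      ```shell
      # Option 1: disable whole-stage code generation, so Spark falls back to
      # interpreted evaluation instead of compiling one large generated method:
      spark-submit \
        --conf spark.sql.codegen.wholeStage=false \
        --class Main example.jar

      # Option 2: give the driver threads a larger stack, since Janino's
      # flowAnalysis recurses through the generated code (see the trace above):
      spark-submit \
        --driver-java-options "-Xss16m" \
        --class Main example.jar
      ```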
      

            People

              Assignee: Unassigned
              Reporter: Utku Demir (utdemir)
              Votes: 0
              Watchers: 2
