Details
- Type: Sub-task
- Status: Closed
- Priority: Minor
- Resolution: Duplicate
- Affects Version/s: 2.2.0
- Fix Version/s: None
- Component/s: None
- Environment:
  - Linux
  - Scala: 2.11.8
  - Spark: 2.2.0
Description
When running the application below, Janino throws a StackOverflowError:
Exception in thread "main" java.lang.StackOverflowError
at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:370)
at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
at org.codehaus.janino.CodeContext.flowAnalysis(CodeContext.java:541)
Problematic code:
Example.scala
import org.apache.spark.sql._

case class Foo(
  f1: Int = 0, f2: Int = 0, f3: Int = 0, f4: Int = 0,
  f5: Int = 0, f6: Int = 0, f7: Int = 0, f8: Int = 0,
  f9: Int = 0, f10: Int = 0, f11: Int = 0, f12: Int = 0,
  f13: Int = 0, f14: Int = 0, f15: Int = 0, f16: Int = 0,
  f17: Int = 0, f18: Int = 0, f19: Int = 0, f20: Int = 0,
  f21: Int = 0, f22: Int = 0, f23: Int = 0, f24: Int = 0
)

case class Nest[T](a: T, b: T)

object Nest {
  def apply[T](t: T): Nest[T] = new Nest(t, t)
}

object Main {
  def main(args: Array[String]) {
    val spark: SparkSession =
      SparkSession.builder().appName("test").master("local[*]").getOrCreate()
    import spark.implicits._

    val foo = Foo(0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
    Seq.fill(10)(Nest(Nest(foo))).toDS.groupByKey(identity).count.map(s => s).collect
  }
}
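As a sketch of a possible workaround (not part of this report, and not a fix for the underlying Janino issue), stack overflows in Spark's generated code can often be sidestepped by disabling whole-stage code generation, or by enlarging the JVM thread stack via `spark.driver.extraJavaOptions=-Xss4m`:

```scala
import org.apache.spark.sql._

// Hypothetical workaround: fall back to interpreted/iterator execution so the
// deeply nested generated code is never compiled by Janino. Both config keys
// are standard Spark settings; whether they avoid this particular overflow is
// an assumption, not something verified in the report.
val spark: SparkSession = SparkSession.builder()
  .appName("test")
  .master("local[*]")
  .config("spark.sql.codegen.wholeStage", "false")
  .getOrCreate()
```

Since the error occurs while Janino analyzes the generated class, avoiding codegen for the affected query plan is usually enough; the trade-off is slower execution for queries that would otherwise benefit from whole-stage codegen.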
Issue Links
- duplicates
  - SPARK-21720: Filter predicate with many conditions throw stackoverflow error (Resolved)