Livy / LIVY-600

Default UDF not found when Spark job is deployed through Livy


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.6.0
    • Fix Version/s: 0.9.0
    • Component/s: API, Core
    • Labels: None
    • Environment: CentOS 7, Spark 2.3.1
    • Flags: Important

    Description

      The first invocation after deploying the job through Livy works fine.

      However, on subsequent invocations it is unable to find the default UDF written inside the Spark SQL.

       

      // Build the Livy client pointing at the Livy server
      implicit val client = new LivyClientBuilder()
        .setURI(new URI("http://ABC:8998"))
        .setConf("spark.sql.crossJoin.enabled", "true")
        .setConf("spark.cassandra.connection.host", ConfigUtils.getCassConnectionHost)
        .setConf("spark.cassandra.connection.port", "9042")
        .setConf("spark.cassandra.auth.username", authDetails._1)
        .setConf("spark.cassandra.auth.password", authDetails._2)
        .setConf("spark.cassandra.output.consistency.level", "LOCAL_ONE")
        .setConf("spark.cassandra.output.batch.size.rows", "auto")
        .setConf("spark.cassandra.output.concurrent.writes", "500")
        .setConf("spark.cassandra.output.batch.size.bytes", "100000")
        .setConf("spark.cassandra.output.throughput_mb_per_sec", "1")
        .setConf("spark.executor.memory", "4G")
        .setConf("spark.sql.crossJoin.enabled", "true")
        .setConf("spark.app.name", "livy_poc")
        .build()

      client.addJar(new URI("""hdfs://ABC:8020/user/livy/myJar.jar""")).get()
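
      For reference, the invocation pattern looks roughly like the sketch below. This is a minimal reconstruction, not the original code: the job class (UdfQueryJob), the UDF name (normalize), the table, and the query are hypothetical; only the general pattern (register a UDF inside a Livy Job, call it from Spark SQL, and submit through the client built above) follows the description.

      import org.apache.livy.{Job, JobContext}

      // Hypothetical job: registers a UDF and calls it from Spark SQL.
      // The class is assumed to be shipped in myJar.jar, added via addJar above.
      class UdfQueryJob extends Job[Long] {
        override def call(jc: JobContext): Long = {
          val sqlCtx = jc.sqlctx()
          // Register the UDF so the query below can resolve it
          sqlCtx.udf.register("normalize", (s: String) => s.trim.toLowerCase)
          sqlCtx.sql("SELECT normalize(name) FROM my_table").count()
        }
      }

      // Submit through the client built above and wait for the result;
      // the first submission resolves the UDF, later ones hit the error below.
      val rows = client.submit(new UdfQueryJob()).get()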

       

      Stack Trace

      or a permanent function registered in the database 'default'.; line 9 pos 33
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15$$anonfun$applyOrElse$50.apply(Analyzer.scala:1200)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15$$anonfun$applyOrElse$50.apply(Analyzer.scala:1200)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:53)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15.applyOrElse(Analyzer.scala:1199)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15.applyOrElse(Analyzer.scala:1197)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:266)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:272)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:272)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:272)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:272)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:272)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:272)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsDown$1.apply(QueryPlan.scala:85)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsDown$1.apply(QueryPlan.scala:85)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1.apply(QueryPlan.scala:107)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1.apply(QueryPlan.scala:107)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:106)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:118)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1$1.apply(QueryPlan.scala:122)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.immutable.List.foreach(List.scala:381)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.immutable.List.map(List.scala:285)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:122)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$2.apply(QueryPlan.scala:127)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:127)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsDown(QueryPlan.scala:85)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressions(QueryPlan.scala:76)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformAllExpressions$1.applyOrElse(QueryPlan.scala:138)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformAllExpressions$1.applyOrElse(QueryPlan.scala:137)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:266)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:256)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.plans.QueryPlan.transformAllExpressions(QueryPlan.scala:137)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$.apply(Analyzer.scala:1197)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$.apply(Analyzer.scala:1196)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:87)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:84)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:57)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:66)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.mutable.WrappedArray.foldLeft(WrappedArray.scala:35)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:84)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:76)
      19/06/12 15:26:26 INFO LineBufferedStream: at scala.collection.immutable.List.foreach(List.scala:381)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:76)
      19/06/12 15:26:26 INFO LineBufferedStream: at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:124)

       


People

    • Assignee: Unassigned
    • Reporter: Sumit Chauhan (sumitchauhan)
    • Votes: 0
    • Watchers: 3