ZEPPELIN-4116

SQL UDFs do not work with the new Spark interpreter


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Cannot Reproduce
    • Affects Version/s: 0.8.1
    • Fix Version/s: None
    • Component/s: zeppelin-interpreter
    • Labels: None
    • Environment: Zeppelin 0.8.1, Spark 2.2.2, RHEL 7

    Description

      If I define a Spark UDF such as the following:

      import org.apache.spark.sql.SQLContext
      import org.apache.spark.sql.functions._
      import spark.implicits._
      
      val data = sqlContext.range(0, 5)
      val testudf = udf { (num: Integer) => num * num }
      val udfcol = data.withColumn("id_square", testudf($"id"))
      
      udfcol.show()
      

       

      With the new interpreter, I randomly get an error similar to SPARK-9219. Setting zeppelin.spark.useNew to false appears to consistently work correctly.
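
      For reference, a minimal sketch of that workaround, assuming the property is edited in the Spark interpreter settings (Interpreter page in the Zeppelin UI):

      zeppelin.spark.useNew = false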

    People

    • Assignee: Unassigned
    • Reporter: Mark Bidewell (mbidewel)
    • Votes: 0
    • Watchers: 3
