PHOENIX-6268: NoSuchMethodError when writing from a Spark DataFrame to Phoenix with the phoenix-spark connector

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Incomplete
    • Affects Version/s: 5.0.0
    • Fix Version/s: connectors-6.0.0
    • Component/s: spark-connector
    • Labels: None

    Description

      I opened a Spark shell (including the phoenix-spark JAR via the --jars argument), loaded a DataFrame (df), and wanted to store the DataFrame in a Phoenix server (backed by HBase).
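
      For reference, the session was along these lines; the JAR path and the way df is loaded are hypothetical placeholders, not the reporter's actual values:

      // launched as: spark-shell --jars /path/to/phoenix-spark-<version>.jar  (placeholder path)
      // df can come from any source; its schema must match the target Phoenix table
      val df = spark.read.option("header", "true").csv("/path/to/input.csv")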

      df.write
        .format("org.apache.phoenix.spark")
        .mode(org.apache.spark.sql.SaveMode.Overwrite)
        .options(Map(
          "zkUrl" -> "<zkserver1>:<port>,<zkserver2>:<port>",
          "table" -> "targetTablename"))
        .save()

      The table exists within Phoenix.
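
      As a sanity check, the same data source can also read the table back from the shell, which confirms it is reachable (a sketch reusing the placeholder quorum and table name from above):

      // load the existing Phoenix table through the same connector; with a
      // compatible connector build this returns a DataFrame of the table
      val existing = spark.read
        .format("org.apache.phoenix.spark")
        .options(Map(
          "zkUrl" -> "<zkserver1>:<port>,<zkserver2>:<port>",
          "table" -> "targetTablename"))
        .load()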

      I get the error below in the spark-shell. Can you help or fix this?

      The Spark version is 3.0.1.

      java.lang.NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps(java.lang.Object[])'
      at org.apache.phoenix.spark.DataFrameFunctions.getFieldArray(DataFrameFunctions.scala:76)
      at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:35)
      at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:28)
      at org.apache.phoenix.spark.DefaultSource.createRelation(DefaultSource.scala:47)
      at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
      at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
      at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
      at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
      at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:175)
      at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:213)
      at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:210)
      at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:171)
      at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:122)
      at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:121)
      at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:963)
      at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
      at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
      at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
      at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
      at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
      at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:963)
      at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:415)
      at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:399)
      ... 49 elided
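
      The changed return type of scala.Predef.refArrayOps named in the error is the typical symptom of a Scala binary-version mismatch: Spark 3.0.1 is built against Scala 2.12, while the phoenix-spark module shipped with Phoenix 5.0.0 was compiled against Scala 2.11. A quick way to check from the same shell is to compare the running Scala version with the Scala suffix of the connector JAR:

      // prints the Scala version of the running shell, e.g. "version 2.12.x"
      // on Spark 3.0.1; the phoenix-spark JAR on the classpath must be built
      // for the same Scala binary version to avoid this NoSuchMethodError
      println(scala.util.Properties.versionString)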

People

    • Assignee: Unassigned
    • Reporter: Rico Bergmann (rico.bergmann)
    • Votes: 0
    • Watchers: 2
