  Spark / SPARK-26675

Error when creating Avro files


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: PySpark, SQL

    Description

      Run the command:

      spark-submit --packages org.apache.spark:spark-avro_2.11:2.4.0 /nke/reformat.py
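
      (The same dependency can also be requested from inside the script via the
      spark.jars.packages configuration. The sketch below is only an alternative
      and only takes effect when the Python process itself starts the JVM, e.g.
      `python reformat.py`; under spark-submit, the --packages flag above is the
      right place.)

      from pyspark.sql import SparkSession

      # Sketch only: ask the builder to resolve spark-avro from Maven. Effective
      # only if this process launches the JVM; with spark-submit, keep passing
      # --packages on the command line instead.
      spark = (SparkSession.builder
               .config("spark.jars.packages", "org.apache.spark:spark-avro_2.11:2.4.0")
               .getOrCreate())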
      

      Code in reformat.py:

      from pyspark.sql import SparkSession

      # Session creation is not shown in the original snippet; added so the script
      # runs standalone under spark-submit.
      spark = SparkSession.builder.getOrCreate()

      # Read the multiline JSON document and register it for Spark SQL.
      df = spark.read.option("multiline", "true").json("file:///nke/example1.json")
      df.createOrReplaceTempView("traffic")

      # Flatten the nested stores -> trafficSensors -> trafficSensorReadings arrays,
      # keeping renamed copies of the columns that would otherwise collide.
      a = spark.sql("""SELECT store.*, store.name AS store_name, store.dataSupplierId AS store_dataSupplierId,
          trafficSensor.*, trafficSensor.name AS trafficSensor_name,
          trafficSensor.dataSupplierId AS trafficSensor_dataSupplierId, readings.*
          FROM (SELECT explode(stores) AS store, explode(store.trafficSensors) AS trafficSensor,
                explode(trafficSensor.trafficSensorReadings) AS readings FROM traffic)""")

      # Drop the nested/duplicated columns, then write out as Avro.
      b = a.drop("trafficSensors", "trafficSensorReadings", "name", "dataSupplierId")
      b.write.format("avro").save("file:///nke/curated/namesAndFavColors.avro")
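
      To separate the input data and query from the Avro writer itself, a minimal
      probe write can be run in the same session (sketch; the output path below is
      arbitrary). A nullable column makes the Avro converter build a union schema,
      which is the call that fails in the trace below.

      from pyspark.sql.types import StructType, StructField, StringType

      # Nullable string column -> spark-avro wraps its type in a union with "null",
      # exercising Schema.createUnion just like the real dataset does.
      probe = spark.createDataFrame(
          [("a",), (None,)],
          StructType([StructField("s", StringType(), True)]))
      probe.write.format("avro").mode("overwrite").save("file:///tmp/avro_probe")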
      

      Error message:

      Traceback (most recent call last):
      File "/nke/reformat.py", line 18, in <module>
      b.select("store_name", "store_dataSupplierId").write.format("avro").save("file:///nke/curated/namesAndFavColors.avro")
      File "/usr/spark-2.4.0/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 736, in save
      File "/usr/spark-2.4.0/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
      File "/usr/spark-2.4.0/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
      File "/usr/spark-2.4.0/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
      py4j.protocol.Py4JJavaError: An error occurred while calling o45.save.
      : java.lang.NoSuchMethodError: org.apache.avro.Schema.createUnion([Lorg/apache/avro/Schema;)Lorg/apache/avro/Schema;
      at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:185)
      at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:176)
      at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:174)
      at scala.collection.Iterator$class.foreach(Iterator.scala:891)
      at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
      at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
      at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
      at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:174)
      at org.apache.spark.sql.avro.AvroFileFormat$$anonfun$7.apply(AvroFileFormat.scala:118)
      at org.apache.spark.sql.avro.AvroFileFormat$$anonfun$7.apply(AvroFileFormat.scala:118)
      at scala.Option.getOrElse(Option.scala:121)
      at org.apache.spark.sql.avro.AvroFileFormat.prepareWrite(AvroFileFormat.scala:118)
      at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:103)
      at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:159)
      at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
      at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
      at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
      at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
      at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
      at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
      at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
      at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
      at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
      at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
      at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
      at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
      at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
      at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
      at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
      at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
      at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
      at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
      at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:228)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:282)
      at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
      at py4j.commands.CallCommand.execute(CallCommand.java:79)
      at py4j.GatewayConnection.run(GatewayConnection.java:238)
      at java.lang.Thread.run(Thread.java:748)
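
      The NoSuchMethodError above generally means org.apache.avro.Schema was loaded
      from an older jar than the one spark-avro 2.4.0 was built against: the varargs
      Schema.createUnion(Schema...) overload named in the trace exists in the Avro
      1.8.x that Spark 2.4 bundles, but not in the 1.7.x line commonly shipped with
      Hadoop distributions. One way to check which avro jar the driver actually
      loaded (a sketch, using the py4j gateway PySpark exposes as spark._jvm):

      # Ask the running driver JVM where org.apache.avro.Schema came from.
      jvm = spark._jvm
      schema_cls = jvm.java.lang.Class.forName("org.apache.avro.Schema")
      source = schema_cls.getProtectionDomain().getCodeSource()
      print(source.getLocation() if source is not None else "bootstrap/system classpath")
      # If this points at an avro-1.7.x jar, that older jar is shadowing the
      # Avro 1.8.x that spark-avro expects.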

       


          People

            Assignee: Unassigned
            Reporter: Tony Mao (tony0918)
            Votes: 3
            Watchers: 7
