  Spark / SPARK-12624

When a schema is specified, we should give a better error message if the actual row length doesn't match


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.6.1, 2.0.0
    • Component/s: PySpark, SQL
    • Labels: None

    Description

      The following code snippet reproduces this issue:

      from pyspark.sql.types import StructType, StructField, IntegerType, StringType
      from pyspark.sql.types import Row

      # The schema declares two fields, but each Row below carries only one (a).
      schema = StructType([StructField("a", IntegerType()), StructField("b", StringType())])
      rdd = sc.parallelize(range(10)).map(lambda x: Row(a=x))
      df = sqlContext.createDataFrame(rdd, schema)
      df.show()
      

      An unintuitive ArrayIndexOutOfBoundsException is thrown in this case:

      ...
      Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
              at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.genericGet(rows.scala:227)
              at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getAs(rows.scala:35)
              at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.isNullAt(rows.scala:36)
      ...
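
      For comparison, when each row carries both declared fields the same code runs fine, which confirms the failure comes purely from the arity mismatch. A minimal sketch, assuming the same shell-provided sc and sqlContext as above:

      from pyspark.sql.types import StructType, StructField, IntegerType, StringType
      from pyspark.sql.types import Row

      schema = StructType([StructField("a", IntegerType()), StructField("b", StringType())])
      # Each Row now supplies both fields, matching the schema's length.
      rdd = sc.parallelize(range(10)).map(lambda x: Row(a=x, b=str(x)))
      df = sqlContext.createDataFrame(rdd, schema)
      df.show()  # prints ten (a, b) rows instead of raising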
      

      We should give a better error message here.
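
      One way to surface the problem eagerly is to compare each row's length against the schema before handing the RDD to createDataFrame. A minimal caller-side sketch; the helper name validate_row_lengths is hypothetical, not part of the Spark API:

      def validate_row_lengths(rdd, schema):
          """Hypothetical helper: raise a descriptive error instead of letting
          the mismatch surface as an ArrayIndexOutOfBoundsException inside
          Catalyst."""
          expected = len(schema.fields)

          def check(row):
              # Row is a tuple subclass, so len() gives the number of fields.
              if len(row) != expected:
                  raise ValueError(
                      "Row length %d does not match schema length %d: %r"
                      % (len(row), expected, row))
              return row

          return rdd.map(check)

      # Usage:
      # df = sqlContext.createDataFrame(validate_row_lengths(rdd, schema), schema)

      Wrapping the RDD like this costs an extra closure call per row; validating inside the conversion path itself would avoid that overhead, but the sketch shows the shape of error message the issue asks for.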


            People

              Assignee: Cheng Lian
              Reporter: Reynold Xin
              Votes: 0
              Watchers: 8
