Details

    • Type: Sub-task
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 3.4.0
    • Fix Version/s: None
    • Component/s: SQL

    Description

      Add at least one test for the error class CANNOT_CAST_DATATYPE to QueryExecutionErrorsSuite. The test should cover the exception thrown in QueryExecutionErrors:

        def cannotCastFromNullTypeError(to: DataType): Throwable = {
          new SparkException(errorClass = "CANNOT_CAST_DATATYPE",
            messageParameters = Array(NullType.typeName, to.typeName), null)
        }
      

      For example, here is a test for the error class UNSUPPORTED_FEATURE: https://github.com/apache/spark/blob/34e3029a43d2a8241f70f2343be8285cb7f231b9/sql/core/src/test/scala/org/apache/spark/sql/errors/QueryCompilationErrorsSuite.scala#L151-L170

      The test must check the following (a sketch is given after the list):

      1. the entire error message
      2. the sqlState, if it is defined in the error-classes.json file
      3. the error class
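
      For illustration only, here is a rough sketch of what such a test in QueryExecutionErrorsSuite could look like. It is not taken from the ticket: the direct call to the error builder inside intercept is a placeholder for a real query-level reproduction, and the commented-out sqlState and message assertions must use the actual values from error-classes.json:

        import org.apache.spark.SparkException
        import org.apache.spark.sql.errors.QueryExecutionErrors
        import org.apache.spark.sql.types.IntegerType

        test("CANNOT_CAST_DATATYPE: cast from NullType to IntegerType") {
          val e = intercept[SparkException] {
            // Placeholder: a real test should hit this error through query execution,
            // not by throwing the builder's result directly.
            throw QueryExecutionErrors.cannotCastFromNullTypeError(IntegerType)
          }
          // 1. the error class
          assert(e.getErrorClass === "CANNOT_CAST_DATATYPE")
          // 2. the sqlState, if one is declared for CANNOT_CAST_DATATYPE in error-classes.json
          // assert(e.getSqlState === "<sqlState from error-classes.json>")
          // 3. the entire, fully rendered error message
          // assert(e.getMessage === "<full message rendered from the error class template>")
        }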

    Activity

            jjayadeep Jayadeep Jayaraman added a comment - Hi maxgekk - I would like to work on this task. Let me know if that would be okay.

            maxgekk Max Gekk added a comment - jjayadeep Sure, go ahead.
            jjayadeep Jayadeep Jayaraman added a comment - edited

            maxgekk - I tried creating the failure as shown below, but somehow this specific error does not show up. Can you suggest what the issue might be?

            scala> val null_data = Seq(
                 |   (1, ("ABC", null)),
                 |   (2, ("MNO", null)),
                 |   (3, ("PQR", null))
                 |   )
            null_data: Seq[(Int, (String, Null))] = List((1,(ABC,null)), (2,(MNO,null)), (3,(PQR,null)))

            scala> val df = null_data.toDF()
            df: org.apache.spark.sql.DataFrame = [_1: int, _2: struct<_1: string, _2: null>]

            scala> df.printSchema()
            root
             |-- _1: integer (nullable = false)
             |-- _2: struct (nullable = true)
             |    |-- _1: string (nullable = true)
             |    |-- _2: null (nullable = true)

            scala> df.withColumn("_2._2", col("_2._2").cast(IntegerType)).show()
            +---+-----------+-----+
            | _1|         _2|_2._2|
            +---+-----------+-----+
            |  1|{ABC, null}| null|
            |  2|{MNO, null}| null|
            |  3|{PQR, null}| null|
            +---+-----------+-----+

            People

              Assignee: Unassigned
              Reporter: Max Gekk
              Votes: 0
              Watchers: 2
