SPARK-32780

Fill since fields for all the expressions

Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: SQL

    Description

      Some since fields in ExpressionDescription are currently missing; it is worth filling them in to make the function documentation better.
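
      For context, the since value is carried by the ExpressionDescription annotation on each expression class. The sketch below is purely illustrative: the expression, its behaviour and the version string are made up, and it is written against the internal catalyst API, which differs between Spark releases; it only shows where the since entry goes.

        // Hypothetical leaf expression, not part of Spark; the since = "..."
        // line is the field this ticket asks to fill in.
        import org.apache.spark.sql.catalyst.InternalRow
        import org.apache.spark.sql.catalyst.expressions.{ExpressionDescription, LeafExpression}
        import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
        import org.apache.spark.sql.types.{DataType, IntegerType}

        @ExpressionDescription(
          usage = "_FUNC_() - Returns the constant 42.",
          examples = """
            Examples:
              > SELECT _FUNC_();
               42
          """,
          since = "3.1.0")
        case class FortyTwo() extends LeafExpression with CodegenFallback {
          override def nullable: Boolean = false
          override def dataType: DataType = IntegerType
          override def eval(input: InternalRow): Any = 42
        }

      The test below flags every registered expression whose since value is not a valid version string: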

        test("Since has a valid value") {
          val badExpressions = spark.sessionState.functionRegistry.listFunction()
            .map(spark.sessionState.catalog.lookupFunctionInfo)
            .filter(funcInfo => !funcInfo.getSince.matches("[0-9]+\\.[0-9]+\\.[0-9]+"))
            .map(_.getClassName)
            .distinct
            .sorted
      
          if (badExpressions.nonEmpty) {
            fail(s"${badExpressions.length} expressions with invalid 'since':\n"
              + badExpressions.mkString("\n"))
          }
        }
      [info] - Since has a valid value *** FAILED *** (16 milliseconds)
      [info]   67 expressions with invalid 'since':
      [info]   org.apache.spark.sql.catalyst.expressions.Abs
      [info]   org.apache.spark.sql.catalyst.expressions.Add
      [info]   org.apache.spark.sql.catalyst.expressions.And
      [info]   org.apache.spark.sql.catalyst.expressions.ArrayContains
      [info]   org.apache.spark.sql.catalyst.expressions.AssertTrue
      [info]   org.apache.spark.sql.catalyst.expressions.BitwiseAnd
      [info]   org.apache.spark.sql.catalyst.expressions.BitwiseNot
      [info]   org.apache.spark.sql.catalyst.expressions.BitwiseOr
      [info]   org.apache.spark.sql.catalyst.expressions.BitwiseXor
      [info]   org.apache.spark.sql.catalyst.expressions.CallMethodViaReflection
      [info]   org.apache.spark.sql.catalyst.expressions.CaseWhen
      [info]   org.apache.spark.sql.catalyst.expressions.Cast
      [info]   org.apache.spark.sql.catalyst.expressions.Concat
      [info]   org.apache.spark.sql.catalyst.expressions.Crc32
      [info]   org.apache.spark.sql.catalyst.expressions.CreateArray
      [info]   org.apache.spark.sql.catalyst.expressions.CreateMap
      [info]   org.apache.spark.sql.catalyst.expressions.CreateNamedStruct
      [info]   org.apache.spark.sql.catalyst.expressions.CurrentDatabase
      [info]   org.apache.spark.sql.catalyst.expressions.Divide
      [info]   org.apache.spark.sql.catalyst.expressions.EqualNullSafe
      [info]   org.apache.spark.sql.catalyst.expressions.EqualTo
      [info]   org.apache.spark.sql.catalyst.expressions.Explode
      [info]   org.apache.spark.sql.catalyst.expressions.GetJsonObject
      [info]   org.apache.spark.sql.catalyst.expressions.GreaterThan
      [info]   org.apache.spark.sql.catalyst.expressions.GreaterThanOrEqual
      [info]   org.apache.spark.sql.catalyst.expressions.Greatest
      [info]   org.apache.spark.sql.catalyst.expressions.If
      [info]   org.apache.spark.sql.catalyst.expressions.In
      [info]   org.apache.spark.sql.catalyst.expressions.Inline
      [info]   org.apache.spark.sql.catalyst.expressions.InputFileBlockLength
      [info]   org.apache.spark.sql.catalyst.expressions.InputFileBlockStart
      [info]   org.apache.spark.sql.catalyst.expressions.InputFileName
      [info]   org.apache.spark.sql.catalyst.expressions.JsonTuple
      [info]   org.apache.spark.sql.catalyst.expressions.Least
      [info]   org.apache.spark.sql.catalyst.expressions.LessThan
      [info]   org.apache.spark.sql.catalyst.expressions.LessThanOrEqual
      [info]   org.apache.spark.sql.catalyst.expressions.MapKeys
      [info]   org.apache.spark.sql.catalyst.expressions.MapValues
      [info]   org.apache.spark.sql.catalyst.expressions.Md5
      [info]   org.apache.spark.sql.catalyst.expressions.MonotonicallyIncreasingID
      [info]   org.apache.spark.sql.catalyst.expressions.Multiply
      [info]   org.apache.spark.sql.catalyst.expressions.Murmur3Hash
      [info]   org.apache.spark.sql.catalyst.expressions.Not
      [info]   org.apache.spark.sql.catalyst.expressions.Or
      [info]   org.apache.spark.sql.catalyst.expressions.Overlay
      [info]   org.apache.spark.sql.catalyst.expressions.Pmod
      [info]   org.apache.spark.sql.catalyst.expressions.PosExplode
      [info]   org.apache.spark.sql.catalyst.expressions.Remainder
      [info]   org.apache.spark.sql.catalyst.expressions.Sha1
      [info]   org.apache.spark.sql.catalyst.expressions.Sha2
      [info]   org.apache.spark.sql.catalyst.expressions.Size
      [info]   org.apache.spark.sql.catalyst.expressions.SortArray
      [info]   org.apache.spark.sql.catalyst.expressions.SparkPartitionID
      [info]   org.apache.spark.sql.catalyst.expressions.Stack
      [info]   org.apache.spark.sql.catalyst.expressions.Subtract
      [info]   org.apache.spark.sql.catalyst.expressions.TimeWindow
      [info]   org.apache.spark.sql.catalyst.expressions.UnaryMinus
      [info]   org.apache.spark.sql.catalyst.expressions.UnaryPositive
      [info]   org.apache.spark.sql.catalyst.expressions.Uuid
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathBoolean
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathDouble
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathFloat
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathInt
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathList
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathLong
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathShort
      [info]   org.apache.spark.sql.catalyst.expressions.xml.XPathString (ExpressionInfoSuite.scala:204)
      

      This was checked by tanelk in this PR review discussion: https://github.com/apache/spark/pull/29577#discussion_r479794502
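
      For spot-checking an individual entry from the list above, the same catalog APIs the test uses can be called from spark-shell (sessionState is an internal, unstable API). A minimal sketch, assuming an active SparkSession named spark and using abs only as an example function name:

        import org.apache.spark.sql.catalyst.FunctionIdentifier

        // ExpressionInfo of a registered function; an empty or malformed
        // value from getSince is exactly what the test above reports.
        val info = spark.sessionState.catalog.lookupFunctionInfo(FunctionIdentifier("abs"))
        println(s"class=${info.getClassName} since='${info.getSince}'")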

People

    • Assignee: Unassigned
    • Reporter: Takeshi Yamamuro (maropu)
    • Votes: 0
    • Watchers: 3
