SPARK-41770: eqNullSafe does not support None as its argument

Parent: SPARK-41282 Feature parity: Column API in Spark Connect


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.4.0
    • Fix Version/s: 3.4.0
    • Component/s: Connect
    • Labels: None

    Description

      Calling Column.eqNullSafe(None) on a Spark Connect Column raises an
      AttributeError: the plain-Python argument is passed straight through to
      scalar_function without first being converted to an expression, so the
      listcomp's arg._expr access fails on None:

      **********************************************************************
      File "/.../spark/python/pyspark/sql/connect/column.py", line 90, in pyspark.sql.connect.column.Column.eqNullSafe
      Failed example:
          df1.select(
              df1['value'] == 'foo',
              df1['value'].eqNullSafe('foo'),
              df1['value'].eqNullSafe(None)
          ).show()
      Exception raised:
          Traceback (most recent call last):
            File "/.../miniconda3/envs/python3.9/lib/python3.9/doctest.py", line 1336, in __run
              exec(compile(example.source, filename, "single",
            File "<doctest pyspark.sql.connect.column.Column.eqNullSafe[2]>", line 4, in <module>
              df1['value'].eqNullSafe(None)
            File "/.../workspace/forked/spark/python/pyspark/sql/connect/column.py", line 78, in wrapped
              return scalar_function(name, self, other)
            File "/.../workspace/forked/spark/python/pyspark/sql/connect/column.py", line 95, in scalar_function
              return Column(UnresolvedFunction(op, [arg._expr for arg in args]))
            File "/.../workspace/forked/spark/python/pyspark/sql/connect/column.py", line 95, in <listcomp>
              return Column(UnresolvedFunction(op, [arg._expr for arg in args]))
          AttributeError: 'NoneType' object has no attribute '_expr'
      
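The traceback shows the root cause: `scalar_function` assumes every argument is a `Column` carrying an `_expr` attribute. A common fix pattern is to lift plain Python values (including `None`) into literal expressions before that access. The sketch below is a minimal, self-contained illustration of that pattern; the class names (`Literal`, `ColumnRef`) and the shape of `scalar_function` are simplified stand-ins, not Spark Connect's actual internals.

```python
class Expression:
    """Base class for expression nodes."""

class Literal(Expression):
    """Wraps a plain Python value, including None (SQL NULL)."""
    def __init__(self, value):
        self.value = value

class ColumnRef(Expression):
    """References a named column."""
    def __init__(self, name):
        self.name = name

class Column:
    def __init__(self, expr):
        self._expr = expr

def to_expr(arg):
    # Lift non-Column arguments into Literal expressions so that every
    # argument exposes an expression node; None becomes Literal(None).
    return arg._expr if isinstance(arg, Column) else Literal(arg)

def scalar_function(name, *args):
    # The buggy version did `[arg._expr for arg in args]`, which raises
    # AttributeError when an arg is None. Converting first avoids that.
    return Column((name, [to_expr(arg) for arg in args]))

col = Column(ColumnRef("value"))
result = scalar_function("eqNullSafe", col, None)  # no AttributeError
```

With this conversion in place, `df1['value'].eqNullSafe(None)` builds a function expression over a column reference and a null literal, matching the behavior of the classic (non-Connect) PySpark `Column.eqNullSafe`.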


          People

            Assignee: podongfeng (Ruifeng Zheng)
            Reporter: gurwls223 (Hyukjin Kwon)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: