Description
The doctest for pyspark.sql.connect.column.Column.bitwiseAND fails with the error below:
File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/column.py", line 86, in pyspark.sql.connect.column.Column.bitwiseAND Failed example: df.select(df.a.bitwiseAND(df.b)).collect() Exception raised: Traceback (most recent call last): File "/usr/local/Cellar/python@3.10/3.10.8/Frameworks/Python.framework/Versions/3.10/lib/python3.10/doctest.py", line 1350, in __run exec(compile(example.source, filename, "single", File "<doctest pyspark.sql.connect.column.Column.bitwiseAND[2]>", line 1, in <module> df.select(df.a.bitwiseAND(df.b)).collect() File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 896, in collect pdf = self.toPandas() File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 910, in toPandas return self._session.client._to_pandas(query) File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 337, in _to_pandas return self._execute_and_fetch(req) File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 431, in _execute_and_fetch for b in self._stub.ExecutePlan(req, metadata=self._builder.metadata()): File "/usr/local/lib/python3.10/site-packages/grpc/_channel.py", line 426, in __next__ return self._next() File "/usr/local/lib/python3.10/site-packages/grpc/_channel.py", line 826, in _next raise self grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with: status = StatusCode.UNKNOWN details = "[UNRESOLVED_ROUTINE] Cannot resolve function `bitwiseAND` on search path [`system`.`builtin`, `system`.`session`, `spark_catalog`.`default`]." debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:15002 {grpc_message:"[UNRESOLVED_ROUTINE] Cannot resolve function `bitwiseAND` on search path [`system`.`builtin`, `system`.`session`, `spark_catalog`.`default`].", grpc_status:2, created_time:"2022-12-28T05:16:01.360735-08:00"}" >
We should re-enable this doctest once the underlying issue is fixed in Spark Connect.