SPARK-45981: Improve Python language test coverage


Details

    Description

      This umbrella Jira aims to improve Apache Spark 4 test coverage across Python language versions for the following components:

      • PySpark
      • Spark Connect Python Client
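
      Several of the per-version sub-tasks linked below temporarily skip a test on a specific interpreter. As a minimal illustrative sketch (not Spark's actual test code; the class and test names are hypothetical), such a skip is typically expressed with `unittest.skipIf` against `sys.version_info`:

      # Minimal sketch, not from the Spark test suite: gate a test on the
      # running Python version while an upstream dependency catches up.
      import sys
      import unittest

      class VersionGatedTests(unittest.TestCase):  # hypothetical test class
          @unittest.skipIf(
              sys.version_info >= (3, 12),
              "Temporarily skipped on Python 3.12",
          )
          def test_runs_below_python_3_12(self):
              self.assertEqual(1 + 1, 2)

          def test_runs_on_every_version(self):
              self.assertTrue(sys.version_info >= (3, 8))

      if __name__ == "__main__":
          unittest.main()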


        Issue Links

          1. Add Python GitHub Action Daily Job (Sub-task, Resolved, Dongjoon Hyun)
          2. Add `Python 3.10` to Infra docker image (Sub-task, Resolved, Dongjoon Hyun)
          3. Add `Python 3.10` to the Daily Python GitHub Action job (Sub-task, Resolved, Dongjoon Hyun)
          4. Add `Python 3.11` to Infra docker image (Sub-task, Resolved, Dongjoon Hyun)
          5. Add `Python 3.11` to the Daily Python GitHub Action job (Sub-task, Resolved, Dongjoon Hyun)
          6. Fix `pyspark.sql.tests.connect.test_connect_basic` in Python 3.11 (Sub-task, Resolved, Dongjoon Hyun)
          7. Upgrade `protobuf` to 4.25.1 to support `Python 3.11` (Sub-task, Resolved, Dongjoon Hyun)
          8. Upgrade `protobuf-java` to 3.25.1 to match protobuf 4.25.1 (Sub-task, Resolved, Dongjoon Hyun)
          9. Fix `pyspark.pandas.tests.computation.test_apply_func` in Python 3.11 (Sub-task, Resolved, Dongjoon Hyun)
          10. Fix `pyspark.pandas.tests.connect.computation.test_parity_apply_func` in Python 3.11 (Sub-task, Resolved, Dongjoon Hyun)
          11. Fix `pyspark.ml.torch.tests.test_distributor` in Python 3.11 (Sub-task, Resolved, Hyukjin Kwon)
          12. Add `Python 3.12` to Infra docker image (Sub-task, Resolved, Dongjoon Hyun)
          13. Add `Python 3.12` to the Daily Python GitHub Action job (Sub-task, Resolved, Hyukjin Kwon)
          14. Upgrade `grpcio*` to 1.59.3 for Python 3.12 (Sub-task, Resolved, Dongjoon Hyun)
          15. Reenable a `releaseSession` test case in SparkConnectServiceE2ESuite (Sub-task, Resolved, Hyukjin Kwon)
          16. Install `six==1.16.0` explicitly for `pandas` in Python 3.12 (Sub-task, Resolved, Dongjoon Hyun)
          17. Remove `unittest` deprecated alias usage for Python 3.12 (Sub-task, Resolved, Dongjoon Hyun; see the sketch after this list)
          18. Install `torch` nightly only at Python 3.12 in Infra docker image (Sub-task, Resolved, Dongjoon Hyun)
          19. Upgrade Cloudpickle to 3.0.0 (Sub-task, Resolved, Hyukjin Kwon)
          20. Upgrade `pytorch` for Python 3.12 (Sub-task, Resolved, Hyukjin Kwon)
          21. Fix the doctest in `pyspark.pandas.frame.DataFrame.to_dict` (Python 3.12) (Sub-task, Resolved, Hyukjin Kwon)
          22. Flaky `pyspark.tests.test_worker.WorkerSegfaultNonDaemonTest.test_python_segfault` with Python 3.12 (Sub-task, Resolved, Hyukjin Kwon)
          23. Reenable `pyspark.tests.test_worker.WorkerSegfaultNonDaemonTest.test_python_segfault` with Python 3.12 (Sub-task, Open, Unassigned)
          24. Install torchvision for Python 3.12 build (Sub-task, Resolved, Hyukjin Kwon)
          25. Fix the doctest in `pyspark.pandas.series.Series.to_dict` (Python 3.12) (Sub-task, Resolved, Hyukjin Kwon)
          26. Fix `pyspark.pandas.mlflow.load_model` test (Python 3.12) (Sub-task, Resolved, Hyukjin Kwon)
          27. Skip `TorchDistributorLocalUnitTests.test_end_to_end_run_locally` with Python 3.12 (Sub-task, Resolved, Hyukjin Kwon)
          28. Skip `CrossValidatorTests.test_crossvalidator_with_fold_col` with Python 3.12 (Sub-task, Resolved, Hyukjin Kwon)
          29. Split scheduled Python build (Sub-task, Resolved, Hyukjin Kwon)
          30. Upgrade `memory-profiler>=0.61.0` for Python 3.12 (Sub-task, Resolved, Ruifeng Zheng)
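
          For sub-task 17 above: Python 3.12 removed the long-deprecated `unittest` assertion aliases (e.g. `assertEquals`, `failUnless`, `assertRegexpMatches`), so any remaining usage fails with `AttributeError`. A minimal sketch of the kind of rename involved (the class and test names are illustrative, not from the Spark suite):

          # Illustrative only: deprecated unittest aliases removed in
          # Python 3.12 and their modern replacements.
          import unittest

          class AliasCleanupExample(unittest.TestCase):  # hypothetical class
              def test_modern_assertions(self):
                  # Before (AttributeError on Python 3.12):
                  #   self.assertEquals("a", "a")
                  #   self.assertRegexpMatches("spark-4.0", r"\d+\.\d+")
                  #   self.failUnless(True)
                  # After:
                  self.assertEqual("a", "a")
                  self.assertRegex("spark-4.0", r"\d+\.\d+")
                  self.assertTrue(True)

          if __name__ == "__main__":
              unittest.main()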


            People

              Assignee: Dongjoon Hyun (dongjoon)
              Reporter: Dongjoon Hyun (dongjoon)
              Votes: 0
              Watchers: 1

              Dates

                Created:
                Updated:
                Resolved: