SPARK-47311: Suppress Python exceptions where PySpark is not in the Python path


Details

    Description

      scala> sql("create table t(i int) using json")
      24/03/06 16:09:44 WARN DataSourceManager: Skipping the lookup of Python Data Sources due to the failure.
      org.apache.spark.SparkException:
      Error from python worker:
        /opt/homebrew/Caskroom/miniconda/base/bin/python3: Error while finding module specification for 'pyspark.daemon' (ModuleNotFoundError: No module named 'pyspark')
      

      When PySpark is not on the Python path at all, this warning is logged once for every Spark session initialization.
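The warning is emitted because the Python Data Source lookup launches a Python worker even when `pyspark` cannot be imported at all. A minimal sketch of the kind of guard this issue suggests, in plain Python (the helper name is hypothetical, not Spark's actual API): check whether the module is importable before attempting the lookup, so the failure is expected rather than warned about.

```python
import importlib.util


def can_look_up_python_data_sources(module_name: str = "pyspark") -> bool:
    """Return True only if the given module can be found on the Python path.

    find_spec() locates the module without importing it, so this check is
    cheap and has no side effects; when it returns False, the caller can
    silently skip the Python Data Source lookup instead of logging a
    warning about a failed worker launch.
    """
    return importlib.util.find_spec(module_name) is not None
```

`importlib.util.find_spec` only resolves the module spec; it does not execute the module, which makes it safe to call during session initialization.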


            People

              Assignee: Hyukjin Kwon (gurwls223)
              Reporter: Hyukjin Kwon (gurwls223)
              Votes: 0
              Watchers: 1
