Spark / SPARK-32837

Leverage pip and setup.py configurations to pass Hadoop and Hive options in pip installation


Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: Build, PySpark
    • Labels: None

    Description

      Currently, pip allows custom options to be set via --install-option. However, when you pass this option to pip, pip forwards it to the installation of every dependency as well. See also https://github.com/pypa/pip/issues/1883

      This makes it hacky, if not impossible, to target the option at one specific package. The pip maintainers are discussing a more general mechanism.

      Once pip provides a way to do this, we should remove the environment variables from setup.py and switch to proper pip options.
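      The environment-variable workaround mentioned above can be sketched as follows. This is a minimal illustration, not the actual Spark setup.py: the variable names (PYSPARK_HADOOP_VERSION, PYSPARK_HIVE_VERSION), defaults, and helper function are assumptions for illustration only.

```python
import os

# Illustrative defaults; the real setup.py may use different values.
DEFAULT_HADOOP = "hadoop3.2"
DEFAULT_HIVE = "hive2.3"


def resolve_build_profile(environ=os.environ):
    """Pick the Hadoop/Hive profile to bundle at pip-install time.

    Because pip's --install-option leaks to every dependency, a
    setup.py can instead read environment variables, which only
    affect the package whose setup.py consults them.
    """
    hadoop = environ.get("PYSPARK_HADOOP_VERSION", DEFAULT_HADOOP)
    hive = environ.get("PYSPARK_HIVE_VERSION", DEFAULT_HIVE)
    return hadoop, hive


# Hypothetical usage from the shell:
#   PYSPARK_HADOOP_VERSION=hadoop2.7 pip install pyspark
```

      The trade-off is that environment variables are invisible to pip's dependency resolver and caching, which is why the issue proposes migrating to real pip options once a per-package mechanism exists.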

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: gurwls223 (Hyukjin Kwon)
            Votes: 0
            Watchers: 3
