Spark / SPARK-5810

Maven coordinate inclusion failing in PySpark

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.3.0
    • Fix Version/s: 1.3.0
    • Component/s: Deploy, PySpark
    • Labels: None

    Description

      When Maven coordinates are included to download dependencies in PySpark, PySpark raises a GatewayError because it cannot read the port it needs to communicate with the JVM. PySpark relies on STDIN to read the port number, but in the meantime Ivy prints a large volume of log output to the same stream.
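      The failure mode can be sketched as follows. This is a minimal illustration, not Spark's actual launcher code: the child script, function names, and the scan-based workaround are all hypothetical. It shows how naively treating the first line of a child process's output as the port number breaks once Ivy-style log lines are interleaved on the same stream.

```python
import subprocess
import sys

# Child process stands in for the JVM launcher: Ivy-style resolution
# logs are printed first, and the gateway port comes last.
child_script = (
    "print(':: loading settings :: url = jar:file:.../ivysettings.xml');"
    "print(':: resolving dependencies ::');"
    "print(59342)"
)

proc = subprocess.Popen(
    [sys.executable, "-c", child_script],
    stdout=subprocess.PIPE, text=True,
)
lines = proc.stdout.read().splitlines()
proc.wait()

def naive_port(lines):
    """Assume the first line is the port -- breaks when logs come first."""
    try:
        return int(lines[0])
    except ValueError:
        return None  # port could not be read: this is the GatewayError case

def scan_port(lines):
    """Skip non-numeric log lines until a bare port number appears."""
    for line in lines:
        if line.strip().isdigit():
            return int(line.strip())
    return None

print(naive_port(lines))  # None -- Ivy logs swallowed the port
print(scan_port(lines))   # 59342
```

      Scanning past the log lines is only one possible workaround; the fix actually merged for this issue is tracked in the linked tickets below.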


          Activity

            joshrosen Josh Rosen added a comment -

            I think that this should be fixed now that my patch for SPARK-2313 has been merged. brkyvz, do you think we should add a regression test for this bug? Do you have tests for Maven coordinate inclusion?

            brkyvz Burak Yavuz added a comment -

            Makes sense to add a regression test. I'll add it with the documentation PR which I'll submit today. I'll ping you on that one so that you can take a look.

            brkyvz Burak Yavuz added a comment -

            Fixed with SPARK-5811 & SPARK-2313


            People

              joshrosen Josh Rosen
              brkyvz Burak Yavuz
              Josh Rosen
              Votes: 0
              Watchers: 2

              Dates

                Created:
                Updated:
                Resolved: