Spark / SPARK-3685

Spark's local dir should accept only local paths

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 2.3.0
    • Component/s: Spark Core, YARN
    • Labels:
      None

      Description

      Setting a local dir to "hdfs:/tmp/foo" does not work. Instead of using the HDFS location, Spark creates a local folder literally named "hdfs:" and puts "tmp" inside it. This happens because Utils#getOrCreateLocalRootDirs parses the path with java.io.File instead of Hadoop's file system, so the "hdfs:" scheme is treated as an ordinary path component. We also need to resolve the path appropriately.

      This may not have an urgent use case, but it fails silently and does the least expected thing.
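
      The misparse described above can be reproduced directly: java.io.File has no notion of URI schemes, so "hdfs:" becomes the first directory component, whereas URI-based parsing (as Hadoop's Path does internally) recognizes it as a scheme. A minimal sketch, using only the JDK (the class name is illustrative):

      ```java
      import java.io.File;
      import java.net.URI;

      public class LocalDirParsing {
          public static void main(String[] args) {
              // java.io.File treats the whole string as a plain path:
              // "hdfs:" is just the first directory component, not a scheme.
              File f = new File("hdfs:/tmp/foo");
              System.out.println(f.getPath());   // hdfs:/tmp/foo
              // Calling f.mkdirs() here would create a local "hdfs:" folder,
              // which is the silent-failure behavior reported in this issue.

              // URI parsing, by contrast, recognizes "hdfs" as the scheme.
              URI u = URI.create("hdfs:/tmp/foo");
              System.out.println(u.getScheme()); // hdfs
              System.out.println(u.getPath());   // /tmp/foo
          }
      }
      ```

      The fix direction implied by the description is to parse and resolve configured local dirs with scheme-aware path handling, so that non-local schemes can be detected and rejected rather than silently materialized as local directories.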

              People

               • Assignee: Hyukjin Kwon (hyukjin.kwon)
               • Reporter: Andrew Or (andrewor14)
               • Votes: 0
               • Watchers: 8
