SPARK-3778: newAPIHadoopRDD doesn't properly pass credentials for secure hdfs on yarn


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 1.3.0
    • Component/s: Spark Core
    • Labels: None
    • Target Version/s:

      Description

      The newAPIHadoopRDD routine doesn't properly add the credentials to the conf that are needed to access secure HDFS.

      Note that newAPIHadoopFile does handle credentials correctly, because the org.apache.hadoop.mapreduce.Job it constructs automatically adds them for you.
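
      As an illustration only (not the committed fix): below is a minimal, untested sketch of a caller-side workaround for an affected version, assuming the hypothetical app name and input path shown. Following the reasoning above, it routes the Configuration through an org.apache.hadoop.mapreduce.Job, whose constructor merges the current user's credentials into the job conf, and then passes that conf to newAPIHadoopRDD.

{code:scala}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.mapreduce.lib.input.{FileInputFormat, TextInputFormat}
import org.apache.spark.{SparkConf, SparkContext}

object SecureHdfsReadWorkaround {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("secure-hdfs-read"))

    // Wrap the Hadoop conf in a mapreduce Job; the Job constructor merges the
    // current user's credentials (e.g. HDFS delegation tokens) into the job
    // conf, which is what newAPIHadoopFile relies on.
    val job = Job.getInstance(new Configuration())
    // Hypothetical input path, for illustration only.
    FileInputFormat.addInputPath(job, new Path("hdfs://namenode:8020/data/input"))

    // Hand the credential-carrying configuration to newAPIHadoopRDD instead
    // of a raw Configuration.
    val rdd = sc.newAPIHadoopRDD(
      job.getConfiguration,
      classOf[TextInputFormat],
      classOf[LongWritable],
      classOf[Text])

    println(s"record count: ${rdd.count()}")
    sc.stop()
  }
}
{code}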


    People

    • Assignee: Thomas Graves (tgraves)
    • Reporter: Thomas Graves (tgraves)
    • Votes: 1
    • Watchers: 5
