Hadoop Map/Reduce / MAPREDUCE-4018

Including multiple user jar files in the classpath does not work


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.0.0
    • Fix Version/s: None
    • Component/s: distributed-cache
    • Labels: None
    • Environment: submit job from Windows to Linux
    • Tags: libjars DistributedCache

      Description

      I am trying to submit a job that uses some third-party libraries. I tried both -libjars and DistributedCache directly. Neither works unless I merge all the jar files into one big jar file.

      The way I am using -libjars: -libjars <path>/file1.jar,<path>/file2.jar,....
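
      For reference, -libjars only takes effect when the submitting driver parses Hadoop's generic options, typically by running through ToolRunner; a minimal sketch of such a driver (the class and job names are placeholders, not from the report):

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.conf.Configured;
      import org.apache.hadoop.mapreduce.Job;
      import org.apache.hadoop.util.Tool;
      import org.apache.hadoop.util.ToolRunner;

      // Hypothetical driver: ToolRunner/GenericOptionsParser strips -libjars
      // (and the other generic options) from args and applies them to the conf.
      public class MyJobDriver extends Configured implements Tool {
          public int run(String[] args) throws Exception {
              Job job = new Job(getConf(), "libjars-test"); // getConf() already carries the -libjars setup
              job.setJarByClass(MyJobDriver.class);
              // ... set mapper, reducer, input and output paths here ...
              return job.waitForCompletion(true) ? 0 : 1;
          }

          public static void main(String[] args) throws Exception {
              System.exit(ToolRunner.run(new Configuration(), new MyJobDriver(), args));
          }
      }

      Such a driver would then be invoked along the lines of: hadoop jar myjob.jar MyJobDriver -libjars <path>/file1.jar,<path>/file2.jar <job args>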
      The way I am using DistributedCache:
      FileSystem hdfs = FileSystem.get(new URI("hdfs://remotehost:9000"), conf);
      DistributedCache.addArchiveToClassPath(new Path("<path>/file1.jar"), conf, hdfs);
      DistributedCache.addArchiveToClassPath(new Path("<path>/file2.jar"), conf, hdfs);

      The error messages I observed changed depending on how many jars were on the path. It looks like only one of the jars ends up on the classpath for any given submission. So I merged all the jar files into one big jar file and tried again; now it works. It appears there is a bug in adding multiple jar files to the classpath.
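
      For comparison, a minimal sketch of the distributed-cache route for plain jars, using DistributedCache.addFileToClassPath (addArchiveToClassPath is intended for archives that get unpacked on the task nodes). The namenode URI matches the report, but the class name and jar paths are placeholders:

      import java.net.URI;
      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.filecache.DistributedCache;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;

      public class ClasspathSetup {
          // Adds each jar (already copied to HDFS) to the task classpath.
          public static void addLibJars(Configuration conf) throws Exception {
              FileSystem hdfs = FileSystem.get(new URI("hdfs://remotehost:9000"), conf);
              // Qualify the paths against the cluster's HDFS so the submitting
              // (Windows) host's local file system is never consulted.
              Path jar1 = hdfs.makeQualified(new Path("/libs/file1.jar"));
              Path jar2 = hdfs.makeQualified(new Path("/libs/file2.jar"));
              DistributedCache.addFileToClassPath(jar1, conf);
              DistributedCache.addFileToClassPath(jar2, conf);
          }
      }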


      People

      • Assignee: Unassigned
      • Reporter: Weili Shao (wshao518)
      • Votes: 0
      • Watchers: 0
