Hadoop Map/Reduce / MAPREDUCE-4018

Including multiple user jar files in the classpath is not working


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.0.0
    • Fix Version/s: None
    • Component/s: distributed-cache
    • Labels: None
    • Environment: submit job from Windows to Linux
    • Tags: libjars, DistributedCache

    Description

      I am trying to submit a job that uses some third-party libraries. I tried both -libjars and DistributedCache directly; neither works unless I merge all the jar files into one big jar file.

      The way I am using -libjars: -libjars <path>/file1.jar,<path>/file2.jar,... (a driver sketch for this approach follows at the end of this description).
      The way I am using DistributedCache:
      FileSystem hdfs = FileSystem.get(new URI("hdfs://remotehost:9000"), conf);
      DistributedCache.addArchiveToClassPath(new Path("<path>/file1.jar"), conf, hdfs);
      DistributedCache.addArchiveToClassPath(new Path("<path>/file2.jar"), conf, hdfs);

      The error messages I observed change depending on how many jars are listed, and it looks like only one of the jars ends up on the classpath for any given submission. So I decided to merge all the jar files into one big jar file and try again; now it works. It looks like there is a bug in including multiple jar files in the classpath.
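
      For reference, below is a minimal driver sketch for the -libjars approach described above. It is not the reporter's job: the class name, the identity Mapper/Reducer placeholders, and the input/output arguments are assumptions. The point it illustrates is that -libjars is only applied when the driver runs through ToolRunner, which passes the command line through GenericOptionsParser before run() is called.

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.conf.Configured;
      import org.apache.hadoop.fs.Path;
      import org.apache.hadoop.io.LongWritable;
      import org.apache.hadoop.io.Text;
      import org.apache.hadoop.mapreduce.Job;
      import org.apache.hadoop.mapreduce.Mapper;
      import org.apache.hadoop.mapreduce.Reducer;
      import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
      import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
      import org.apache.hadoop.util.Tool;
      import org.apache.hadoop.util.ToolRunner;

      public class LibJarsDriver extends Configured implements Tool {
          @Override
          public int run(String[] args) throws Exception {
              // getConf() already reflects -libjars because ToolRunner ran
              // GenericOptionsParser over the arguments before calling run().
              Job job = new Job(getConf(), "libjars-demo");
              job.setJarByClass(LibJarsDriver.class);
              job.setMapperClass(Mapper.class);    // identity mapper placeholder
              job.setReducerClass(Reducer.class);  // identity reducer placeholder
              job.setOutputKeyClass(LongWritable.class);
              job.setOutputValueClass(Text.class);
              FileInputFormat.addInputPath(job, new Path(args[0]));
              FileOutputFormat.setOutputPath(job, new Path(args[1]));
              return job.waitForCompletion(true) ? 0 : 1;
          }

          public static void main(String[] args) throws Exception {
              // hadoop jar myjob.jar LibJarsDriver -libjars <path>/file1.jar,<path>/file2.jar <in> <out>
              System.exit(ToolRunner.run(new Configuration(), new LibJarsDriver(), args));
          }
      }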

      Attachments

        Activity


          People

            Assignee: Unassigned
            Reporter: Weili Shao (wshao518)

            Dates

              Created:
              Updated:
