Hadoop Common / HADOOP-1169

CopyFiles skips src files of s3 urls


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.12.2
    • Fix Version/s: 0.13.0
    • Component/s: util
    • Labels: None

Description

    When given a source file of items to copy, the CopyFiles utility tries each of its supported schemes in turn, looking for matching entries in the passed file. Of the four schemes (file, hdfs, http, and s3), we never make it to the s3 test; we skip out early. Also, even if we did reach the search-for-s3-URLs code, the list of s3 srcPaths would always be empty because of a copy/paste error.
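
    The following is a minimal, hypothetical Java sketch of the two failure modes described above. It is not the actual CopyFiles source; the class, method, and variable names are illustrative only, but the shape of the bug (an early exit that precedes the s3 check, plus a copy/paste slip that fills the wrong list) matches the description.

        import java.util.ArrayList;
        import java.util.List;

        // Hypothetical illustration of the reported behaviour; not CopyFiles itself.
        class CopyFilesBugSketch {
            static void parseSrcFile(List<String> lines) {
                List<String> filePaths = new ArrayList<>();
                List<String> hdfsPaths = new ArrayList<>();
                List<String> httpPaths = new ArrayList<>();
                List<String> s3Paths = new ArrayList<>();

                for (String line : lines) {
                    if (line.startsWith("file://")) { filePaths.add(line); continue; }
                    if (line.startsWith("hdfs://")) { hdfsPaths.add(line); continue; }
                    if (line.startsWith("http://")) { httpPaths.add(line); continue; }
                    // Problem 1 (per the report): the real utility skipped out early,
                    // so its equivalent of the s3 branch below was never reached.
                    if (line.startsWith("s3://")) {
                        // Problem 2 (per the report): a copy/paste error populated the
                        // wrong list, so the s3 list stayed empty even when s3 URLs
                        // were present in the source file.
                        hdfsPaths.add(line); // intended: s3Paths.add(line)
                    }
                }
            }
        }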

Attachments

Activity


People

    Assignee: Unassigned
    Reporter: stack (Michael Stack)
    Votes: 0
    Watchers: 0

Dates

    Created:
    Updated:
    Resolved:
