Hadoop Common / HADOOP-8496

FsShell is broken with s3 filesystems


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Duplicate
    • Affects Version/s: 2.0.0-alpha
    • Fix Version/s: None
    • Component/s: fs/s3
    • Labels: None

    Description

      After setting up an S3 account and configuring the site.xml with the access key/secret key, when I do an ls on a non-empty bucket I get:

      Found 4 items
      -ls: -0s
      Usage: hadoop fs [generic options] -ls [-d] [-h] [-R] [<path> ...]
      

      Note that while it correctly shows the number of items in the root of the bucket, it does not show the contents of the root.

      I've tried -get and -put and they work fine; accessing a folder in the bucket, however, seems to be completely broken.
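
      For comparison, here is a minimal sketch that lists the bucket root through the FileSystem API directly, bypassing FsShell's output formatting. It assumes the s3n:// scheme and the usual s3n credential properties; the bucket name and keys are placeholders, not values from this report. If this prints the four entries, the problem is likely confined to FsShell's display code rather than the filesystem implementation.

      import java.net.URI;

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.fs.FileStatus;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;

      // Sketch only: list the bucket root via the FileSystem API.
      // Bucket name, access key, and secret key are placeholders.
      public class ListS3Root {
        public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();
          // Assumed s3n credential properties (normally set in the site.xml).
          conf.set("fs.s3n.awsAccessKeyId", "ACCESS_KEY");
          conf.set("fs.s3n.awsSecretAccessKey", "SECRET_KEY");

          FileSystem fs = FileSystem.get(URI.create("s3n://example-bucket/"), conf);
          // Print one line per entry in the bucket root: path and length.
          for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath() + "\t" + status.getLen());
          }
        }
      }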


    People

      Assignee: Unassigned
      Reporter: Alejandro Abdelnur (tucu00)
      Votes: 0
      Watchers: 6
