Hadoop Common / HADOOP-292

hadoop dfs commands should not output superfluous data to stdout


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.3.2
    • Component/s: None
    • Labels: None

      Description

      Running a command such as hadoop dfs -ls /data
      produces output such as the following:
      06/06/08 17:42:32 INFO conf.Configuration: parsing jar:file: /hadoop/hadoop-0.4-dev/hadoop-0.4-dev.jar!/hadoop-default.xml
      06/06/08 17:42:32 INFO conf.Configuration: parsing file:hadoop/hadoop-site.xml
      06/06/08 17:42:32 INFO dfs.DistributedFileSystem: No FS indicated, using default:kry1200:8020
      06/06/08 17:42:32 INFO ipc.Client: Client connection to 172.30.111.134:8020: starting
      Found 2 items
      /data/a <dir>
      /data/b <dir>

      The first few lines shouldn't be there.
      It's especially annoying when piping -cat into a file or into a post-processing program, but in general the output should be clean.
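
      The usual remedy is to keep diagnostic logging off stdout entirely so that only command output remains pipeable. A minimal sketch of a log4j 1.x configuration (the logging framework Hadoop used at the time) that routes console logging to stderr; the appender name "console" and the conversion pattern here are illustrative choices, not taken from this issue:

      ```properties
      # conf/log4j.properties (sketch): send all log messages to stderr,
      # leaving stdout clean for the actual output of dfs commands.
      log4j.rootLogger=INFO,console

      log4j.appender.console=org.apache.log4j.ConsoleAppender
      # ConsoleAppender writes to System.out by default; target it at stderr.
      log4j.appender.console.target=System.err
      log4j.appender.console.layout=org.apache.log4j.PatternLayout
      log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
      ```

      With logging on stderr, a command like hadoop dfs -cat /data/a | sort sees only file contents on the pipe, and the INFO lines still appear on the terminal.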

        Attachments

          Activity


            People

            • Assignee:
              omalley Owen O'Malley
            • Reporter:
              yarnon Yoram Arnon

              Dates

              • Created:
              • Updated:
              • Resolved:
