Hadoop Common / HADOOP-292

hadoop dfs commands should not output superfluous data to stdout


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.3.2
    • Component/s: None
    • Labels: None

    Description

      Running a command such as hadoop dfs -ls /data
      produces output like the following:
      06/06/08 17:42:32 INFO conf.Configuration: parsing jar:file: /hadoop/hadoop-0.4-dev/hadoop-0.4-dev.jar!/hadoop-default.xml
      06/06/08 17:42:32 INFO conf.Configuration: parsing file:hadoop/hadoop-site.xml
      06/06/08 17:42:32 INFO dfs.DistributedFileSystem: No FS indicated, using default:kry1200:8020
      06/06/08 17:42:32 INFO ipc.Client: Client connection to 172.30.111.134:8020: starting
      Found 2 items
      /data/a <dir>
      /data/b <dir>

      The first few lines shouldn't be there.
      It's especially annoying when piping the output of -cat into a file or a post-processing program, but in general the output should be clean.
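The attached fix (stderr-log.patch) routes diagnostic logging to stderr so stdout carries only the command's real output. A minimal sketch of the principle, with illustrative class and method names that are not taken from the actual patch:

```java
// Sketch: keep diagnostics on stderr so stdout stays clean for pipes.
// Names here (StreamSeparation, info) are illustrative, not from Hadoop.
public class StreamSeparation {

    // Diagnostic messages go to stderr; a pipe or redirection of stdout
    // never sees them.
    static void info(String msg) {
        System.err.println("INFO " + msg);
    }

    public static void main(String[] args) {
        info("parsing configuration");        // stderr only
        System.out.println("/data/a <dir>");  // real command output
        System.out.println("/data/b <dir>");
    }
}
```

With this separation, `hadoop dfs -cat file | wc -l` would count only the file's lines, because the INFO chatter no longer shares the stdout stream.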

      Attachments

        1. stderr-log.patch (2 kB, Owen O'Malley)


      People

        Assignee: Owen O'Malley (omalley)
        Reporter: Yoram Arnon (yarnon)
        Votes: 0
        Watchers: 0
