Hadoop Common / HADOOP-1640

TestDecommission fails on Windows


Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version: 0.14.0
    • Fix Version: 0.14.0
    • Component: None
    • Labels: None

    Description

      In the snippet of the test log below, the exception recurs roughly every 15 milliseconds for 15 minutes, until the test times out:

      [junit] Created file decommission.dat with 2 replicas.
      [junit] Block[0] : xxx xxx
      [junit] Block[1] : xxx xxx
      [junit] Decommissioning node: 127.0.0.1:50013
      [junit] 2007-07-19 19:12:45,059 INFO fs.FSNamesystem (FSNamesystem.java:startDecommission(2572)) - Start Decommissioning node 127.0.0.1:50013
      [junit] Name: 127.0.0.1:50013
      [junit] State : Decommission in progress
      [junit] Total raw bytes: 80030941184 (74.53 GB)
      [junit] Used raw bytes: 33940945746 (31.60 GB)
      [junit] % used: 42.40%
      [junit] Last contact: Thu Jul 19 19:12:44 PDT 2007

      [junit] Waiting for node 127.0.0.1:50013 to change state to DECOMMISSIONED
      [junit] 2007-07-19 19:12:45,199 INFO http.SocketListener (SocketListener.java:stop(212)) - Stopped SocketListener on 0.0.0.0:3147
      [junit] 2007-07-19 19:12:45,199 INFO util.Container (Container.java:stop(156)) - Stopped org.mortbay.jetty.servlet.WebApplicationHandler@1d98a
      [junit] 2007-07-19 19:12:45,293 INFO util.Container (Container.java:stop(156)) - Stopped WebApplicationContext[/,/]
      [junit] 2007-07-19 19:12:45,402 INFO util.Container (Container.java:stop(156)) - Stopped HttpContext[/logs,/logs]
      [junit] 2007-07-19 19:12:45,481 INFO util.Container (Container.java:stop(156)) - Stopped HttpContext[/static,/static]
      [junit] 2007-07-19 19:12:45,481 INFO util.Container (Container.java:stop(156)) - Stopped org.mortbay.jetty.Server@f1916f
      [junit] 2007-07-19 19:12:45,496 INFO dfs.DataNode (DataNode.java:run(692)) - Exiting DataXceiveServer due to java.net.SocketException: socket closed
      [junit] 2007-07-19 19:12:45,496 WARN dfs.DataNode (DataNode.java:offerService(568)) - java.io.IOException: java.lang.InterruptedException
      [junit] at org.apache.hadoop.fs.DF.doDF(DF.java:71)
      [junit] at org.apache.hadoop.fs.DF.getCapacity(DF.java:89)
      [junit] at org.apache.hadoop.dfs.FSDataset$FSVolume.getCapacity(FSDataset.java:292)
      [junit] at org.apache.hadoop.dfs.FSDataset$FSVolumeSet.getCapacity(FSDataset.java:379)
      [junit] at org.apache.hadoop.dfs.FSDataset.getCapacity(FSDataset.java:466)
      [junit] at org.apache.hadoop.dfs.DataNode.offerService(DataNode.java:493)
      [junit] at org.apache.hadoop.dfs.DataNode.run(DataNode.java:1306)
      [junit] at java.lang.Thread.run(Thread.java:595)
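      The repeating WARN above points at a retry pattern with no exit path: `DataNode.offerService()` catches the IOException thrown from the capacity check (`FSDataset.getCapacity()`, which ultimately shells out to `df` via `DF.doDF()`) and retries on the next heartbeat, so a check that fails on every call produces one WARN every few milliseconds until the external test timeout fires. A minimal, hypothetical sketch of that failure mode (this is illustrative only, not Hadoop's actual source; the class name `RetryLoopSketch` and the unconditional failure in `getCapacity()` are assumptions):

```java
import java.io.IOException;

public class RetryLoopSketch {

    // Stand-in for FSDataset.getCapacity(): on the failing Windows machine
    // the underlying DF.doDF() call appears to fail on every invocation,
    // so this sketch simply throws unconditionally.
    static long getCapacity() throws IOException {
        throw new IOException("java.lang.InterruptedException");
    }

    // Stand-in for DataNode.offerService(): catch, log, retry -- with no
    // backoff and no bound on consecutive failures, so the loop never
    // terminates on its own. Returns how many WARNs would be logged.
    static int offerService(int heartbeats) {
        int warns = 0;
        for (int i = 0; i < heartbeats; i++) {
            try {
                getCapacity();
            } catch (IOException e) {
                warns++;  // the real loop logs the stack trace shown above
            }
        }
        return warns;
    }

    public static void main(String[] args) {
        // Every heartbeat fails, so the WARN count equals the heartbeat count.
        System.out.println(offerService(100));
    }
}
```

      Under this reading, the loop makes no progress toward DECOMMISSIONED, which matches the test waiting the full 15 minutes before timing out.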

      Attachments

        1. testDecommission1640.patch (1 kB, Dhruba Borthakur)


          People

            Assignee: dhruba Dhruba Borthakur
            Reporter: nidaley Nigel Daley
            Votes: 0
            Watchers: 0

