Hadoop HDFS / HDFS-4582

TestHostsFiles fails on Windows


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0-alpha1
    • Fix Version/s: 2.1.0-beta
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      Test failure details:

      java.lang.AssertionError: dfshealth should contain localhost, got:<!DOCTYPE html>
      <html>
      <head>
      <link rel="stylesheet" type="text/css" href="/static/hadoop.css">
      <title>Hadoop NameNode 127.0.0.1:53373</title>
      </head>
      <body>
      <h1>NameNode '127.0.0.1:53373' (active)</h1>
      <div class='dfstable'><table>
      <tr><td class='col1'>Started:</td><td>Sun Mar 10 22:45:28 PDT 2013</td></tr>

      <tr><td class='col1'>Version:</td><td>3.0.0-SNAPSHOT, 6094bfab6459e20eba44304ffc7e65c6416dfe18</td></tr>

      <tr><td class='col1'>Compiled:</td><td>2013-03-11T05:42Z by ivanmi from trunk</td></tr>
      <tr><td class='col1'>Cluster ID:</td><td>testClusterID</td></tr>
      <tr><td class='col1'>Block Pool ID:</td><td>BP-549950874-10.120.2.171-1362980728518</td></tr>
      </table></div><br />
      <b><a href="/nn_browsedfscontent.jsp">Browse the filesystem</a></b><br>
      <b><a href="/logs/">NameNode Logs</a></b>

      <hr>
      <h3>Cluster Summary</h3>
      <b> <div class="security">Security is <em>OFF</em></div></b>
      <b> </b>
      <b> <div>2 files and directories, 1 blocks = 3 total filesystem objects.</div><div>Heap Memory used 90.61 MB is 49% of Commited Heap Memory 183.81 MB. Max Heap Memory is 2.66 GB. </div><div>Non Heap Memory used 36.67 MB is 96% of Commited Non Heap Memory 37.81 MB. Max Non Heap Memory is 130 MB.</div></b>
      <div class="dfstable"> <table>
      <tr class="rowAlt"> <td id="col1"> Configured Capacity<td id="col2"> :<td id="col3"> 670.73 GB<tr class="rowNormal"> <td id="col1"> DFS Used<td id="col2"> :<td id="col3"> 1.09 KB<tr class="rowAlt"> <td id="col1"> Non DFS Used<td id="col2"> :<td id="col3"> 513.45 GB<tr class="rowNormal"> <td id="col1"> DFS Remaining<td id="col2"> :<td id="col3"> 157.28 GB<tr class="rowAlt"> <td id="col1"> DFS Used%<td id="col2"> :<td id="col3"> 0.00%<tr class="rowNormal"> <td id="col1"> DFS Remaining%<td id="col2"> :<td id="col3"> 23.45%<tr class="rowAlt"> <td id="col1"> Block Pool Used<td id="col2"> :<td id="col3"> 1.09 KB<tr class="rowNormal"> <td id="col1"> Block Pool Used%<td id="col2"> :<td id="col3"> 0.00%<tr class="rowAlt"> <td id="col1"> DataNodes usages<td id="col2"> :<td id="col3"> Min %<td id="col4"> Median %<td id="col5"> Max %<td id="col6"> stdev %<tr class="rowNormal"> <td id="col1"> <td id="col2"> <td id="col3"> 0.00%<td id="col4"> 0.00%<td id="col5"> 0.00%<td id="col6"> 0.00%<tr class="rowAlt"> <td id="col1"> <a href="dfsnodelist.jsp?whatNodes=LIVE">Live Nodes</a> <td id="col2"> :<td id="col3"> 4 (Decommissioned: 1)<tr class="rowNormal"> <td id="col1"> <a href="dfsnodelist.jsp?whatNodes=DEAD">Dead Nodes</a> <td id="col2"> :<td id="col3"> 0 (Decommissioned: 0)<tr class="rowAlt"> <td id="col1"> <a href="dfsnodelist.jsp?whatNodes=DECOMMISSIONING">Decommissioning Nodes</a> <td id="col2"> :<td id="col3"> 0<tr class="rowNormal"> <td id="col1" title="Excludes missing blocks."> Number of Under-Replicated Blocks<td id="col2"> :<td id="col3"> 0</table></div><br>
      <h3> NameNode Journal Status: </h3>
      <b>Current transaction ID:</b> 6<br/>
      <div class="dfstable">
      <table class="storage" title="NameNode Journals">
      <thead><tr><td><b>Journal Manager</b></td><td><b>State</b></td></tr></thead>
      <tr><td>FileJournalManager(root=I:\svn\tr\hadoop-common-project\hadoop-common\target\test\data\dfs\name1)</td><td>EditLogFileOutputStream(I:\svn\tr\hadoop-common-project\hadoop-common\target\test\data\dfs\name1\current\edits_inprogress_0000000000000000001)
      </td></tr>
      <tr><td>FileJournalManager(root=I:\svn\tr\hadoop-common-project\hadoop-common\target\test\data\dfs\name2)</td><td>EditLogFileOutputStream(I:\svn\tr\hadoop-common-project\hadoop-common\target\test\data\dfs\name2\current\edits_inprogress_0000000000000000001)
      </td></tr>
      </table></div>
      <hr/>
      <h3> NameNode Storage: </h3><div class="dfstable"> <table class="storage" title="NameNode Storage">
      <thead><tr><td><b>Storage Directory</b></td><td><b>Type</b></td><td><b>State</b></td></tr></thead><tr><td>I:\svn\tr\hadoop-common-project\hadoop-common\target\test\data\dfs\name1</td><td>IMAGE_AND_EDITS</td><td>Active</td></tr><tr><td>I:\svn\tr\hadoop-common-project\hadoop-common\target\test\data\dfs\name2</td><td>IMAGE_AND_EDITS</td><td>Active</td></tr></table></div>
      <hr>
      <hr />
      <a href='http://hadoop.apache.org/core'>Hadoop</a>, 2013.
      </body></html>

      at org.junit.Assert.fail(Assert.java:91)
      at org.junit.Assert.assertTrue(Assert.java:43)
      at org.apache.hadoop.hdfs.server.namenode.TestHostsFiles.testHostsExcludeDfshealthJsp(TestHostsFiles.java:127)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
      at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
      at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
      at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
      at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
      at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
      at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
      at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
      at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
      at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
      at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
      at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
      at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
      at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
      at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
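      The failure mode is visible in the dump above: the test asserts that the dfshealth page contains the literal string "localhost", but on Windows the page renders the NameNode by its numeric address (127.0.0.1:53373), so the substring check fails. A minimal sketch of that kind of check follows — it is not the actual TestHostsFiles code, and the class and method names are hypothetical, chosen only to illustrate the mismatch:

      ```java
      // Hedged sketch (names hypothetical, not the real TestHostsFiles code):
      // shows why a substring assertion on "localhost" fails when the page
      // reports the NameNode by numeric address, as in the dump above.
      public class DfshealthSketch {

          // Returns true when the rendered page mentions the given host string.
          static boolean pageContainsHost(String pageHtml, String host) {
              return pageHtml.contains(host);
          }

          public static void main(String[] args) {
              // Fragment of the page body quoted in the failure above.
              String page = "<h1>NameNode '127.0.0.1:53373' (active)</h1>";

              System.out.println(pageContainsHost(page, "127.0.0.1"));
              System.out.println(pageContainsHost(page, "localhost"));
          }
      }
      ```

      Matching on the numeric address succeeds, while the "localhost" check is the one that raises the AssertionError reported in this issue; the attached patch addresses the test's hostname expectation on Windows.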

        Attachments

        1. HDFS-4582.trunk.patch
          1 kB
          Ivan Mitic


          People

          • Assignee: Ivan Mitic (ivanmi)
          • Reporter: Ivan Mitic (ivanmi)
          • Votes: 0
          • Watchers: 4
