HBase / HBASE-10592

Refactor PerformanceEvaluation tool

    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.96.2, 0.98.1, 0.99.0
    • Fix Version/s: 0.96.2, 0.98.1, 0.99.0, hbase-10070
    • Component/s: test
    • Labels:
      None
    • Hadoop Flags:
      Reviewed

      Description

      PerfEval is kind of a mess. It's painful to add new features because the test options are itemized and passed as parameters to internal methods. Serialization is hand-rolled and tedious. Ensuring support for mapreduce mode is a chore because of it.

      This patch refactors the tool. Options are now passed to methods as a single POJO instead of one-by-one. Accessors that don't help anyone are removed. On the mapreduce side, serialization is now handled with JSON (Jackson is already a dependency) instead of the hand-rolled regex used before. The custom InputSplit and FileFormat are also gone, replaced by Text and NLineInputFormat. On the local-mode side, the 1-client and N-client paths are combined into a single implementation that uses an ExecutorService, so we can later decouple the number of client workers from the number of client tasks. Finally, a bunch of confusing local state is dropped; the new TestOptions POJO is passed as a parameter to static methods instead.
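      As a rough illustration of the new MR-mode plumbing, the options round-trip looks something like the sketch below. This is a minimal standalone example using Jackson 1.x (org.codehaus.jackson); the TestOptions fields shown here are illustrative stand-ins, not the exact set defined in the patch.

      import java.io.IOException;
      import org.codehaus.jackson.map.ObjectMapper;

      public class TestOptionsJsonSketch {
        // Simplified stand-in for PerformanceEvaluation.TestOptions; field names are illustrative.
        public static class TestOptions {
          private String cmdName = "randomRead";
          private int numClientThreads = 1;
          private long perClientRunRows = 1024 * 1024;

          public String getCmdName() { return cmdName; }
          public void setCmdName(String cmdName) { this.cmdName = cmdName; }
          public int getNumClientThreads() { return numClientThreads; }
          public void setNumClientThreads(int n) { this.numClientThreads = n; }
          public long getPerClientRunRows() { return perClientRunRows; }
          public void setPerClientRunRows(long rows) { this.perClientRunRows = rows; }
        }

        public static void main(String[] args) throws IOException {
          ObjectMapper mapper = new ObjectMapper();
          // Driver side: write one JSON-encoded TestOptions per line into the job input file,
          // so NLineInputFormat hands each map task exactly one configuration to run.
          String json = mapper.writeValueAsString(new TestOptions());
          // Mapper side: rebuild the POJO from the Text value instead of hand-rolled parsing.
          TestOptions opts = mapper.readValue(json, TestOptions.class);
          System.out.println(opts.getCmdName() + " rows=" + opts.getPerClientRunRows());
        }
      }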

      Attachments

      1. HBASE-10592.00.patch
        52 kB
        Nick Dimiduk
      2. HBASE-10592.00-0.98.patch
        52 kB
        Nick Dimiduk
      3. HBASE-10592.00-0.96.patch
        52 kB
        Nick Dimiduk
      4. HBASE-10592.01-0.96.patch
        53 kB
        Nick Dimiduk
      5. HBASE-10592.01-0.98.patch
        53 kB
        Nick Dimiduk
      6. HBASE-10592.01.patch
        53 kB
        Nick Dimiduk

          Activity

          Nick Dimiduk added a comment -

          The EvaluationMapTask is designed to be overridden by a user-provided class, so technically this patch breaks API compatibility, meaning it shouldn't be applied to 0.96 or 0.98. However, it's only a testing tool, so it won't break anyone's production code. This change makes the tool much easier to enhance, as I'll demonstrate on HBASE-9953, so I'd prefer to get it applied to the previous branches anyway. However, it's up to you, stack and Andrew Purtell.

          Nick Dimiduk added a comment -

          Jean-Marc Spaggiari mind taking this for a spin on your real cluster? Here's a patch vs. 0.98, maybe it'll be easier to use.

          Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12630504/HBASE-10592.00-0.98.patch
          against trunk revision .
          ATTACHMENT ID: 12630504

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 3 new or modified tests.

          -1 patch. The patch command could not apply the patch.

          Console output: https://builds.apache.org/job/PreCommit-HBASE-Build/8776//console

          This message is automatically generated.

          Nick Dimiduk added a comment -

          And here's a patch for 0.96. The changes touching TestHFileOutputFormat were... surprising. We should unwind that dependence in another issue.

          Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12630510/HBASE-10592.00-0.96.patch
          against trunk revision .
          ATTACHMENT ID: 12630510

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 9 new or modified tests.

          -1 patch. The patch command could not apply the patch.

          Console output: https://builds.apache.org/job/PreCommit-HBASE-Build/8777//console

          This message is automatically generated.

          stack added a comment -

          +1 on making a testing tool easier to customize. +1 on 0.96.

          Jean-Marc Spaggiari added a comment -

          Will test that tomorrow for sure.

          Currently actively testing 0.94.17 so I'm delayed a bit...

          Andrew Purtell added a comment -

          +1 for 0.98.

          I don't believe we should expose any kind of user-facing API from code in test/. If we want to make PE something that users are encouraged to build upon, the sources should first be moved from test/ to main/, and then interface annotations should be added.
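
          As a sketch of what those annotations might look like once PE lives in main/ (this is illustrative only: the annotation classes shown are the Hadoop ones used elsewhere in HBase at the time, and whether PE should be Public, LimitedPrivate, Stable, or Evolving is an open question, not something decided here):

          import org.apache.hadoop.classification.InterfaceAudience;
          import org.apache.hadoop.classification.InterfaceStability;

          // Hypothetical annotation of PerformanceEvaluation after a move from test/ to main/.
          @InterfaceAudience.Public
          @InterfaceStability.Evolving
          public class PerformanceEvaluation {
            // Extension points such as EvaluationMapTask would carry their own annotations,
            // so users know which pieces are safe to subclass and which may still change.
          }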

          stack added a comment -

          Would it be hard to move it to main Nick?

          Andrew Purtell added a comment -

          Would it be hard to move it to main Nick?

          +1, I'm sensing this is closer to how we want to treat PE than leaving it in test/.

          Nick Dimiduk added a comment -

          I can move it over, yes. Some basic e2e tests should be added as well. Do you want that as part of this patch, or in a separate issue?

          stack added a comment -

          Nick Dimiduk Separate issue, I'd say. PE should be more amenable, especially if it is getting this fancy spruce-up job.

          Nick Dimiduk added a comment -

          stack Very good. Opened HBASE-10610.

          Nick Dimiduk added a comment -

          Tested the 0.96 patch on a 0.96.1/secure cluster; sequentialWrite and randomRead work fine in MR mode. Tested the 0.98 patch on a 0.98.0/secure cluster, but I get:

          2014-02-26 01:13:33,870 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoClassDefFoundError: org/apache/commons/math/stat/descriptive/DescriptiveStatistics
          	at org.apache.hadoop.hbase.PerformanceEvaluation$RandomReadTest.testTakedown(PerformanceEvaluation.java:781)
          	at org.apache.hadoop.hbase.PerformanceEvaluation$Test.test(PerformanceEvaluation.java:584)
          	at org.apache.hadoop.hbase.PerformanceEvaluation.runOneClient(PerformanceEvaluation.java:1017)
          	at org.apache.hadoop.hbase.PerformanceEvaluation$EvaluationMapTask.map(PerformanceEvaluation.java:236)
          	at org.apache.hadoop.hbase.PerformanceEvaluation$EvaluationMapTask.map(PerformanceEvaluation.java:187)
          	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
          	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
          	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
          	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
          	at java.security.AccessController.doPrivileged(Native Method)
          	at javax.security.auth.Subject.doAs(Subject.java:396)
          	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
          	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
          Caused by: java.lang.ClassNotFoundException: org.apache.commons.math.stat.descriptive.DescriptiveStatistics
          	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
          	at java.security.AccessController.doPrivileged(Native Method)
          	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
          	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
          	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
          	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
          	... 13 more
          

          The following patch fixes it

          diff -u b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          --- b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          +++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          @@ -392,9 +392,8 @@
               TextOutputFormat.setOutputPath(job, new Path(inputDir.getParent(), "outputs"));
           
               TableMapReduceUtil.addDependencyJars(job);
          -    // Add a Class from the hbase.jar so it gets registered too.
               TableMapReduceUtil.addDependencyJars(job.getConfiguration(),
          -      org.apache.hadoop.hbase.util.Bytes.class);
          +      DescriptiveStatistics.class);
           
               TableMapReduceUtil.initCredentials(job);
          

          I'm not sure why this is necessary on 0.98 but not 0.96.

          Nick Dimiduk added a comment -

          commons-math3 is the only version of the library provided by hadoop classpath. I guess we need to "do the right thing" and explicitly include this jar in the job.

          Nick Dimiduk added a comment -

          Updated the patches to do the right thing for both commons-math and my other new dependency, jackson. The interdiff looks like this:

          diff -u b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          --- b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          +++ b/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          @@ -393,9 +393,9 @@
               TextOutputFormat.setOutputPath(job, new Path(inputDir.getParent(), "outputs"));
           
               TableMapReduceUtil.addDependencyJars(job);
          -    // Add a Class from the hbase.jar so it gets registered too.
               TableMapReduceUtil.addDependencyJars(job.getConfiguration(),
          -      org.apache.hadoop.hbase.util.Bytes.class);
          +      DescriptiveStatistics.class, // commons-math
          +      ObjectMapper.class);         // jackson-mapper-asl
           
               TableMapReduceUtil.initCredentials(job);
           
          
          Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12631126/HBASE-10592.01.patch
          against trunk revision .
          ATTACHMENT ID: 12631126

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 3 new or modified tests.

          +1 hadoop1.0. The patch compiles against the hadoop 1.0 profile.

          +1 hadoop1.1. The patch compiles against the hadoop 1.1 profile.

          +1 javadoc. The javadoc tool did not generate any warning messages.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 lineLengths. The patch does not introduce lines longer than 100

          +1 site. The mvn site goal succeeds with this patch.

          +1 core tests. The patch passed unit tests in .

          Test results: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//testReport/
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-protocol.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-thrift.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-client.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-hadoop2-compat.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-examples.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-prefix-tree.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-common.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-server.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-hadoop-compat.html
          Console output: https://builds.apache.org/job/PreCommit-HBASE-Build/8809//console

          This message is automatically generated.

          stack added a comment -

          Go for it

          Jean-Marc Spaggiari added a comment -

          Let me know when you think it will be ready for me to give it a try...

          Nick Dimiduk added a comment -

          Per my testing, it's ready. I was just about to commit. Want to take it for a spin before I do so, Jean-Marc Spaggiari?

          Jean-Marc Spaggiari added a comment -

          Will test it in 0.96 right now.

          Jean-Marc Spaggiari added a comment -

          Works well for me in 0.96.
          With your patch:

          hbase@hbasetest1:~$  bin/hbase org.apache.hadoop.hbase.PerformanceEvaluation --presplit=6 --rows=2097152 randomWrite 1
          2014-02-26 15:22:28,980 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
          2014-02-26 15:22:29,799 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x52c03865 connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:22:30,360 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x5814caa8 connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:22:30,391 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=catalogtracker-on-hconnection-0x5814caa8 connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:22:32,137 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=catalogtracker-on-hconnection-0x5814caa8 connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:22:32,190 INFO  [main] hbase.PerformanceEvaluation: Table created with 6 splits
          2014-02-26 15:22:32,197 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=catalogtracker-on-hconnection-0x5814caa8 connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:22:32,221 INFO  [main] hbase.PerformanceEvaluation: Start class org.apache.hadoop.hbase.PerformanceEvaluation$RandomWriteTest at offset 0 for 2097152 rows
          2014-02-26 15:22:32,298 INFO  [main] hbase.PerformanceEvaluation: Timed test starting in thread main
          2014-02-26 15:22:52,933 INFO  [main] hbase.PerformanceEvaluation: 0/209715/2097152
          2014-02-26 15:23:06,678 INFO  [main] hbase.PerformanceEvaluation: 0/419430/2097152
          2014-02-26 15:23:22,174 INFO  [main] hbase.PerformanceEvaluation: 0/629145/2097152
          2014-02-26 15:23:39,249 INFO  [main] hbase.PerformanceEvaluation: 0/838860/2097152
          2014-02-26 15:23:54,392 INFO  [main] hbase.PerformanceEvaluation: 0/1048575/2097152
          2014-02-26 15:24:07,971 INFO  [main] hbase.PerformanceEvaluation: 0/1258290/2097152
          2014-02-26 15:24:26,551 INFO  [main] hbase.PerformanceEvaluation: 0/1468005/2097152
          2014-02-26 15:24:38,495 INFO  [main] hbase.PerformanceEvaluation: 0/1677720/2097152
          2014-02-26 15:24:51,138 INFO  [main] hbase.PerformanceEvaluation: 0/1887435/2097152
          2014-02-26 15:25:09,124 INFO  [main] hbase.PerformanceEvaluation: 0/2097150/2097152
          2014-02-26 15:25:09,360 INFO  [main] hbase.PerformanceEvaluation: Finished class org.apache.hadoop.hbase.PerformanceEvaluation$RandomWriteTest in 157058ms at offset 0 for 2097152 rows (13,17 MB/s)
          2014-02-26 15:25:09,360 INFO  [main] client.HConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
          2014-02-26 15:25:09,360 INFO  [main] client.HConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x14426a37e621805
          

          Without the patch

          hbase@hbasetest1:~$ bin/hbase org.apache.hadoop.hbase.PerformanceEvaluation --presplit=6 --rows=2097152 randomWrite 1
          2014-02-26 15:29:57,987 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
          2014-02-26 15:29:58,663 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
          2014-02-26 15:29:58,832 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x545aae15 connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:29:59,400 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x4f7540bd connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:29:59,436 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=catalogtracker-on-hconnection-0x4f7540bd connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:30:00,665 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=catalogtracker-on-hconnection-0x4f7540bd connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:30:00,706 INFO  [main] hbase.PerformanceEvaluation: Table created with 6 splits
          2014-02-26 15:30:00,719 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=catalogtracker-on-hconnection-0x4f7540bd connecting to ZooKeeper ensemble=hbasetest1.distparser.com:2181
          2014-02-26 15:30:00,765 INFO  [main] hbase.PerformanceEvaluation: Start class org.apache.hadoop.hbase.PerformanceEvaluation$RandomWriteTest at offset 0 for 2097152 rows
          2014-02-26 15:30:00,827 INFO  [main] hbase.PerformanceEvaluation: Timed test starting in thread main
          2014-02-26 15:30:20,306 INFO  [main] hbase.PerformanceEvaluation: 0/209715/2097152
          2014-02-26 15:30:34,670 INFO  [main] hbase.PerformanceEvaluation: 0/419430/2097152
          2014-02-26 15:30:48,351 INFO  [main] hbase.PerformanceEvaluation: 0/629145/2097152
          2014-02-26 15:31:04,139 INFO  [main] hbase.PerformanceEvaluation: 0/838860/2097152
          2014-02-26 15:31:19,112 INFO  [main] hbase.PerformanceEvaluation: 0/1048575/2097152
          2014-02-26 15:31:31,711 INFO  [main] hbase.PerformanceEvaluation: 0/1258290/2097152
          2014-02-26 15:31:50,578 INFO  [main] hbase.PerformanceEvaluation: 0/1468005/2097152
          2014-02-26 15:32:03,861 INFO  [main] hbase.PerformanceEvaluation: 0/1677720/2097152
          2014-02-26 15:32:16,284 INFO  [main] hbase.PerformanceEvaluation: 0/1887435/2097152
          2014-02-26 15:32:32,913 INFO  [main] hbase.PerformanceEvaluation: 0/2097150/2097152
          2014-02-26 15:32:33,245 INFO  [main] hbase.PerformanceEvaluation: Finished class org.apache.hadoop.hbase.PerformanceEvaluation$RandomWriteTest in 152415ms at offset 0 for 2097152 rows (13,57 MB/s)
          2014-02-26 15:32:33,245 INFO  [main] client.HConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
          2014-02-26 15:32:33,246 INFO  [main] client.HConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x14426a37e621818
          

          Similar output, similar result. Looked at the code too.
          +1

          Nick Dimiduk added a comment -

          Jean-Marc Spaggiari thanks for taking it for a spin!

          I've committed to all three branches. Thanks for the reviews folks.

          Hudson added a comment -

          FAILURE: Integrated in HBase-TRUNK #4960 (See https://builds.apache.org/job/HBase-TRUNK/4960/)
          HBASE-10592 Refactor PerformanceEvaluation tool (ndimiduk: rev 1572286)

          • /hbase/trunk/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          Hudson added a comment -

          SUCCESS: Integrated in hbase-0.96-hadoop2 #218 (See https://builds.apache.org/job/hbase-0.96-hadoop2/218/)
          HBASE-10592 Refactor PerformanceEvaluation tool (ndimiduk: rev 1572293)

          • /hbase/branches/0.96/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          • /hbase/branches/0.96/hbase-server/src/test/java/org/apache/hadoop/hbase/mapreduce/TestHFileOutputFormat.java
          • /hbase/branches/0.96/hbase-server/src/test/java/org/apache/hadoop/hbase/mapreduce/TestHFileOutputFormat2.java
          Hudson added a comment -

          FAILURE: Integrated in HBase-TRUNK-on-Hadoop-1.1 #100 (See https://builds.apache.org/job/HBase-TRUNK-on-Hadoop-1.1/100/)
          HBASE-10592 Refactor PerformanceEvaluation tool (ndimiduk: rev 1572286)

          • /hbase/trunk/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          Hudson added a comment -

          SUCCESS: Integrated in HBase-0.98 #188 (See https://builds.apache.org/job/HBase-0.98/188/)
          HBASE-10592 Refactor PerformanceEvaluation tool (ndimiduk: rev 1572289)

          • /hbase/branches/0.98/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          Hudson added a comment -

          FAILURE: Integrated in HBase-0.98-on-Hadoop-1.1 #176 (See https://builds.apache.org/job/HBase-0.98-on-Hadoop-1.1/176/)
          HBASE-10592 Refactor PerformanceEvaluation tool (ndimiduk: rev 1572289)

          • /hbase/branches/0.98/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          Hudson added a comment -

          FAILURE: Integrated in hbase-0.96 #318 (See https://builds.apache.org/job/hbase-0.96/318/)
          HBASE-10592 Refactor PerformanceEvaluation tool (ndimiduk: rev 1572293)

          • /hbase/branches/0.96/hbase-server/src/test/java/org/apache/hadoop/hbase/PerformanceEvaluation.java
          • /hbase/branches/0.96/hbase-server/src/test/java/org/apache/hadoop/hbase/mapreduce/TestHFileOutputFormat.java
          • /hbase/branches/0.96/hbase-server/src/test/java/org/apache/hadoop/hbase/mapreduce/TestHFileOutputFormat2.java
          Nicolas Liochon added a comment -

          Committed to the branch 10070 as well.

          stack added a comment -

          When I try to run an MR job I get this:

          4/07/15 22:02:27 INFO mapreduce.Job: Task Id : attempt_1405482830657_0004_m_000015_2, Status : FAILED
          Error: org.codehaus.jackson.map.exc.UnrecognizedPropertyException: Unrecognized field "blockEncoding" (Class org.apache.hadoop.hbase.PerformanceEvaluation$TestOptions), not marked as ignorable
           at [Source: java.io.StringReader@41c7d592; line: 1, column: 37] (through reference chain: org.apache.hadoop.hbase.TestOptions["blockEncoding"])
          	at org.codehaus.jackson.map.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:53)
          	at org.codehaus.jackson.map.deser.StdDeserializationContext.unknownFieldException(StdDeserializationContext.java:246)
          	at org.codehaus.jackson.map.deser.StdDeserializer.reportUnknownProperty(StdDeserializer.java:604)
          	at org.codehaus.jackson.map.deser.StdDeserializer.handleUnknownProperty(StdDeserializer.java:590)
          	at org.codehaus.jackson.map.deser.BeanDeserializer.handleUnknownProperty(BeanDeserializer.java:689)
          	at org.codehaus.jackson.map.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:514)
          	at org.codehaus.jackson.map.deser.BeanDeserializer.deserialize(BeanDeserializer.java:350)
          	at org.codehaus.jackson.map.ObjectMapper._readMapAndClose(ObjectMapper.java:2402)
          	at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1602)
          	at org.apache.hadoop.hbase.PerformanceEvaluation$EvaluationMapTask.map(PerformanceEvaluation.java:255)
          	at org.apache.hadoop.hbase.PerformanceEvaluation$EvaluationMapTask.map(PerformanceEvaluation.java:210)
          	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
          	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
          	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
          	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
          	at java.security.AccessController.doPrivileged(Native Method)
          	at javax.security.auth.Subject.doAs(Subject.java:415)
          	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
          	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
          

          If I add a setter, it works. Does the new JSON serialization want setters for all properties?
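
          For reference, both workarounds look roughly like this with Jackson 1.x (org.codehaus.jackson, as used by the tool). This is a sketch only, not the fix that went into HBASE-11523:

          import org.codehaus.jackson.map.DeserializationConfig;
          import org.codehaus.jackson.map.ObjectMapper;

          public class LenientMapperSketch {
            public static void main(String[] args) throws Exception {
              ObjectMapper mapper = new ObjectMapper();
              // Option A (what the error message suggests): give every serialized property on
              // TestOptions a matching setter so the bean deserializer can bind it.
              // Option B: tell the mapper to skip JSON fields the target bean cannot bind,
              // instead of throwing UnrecognizedPropertyException. Note this silently drops
              // those values, so adding setters is the safer fix.
              mapper.configure(DeserializationConfig.Feature.FAIL_ON_UNKNOWN_PROPERTIES, false);
            }
          }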

          stack added a comment -

          I added HBASE-11523 with a suggested fix.

          Enis Soztutar added a comment -

          Closing this issue after 0.99.0 release.


            People

            • Assignee:
              Nick Dimiduk
              Reporter:
              Nick Dimiduk
            • Votes:
               0
               Watchers:
               10
