diff --git src/docbkx/developer.xml src/docbkx/developer.xml
index 1b2852e..97efc30 100644
--- src/docbkx/developer.xml
+++ src/docbkx/developer.xml
@@ -310,11 +310,11 @@ given the dependency tree).
Unit Tests
-HBase unit tests are subdivided into three categories: small, medium and large, with
+HBase unit tests are subdivided into four categories: small, medium, large, and integration with
corresponding JUnit categories: SmallTests, MediumTests,
-LargeTests. JUnit categories are denoted using java annotations
-and look like this in your unit test code.
+LargeTests, IntegrationTests.
+JUnit categories are denoted using Java annotations and look like this in your unit test code.
...
@Category(SmallTests.class)
public class TestHRegionInfo {
@@ -352,13 +352,20 @@ individually. They can use a cluster, and each of them is executed in a separate
<indexterm><primary>LargeTests</primary></indexterm>
-Large tests are everything else. They are typically integration-like
+Large tests are everything else. They are typically large-scale
tests, regression tests for specific bugs, timeout tests, performance
tests. They are executed before a commit on the pre-integration
machines. They can be run on the developer machine as well.
+
+<indexterm><primary>IntegrationTests</primary></indexterm>
+Integration tests are system level tests. See the
+integration tests section for more information.
+
+
+
Running tests
Below we describe how to run the HBase JUnit categories.
@@ -486,6 +493,97 @@ As much as possible, tests should use the default settings for the cluster. When
+
+
+Integration Tests
+HBase integration/system tests are tests that are beyond HBase unit tests. They
+are generally long-lasting and sizeable (the test can be asked to process 1M rows or 1B rows),
+and they are targetable (they can take configuration that points them at the ready-made cluster
+they are to run against; integration tests do not include cluster start/stop code).
+When verifying success, integration tests rely on public APIs only; they do not
+attempt to examine server internals to assert success or failure. Integration tests
+are what you would run when you need more elaborate proofing of a release candidate
+beyond what unit tests can do. They are not generally run on the Apache Continuous Integration
+build server; however, some sites opt to run integration tests as a part of their
+continuous testing against an actual cluster.
+
+
+Integration tests currently live under the src/test directory
+in the hbase-it submodule and match the naming pattern **/IntegrationTest*.java.
+All integration tests are also annotated with @Category(IntegrationTests.class).
+
+
+Integration tests can be run in two modes: using a mini cluster, or against an actual distributed cluster.
+Maven failsafe is used to run the tests using the mini cluster, while the IntegrationTestsDriver class is used
+to execute them against a distributed cluster. Integration tests SHOULD NOT assume that they are running against a
+mini cluster, and SHOULD NOT use private APIs to access cluster state. To interact with the distributed or mini
+cluster uniformly, the HBaseIntegrationTestingUtility and HBaseCluster classes,
+as well as the public client APIs, can be used.
+
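+
+As an illustration of these conventions, below is a minimal sketch of what an
+integration test class might look like. Only the naming pattern and the JUnit
+category annotation follow the rules above; the package, class name, and test
+body are hypothetical placeholders.
+
+package org.apache.hadoop.hbase;
+
+import org.junit.Test;
+import org.junit.experimental.categories.Category;
+
+// The class name starts with "IntegrationTest" so it matches
+// **/IntegrationTest*.java, and it carries the IntegrationTests category.
+@Category(IntegrationTests.class)
+public class IntegrationTestExample {
+
+  @Test
+  public void testAgainstRunningCluster() throws Exception {
+    // Do not start or stop the cluster here; the test is pointed at an
+    // already-running (mini or distributed) cluster via configuration.
+    // Interact with the cluster through public client APIs only and make
+    // assertions on what is visible through those APIs.
+  }
+}
+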
+Running integration tests against mini cluster
+HBase 0.92 added a verify Maven target.
+Invoking it, for example by doing mvn verify, will
+run all the phases up to and including the verify phase via the
+Maven failsafe plugin,
+running all the above-mentioned HBase unit tests as well as tests that are in the HBase integration test group.
+After you have completed
+  mvn install -DskipTests
+you can run just the integration tests by invoking:
+
+cd hbase-it
+mvn verify
+
+If you just want to run the integration tests at the top level, you need to run two commands. First:
+  mvn failsafe:integration-test
+This actually runs ALL the integration tests.
+This command will always output BUILD SUCCESS even if there are test failures.
+
+At this point, you could grep the output by hand looking for failed tests. However, Maven will do this for us; just use:
+  mvn failsafe:verify
+The above command basically looks at all the test results (so don't remove the 'target' directory) for test failures and reports the results.
+
+Running a subset of Integration tests
+This is very similar to how you specify running a subset of unit tests (see above), but use the property
+it.test instead of test.
+To just run IntegrationTestClassXYZ.java, use:
+  mvn failsafe:integration-test -Dit.test=IntegrationTestClassXYZ
+Pretty similar, right?
+The next thing you might want to do is run groups of integration tests, say all integration tests that are named IntegrationTestClassX*.java:
+  mvn failsafe:integration-test -Dit.test=*ClassX*
+This runs everything that is an integration test that matches *ClassX*. This means anything matching: "**/IntegrationTest*ClassX*".
+You can also run multiple groups of integration tests using comma-delimited lists (similar to unit tests). Using a list of matches still supports full regex matching for each of the groups. This would look something like:
+  mvn failsafe:integration-test -Dit.test=*ClassX*, *ClassY
+
+
+
+Running integration tests against distributed cluster
+
+If you have an already-setup HBase cluster, you can launch the integration tests by invoking the class IntegrationTestsDriver. You may have to
+run test-compile first:
+  mvn test-compile
+Then launch the tests with:
+  bin/hbase [--config config_dir] org.apache.hadoop.hbase.IntegrationTestsDriver
+
+This execution will launch the tests under hbase-it/src/test that have the @Category(IntegrationTests.class) annotation
+and a name starting with IntegrationTest. It uses JUnit to run the tests. Currently there is no support for running integration tests against a distributed cluster using Maven (see HBASE-6201).
+
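+
+To make "It uses JUnit to run the tests" concrete, here is a rough, conceptual
+sketch of what such a driver does. It is not the IntegrationTestsDriver source;
+the discovery helper shown is a hypothetical placeholder.
+
+import org.junit.runner.JUnitCore;
+import org.junit.runner.Result;
+
+public class DriverSketch {
+  public static void main(String[] args) {
+    // The real driver discovers integration test classes on the classpath;
+    // here the discovery step is a hypothetical placeholder.
+    Class<?>[] testClasses = discoverIntegrationTestClasses();
+    // Run the discovered classes with JUnit and exit non-zero on failure.
+    Result result = JUnitCore.runClasses(testClasses);
+    System.exit(result.wasSuccessful() ? 0 : 1);
+  }
+
+  private static Class<?>[] discoverIntegrationTestClasses() {
+    // Classpath scanning for IntegrationTest* classes is elided in this sketch.
+    return new Class<?>[0];
+  }
+}
+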
+ +
+Destructive integration / system tests
+
+In 0.96, a tool named ChaosMonkey was introduced. It is modeled after the same-named tool by Netflix.
+Some of the tests use ChaosMonkey to simulate faults in the running cluster by killing random servers,
+disconnecting servers, etc. ChaosMonkey can also be used as a stand-alone tool to run a (misbehaving) policy while you
+are running other tests.
+
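+
+As a rough illustration only, here is the general shape of a destructive test
+that runs a fault-injection policy alongside a workload. The ChaosMonkey
+constructor, the policy constant, the start/stop methods, and the utility class
+shown are assumptions modeled on hbase-it code of that era and may differ in
+your HBase version.
+
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+import org.junit.experimental.categories.Category;
+
+// All ChaosMonkey-related and utility names below are assumed, not verified
+// against a specific HBase release; treat this as a pattern, not copy-paste code.
+@Category(IntegrationTests.class)
+public class IntegrationTestLoadUnderFaults {
+  private HBaseIntegrationTestingUtility util = new HBaseIntegrationTestingUtility();
+  private ChaosMonkey monkey;
+
+  @Before
+  public void setUp() throws Exception {
+    // Begin injecting random faults (killing region servers, etc.) that will
+    // run in the background while the test workload executes.
+    monkey = new ChaosMonkey(util, ChaosMonkey.EVERY_MINUTE_RANDOM_ACTION_POLICY);
+    monkey.start();
+  }
+
+  @Test
+  public void testLoadWhileServersAreKilled() throws Exception {
+    // Write and then verify data through public client APIs while ChaosMonkey
+    // misbehaves in the background.
+  }
+
+  @After
+  public void tearDown() throws Exception {
+    monkey.stop("test finished");
+    monkey.waitForStop();
+  }
+}
+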
+