HDFS-475 (Hadoop HDFS)
Sub-task of HDFS-435: Add orthogonal fault injection mechanism/framework

Create separate targets for fault-injection-related tests and jar file creation

    Details

    • Type: Sub-task
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.21.0
    • Fix Version/s: 0.21.0
    • Component/s: build
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      The current implementation of the FI framework allows faults to be mixed into production classes, e.g. into the build/ folder.
      Although the default probability level is set to zero, this doesn't look clean and might potentially overcomplicate the build and release process.

      FI-related targets are better separated both logically and physically, e.g. by putting instrumented artifacts into a separate folder, say, build-fi/
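
      One way the proposed separation might look in build.xml (a rough sketch only; the property and target names here are illustrative assumptions, not the committed patch):

      ```xml
      <!-- Illustrative sketch: route fault-injected artifacts into build-fi/
           so the regular build/ tree never contains instrumented classes.
           Property and target names are assumptions, not the HDFS-475 patch itself. -->
      <property name="build-fi.dir" value="${basedir}/build-fi"/>

      <target name="compile-fault-inject" depends="compile-core">
        <mkdir dir="${build-fi.dir}/classes"/>
        <!-- Weave aspects into a copy of the classes under build-fi/ -->
        <iajc destDir="${build-fi.dir}/classes"
              sourceRoots="${java.src.dir};${test.src.dir}/aop">
          <classpath refid="classpath"/>
        </iajc>
      </target>
      ```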

      1. HDFS-475.patch
        3 kB
        Konstantin Boudnik
      2. HDFS-475.patch
        8 kB
        Konstantin Boudnik
      3. HDFS-475.patch
        8 kB
        Konstantin Boudnik
      4. HDFS-475.patch
        8 kB
        Konstantin Boudnik
      5. HDFS-475.patch
        8 kB
        Konstantin Boudnik
      6. h475_20090716ignore.patch
        0.5 kB
        Tsz Wo Nicholas Sze


          Activity

          Hudson added a comment -

          Integrated in Hadoop-Hdfs-trunk #25 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Hdfs-trunk/25/)
          . Add new ant targets for fault injection jars and tests. Contributed by Konstantin Boudnik

          Tsz Wo Nicholas Sze added a comment -

          I have committed this. Thanks, Cos!

          Tsz Wo Nicholas Sze added a comment -

          h475_20090716ignore.patch: add build-fi to the ignore lists.

          Tsz Wo Nicholas Sze added a comment -

          Also tried

          ant run-test-hdfs-fault-inject -Dtestcase=TestFiDataTransferProtocol
          

          with the patch posted in HDFS-483. It works.

          Note that the patch changes the ant target name as shown below.

          -  <target name="injectfaults" depends="compile" description="Weaves aspects into precomplied HDFS classes">
          +  <target name="compile-fault-inject" depends="compile-core, compile-hdfs-test">
          

          "injectfaults" was introduced by HDFS-436, which was committed to 0.21. I think this is not an incompatible change since 0.21 has not yet been released.

          Tsz Wo Nicholas Sze added a comment -

          +1 patch looks good.

          Tried all the ant targets listed above. It works fine.

          Konstantin Boudnik added a comment -

          The current list of fault injection targets looks as follows:

          jar-fault-inject                    Make hadoop-fi.jar
          jar-hdfs-test-fault-inject          Make hadoop-test-fi.jar
          jar-hdfswithmr-test-fault-inject    Make hadoop-hdfswithmr-test-fi.jar
          jar-test-fault-inject               Make hadoop-test.jar files
          run-test-hdfs-fault-inject          Run Fault Injection related hdfs tests
          run-test-hdfs-with-mr-fault-inject  Run hdfs Fault Injection related unit tests that require mapred
          
          Konstantin Boudnik added a comment -

          Including modifications of the target names as mentioned by HDFS-476

          Konstantin Boudnik added a comment -

          Removing unnecessary extra-indentation

          Konstantin Boudnik added a comment -

          HDFS-476 will be fixed by this patch as well, although in a slightly different manner. Now, in order to run tests with injected faults, one doesn't need to run the 'injectfaults' target separately. It can be taken care of in a single command:

            ant run-hdfs-test-fi -DTestFiXxx
          

          I believe this is the essence of HDFS-476.

          Also, this new version of the patch provides new targets to create dev. and test jar files with included FI instrumentation.

          Please be advised that Hudson's test-patch output can't be provided for this patch, because HDFS's Hudson doesn't run the new targets right now.

          Konstantin Boudnik added a comment -

          The latest patch version includes Nicholas' modifications to include src/test/aop as a source for additional tests (the FI-specific ones). The only modification I had to make to the suggested addition was to change the dependency of the 'injectfaults' target from 'compile' to 'compile-core'.

          It doesn't look to me like contrib and ant-tasks are related to FI in any way. Please correct me if I'm wrong.

          Tsz Wo Nicholas Sze added a comment -

          Looks like your patch also fixes HDFS-476. Is that the case?

          Tsz Wo Nicholas Sze added a comment -

          In HDFS-483, I implemented some tests which use fi. I had to change the test target as follows in order to run the tests. Do you want to add these changes to your patch?

          @@ -318,7 +318,7 @@
             <!-- Weaving aspects in place 
             	Later on one can run 'ant jar' to create Hadoop jar file with instrumented classes
             -->
          -  <target name="injectfaults" depends="compile" description="Weaves aspects into precomplied HDFS classes">
          +  <target name="injectfaults" depends="compile, compile-hdfs-test" description="Weaves aspects into precomplied HDFS classes">
               <!-- AspectJ task definition -->
               <taskdef resource="org/aspectj/tools/ant/taskdefs/aspectjTaskdefs.properties">
                 <classpath>
          @@ -335,7 +335,7 @@
                 target="${javac.version}"
                 source="${javac.version}"
                 deprecation="${javac.deprecation}">
          -        <classpath refid="classpath" />
          +        <classpath refid="test.classpath" />
               </iajc>
               <echo message="Weaving of aspects is finished"/>
             </target>
          @@ -500,10 +500,14 @@
                 <batchtest todir="${test.build.dir}" unless="testcase">
                   <fileset dir="${test.src.dir}/hdfs"
                      includes="**/${test.include}.java"
          -     excludes="**/${test.exclude}.java" />
          +           excludes="**/${test.exclude}.java" />
          +        <fileset dir="${test.src.dir}/aop"
          +           includes="**/${test.include}.java"
          +           excludes="**/${test.exclude}.java" />
                 </batchtest>
                 <batchtest todir="${test.build.dir}" if="testcase">
                   <fileset dir="${test.src.dir}/hdfs" includes="**/${testcase}.java"/>
          +        <fileset dir="${test.src.dir}/aop" includes="**/${testcase}.java"/>
                 </batchtest>
               </junit>
               <antcall target="checkfailure"/>
          
          Konstantin Boudnik added a comment -

          The patch adds the following high-level targets:

          • jar-fi Make hadoop-fi.jar
          • jar-hdfs-test-fi Make hadoop-test-fi.jar
          • jar-hdfswithmr-test-fi Make hadoop-test-fi.jar
          • jar-test-fi Make hadoop-test.jar
          • run-test-hdfs-fi Run FI related hdfs tests
          • run-test-hdfs-with-mr-fi Run hdfs FI related unit tests that require mapred
          Tsz Wo Nicholas Sze added a comment -

          +1 patch looks good. I will wait for one, two days before committing this.

          Konstantin Boudnik added a comment -

          The test failure is unrelated to the patch, for it has been broken since the very first build of HDFS:
          http://hudson.zones.apache.org/hudson/job/Hdfs-Patch-vesta.apache.org/0/

          Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12412910/HDFS-475.patch
          against trunk revision 792310.

          +1 @author. The patch does not contain any @author tags.

          -1 tests included. The patch doesn't appear to include any new or modified tests.
          Please justify why no new tests are needed for this patch.
          Also please list what manual steps were performed to verify this patch.

          +1 javadoc. The javadoc tool did not generate any warning messages.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 findbugs. The patch does not introduce any new Findbugs warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          -1 core tests. The patch failed core unit tests.

          -1 contrib tests. The patch failed contrib unit tests.

          Test results: http://hudson.zones.apache.org/hudson/job/Hdfs-Patch-vesta.apache.org/9/testReport/
          Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Hdfs-Patch-vesta.apache.org/9/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
          Checkstyle results: http://hudson.zones.apache.org/hudson/job/Hdfs-Patch-vesta.apache.org/9/artifact/trunk/build/test/checkstyle-errors.html
          Console output: http://hudson.zones.apache.org/hudson/job/Hdfs-Patch-vesta.apache.org/9/console

          This message is automatically generated.

          Konstantin Boudnik added a comment -

          The first bullet point has to be read as:

          • a new target 'jar-fi' is added to create a development jar file with the classes
            instrumented by injected faults. The name of this new jar file differs from
            the normal dev jar and looks like hadoop-hdfs-0.21.0-dev-fi.jar



          Konstantin Boudnik added a comment -

          This patch contains the following modifications to the build.xml file:

          • a new target 'jar-fi' is added to create a development jar file with the classes instrumented by injected faults. The name of this new jar file differs from the normal dev jar and looks like hadoop-hdfs-0.21.0-dev-fi.jar
          • target 'jar' has been modified to depend on 'clean' in order to guarantee that only non-fault-injected classes are built and included in the normal development jar file. The performance implications of the additional cleaning seem to be very low, if any.

          Target 'tar' hasn't been altered, for I don't see why FI'ed artifacts have to be included in the release tarball.
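
          The 'jar'-depends-on-'clean' change described above might be sketched roughly as follows (an illustrative fragment, not the actual patch text; the jar file and property names are assumed):

          ```xml
          <!-- Illustrative sketch: making 'jar' depend on 'clean' guarantees the
               normal development jar is built from a fresh, non-instrumented
               build/ tree rather than one left over from a fault-injected build. -->
          <target name="jar" depends="clean, compile-core" description="Make hadoop.jar">
            <jar jarfile="${build.dir}/${final.name}.jar"
                 basedir="${build.classes}"/>
          </target>
          ```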


            People

            • Assignee: Konstantin Boudnik
            • Reporter: Konstantin Boudnik
            • Votes: 0
            • Watchers: 3
