[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - Unit Tests 4.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT is missing, no dependency information available
[WARNING] The POM for org.glassfish:javax.el:jar:3.0.1-b07-SNAPSHOT is missing, no dependency information available
[WARNING] The POM for org.glassfish:javax.el:jar:3.0.1-b08-SNAPSHOT is missing, no dependency information available
[WARNING] The POM for org.glassfish:javax.el:jar:3.0.1-b11-SNAPSHOT is missing, no dependency information available
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ hive-it-unit ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-banned-dependencies-licenses) @ hive-it-unit ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-banned-dependencies-logging) @ hive-it-unit ---
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (download-spark) @ hive-it-unit ---
[INFO] Executing tasks

main:
     [exec] /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit
     [exec] + /bin/pwd
     [exec] + BASE_DIR=./target
     [exec] + HIVE_ROOT=./target/../../../
     [exec] + DOWNLOAD_DIR=./../thirdparty
     [exec] + SPARK_VERSION=2.3.0
     [exec] + mkdir -p ./../thirdparty
     [exec] + download http://d3jw87u4immizc.cloudfront.net/spark-tarball/spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz spark
     [exec] + url=http://d3jw87u4immizc.cloudfront.net/spark-tarball/spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz
     [exec] + finalName=spark
     [exec] ++ basename http://d3jw87u4immizc.cloudfront.net/spark-tarball/spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz
     [exec] + tarName=spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz
     [exec] + rm -rf ./target/spark
     [exec] + [[ ! -f ./../thirdparty/spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz ]]
     [exec] + local md5File=spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz.md5sum
     [exec] + curl -Sso ./../thirdparty/spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz.md5sum http://d3jw87u4immizc.cloudfront.net/spark-tarball/spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz.md5sum
     [exec] + cd ./../thirdparty
     [exec] + type md5sum
     [exec] + md5sum -c spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz.md5sum
     [exec] spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz: OK
     [exec] /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit
     [exec] + type md5
     [exec] ../bin/download-spark.sh: line 15: type: md5: not found
     [exec] + cd -
     [exec] + tar -zxf ./../thirdparty/spark-2.3.0-bin-hadoop3-beta1-without-hive.tgz -C ./target
     [exec] + mv ./target/spark-2.3.0-bin-hadoop3-beta1-without-hive ./target/spark
     [exec] + cp -f ./target/../../..//data/conf/spark/log4j2.properties ./target/spark/conf/
[INFO] Executed tasks
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) @ hive-it-unit ---
[INFO]
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ hive-it-unit ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-unit ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.6.1:compile (default-compile) @ hive-it-unit ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ hive-it-unit ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/src/test/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-unit ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp
   [delete] Deleting directory /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/testconf
   [delete] Deleting directory /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/warehouse
    [mkdir] Created dir: /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp
    [mkdir] Created dir: /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/warehouse
    [mkdir] Created dir: /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/testconf
     [copy] Copying 19 files to /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/testconf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-metastore-scripts) @ hive-it-unit ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/scripts/metastore
     [copy] Copying 425 files to /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/scripts/metastore
     [copy] Copying 74 files to /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/tmp/scripts/metastore/upgrade
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.6.1:testCompile (default-testCompile) @ hive-it-unit ---
[INFO] Compiling 1 source file to /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.21.0:test (default-test) @ hive-it-unit ---
[INFO]
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.hadoop.hive.ql.parse.TestReplAcidTablesBootstrapWithJsonMessage
[ERROR] Tests run: 5, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 1,225.713 s <<< FAILURE! - in org.apache.hadoop.hive.ql.parse.TestReplAcidTablesBootstrapWithJsonMessage
[ERROR] testBootstrapAcidTablesDuringIncrementalWithConcurrentWrites(org.apache.hadoop.hive.ql.parse.TestReplAcidTablesBootstrapWithJsonMessage)  Time elapsed: 680.912 s  <<< ERROR!
java.lang.IllegalStateException: Notification events are missing in the meta store.
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getNextNotification(HiveMetaStoreClient.java:3195)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:212)
    at com.sun.proxy.$Proxy58.getNextNotification(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.events.EventUtils$MSClientNotificationFetcher.getNextNotificationEvents(EventUtils.java:107)
    at org.apache.hadoop.hive.ql.metadata.events.EventUtils$NotificationEventIterator.fetchNextBatch(EventUtils.java:159)
    at org.apache.hadoop.hive.ql.metadata.events.EventUtils$NotificationEventIterator.hasNext(EventUtils.java:189)
    at org.apache.hadoop.hive.ql.exec.repl.ReplDumpTask.incrementalDump(ReplDumpTask.java:227)
    at org.apache.hadoop.hive.ql.exec.repl.ReplDumpTask.execute(ReplDumpTask.java:121)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:212)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:103)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2709)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2361)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2028)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1788)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1782)
    at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:162)
    at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:223)
    at org.apache.hadoop.hive.ql.parse.WarehouseInstance.run(WarehouseInstance.java:227)
    at org.apache.hadoop.hive.ql.parse.WarehouseInstance.dump(WarehouseInstance.java:270)
    at org.apache.hadoop.hive.ql.parse.WarehouseInstance.dump(WarehouseInstance.java:265)
    at org.apache.hadoop.hive.ql.parse.WarehouseInstance.dump(WarehouseInstance.java:277)
    at org.apache.hadoop.hive.ql.parse.TestReplicationScenariosAcidTablesBootstrap.testBootstrapAcidTablesDuringIncrementalWithConcurrentWrites(TestReplicationScenariosAcidTablesBootstrap.java:328)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:379)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:340)
    at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:125)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:413)
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR]   TestReplAcidTablesBootstrapWithJsonMessage>TestReplicationScenariosAcidTablesBootstrap.testBootstrapAcidTablesDuringIncrementalWithConcurrentWrites:328 » IllegalState
[INFO]
[ERROR] Tests run: 5, Failures: 0, Errors: 1, Skipped: 0
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20:51.954s
[INFO] Finished at: Wed May 15 08:22:34 UTC 2019
[INFO] Final Memory: 88M/461M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.21.0:test (default-test) on project hive-it-unit: There are test failures.
[ERROR]
[ERROR] Please refer to /home/hiveptest/34.66.219.41-hiveptest-2/apache-github-source-source/itests/hive-unit/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException