Details
- Type: Bug
- Priority: Major
- Status: Resolved
- Resolution: Fixed
Attachments
- HIVE-16122.patch (0.7 kB, Slim Bouguerra)
- HIVE-16112.5.patch (4 kB, Slim Bouguerra)
- HIVE-16112.4.patch (4 kB, Slim Bouguerra)
- HIVE-16112.3.patch (4 kB, Slim Bouguerra)
- HIVE-16112.2.patch (0.7 kB, Slim Bouguerra)
Activity
bslim, I have been taking a look, and we pass the hosts to the constructor of the superclass, which should set the locations. Could you share the stack trace?
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12856294/HIVE-16122.patch
ERROR: -1 due to build exiting with an error
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3966/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3966/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3966/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ date '+%Y-%m-%d %T.%3N'
2017-03-06 17:56:11.275
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-3966/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-03-06 17:56:11.277
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 4904ab7 HIVE-16034: Hive/Druid integration: Fix type inference for Decimal DruidOutputFormat (Jesus Camacho Rodriguez, reviewed by Ashutosh Chauhan)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 4904ab7 HIVE-16034: Hive/Druid integration: Fix type inference for Decimal DruidOutputFormat (Jesus Camacho Rodriguez, reviewed by Ashutosh Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-03-06 17:56:12.343
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/druid-handler/src/java/org/apache/hadoop/hive/druid/io/HiveDruidSplit.java: No such file or directory
The patch does not appear to apply with p0, p1, or p2
+ exit 1
This message is automatically generated.
ATTACHMENT ID: 12856294 - PreCommit-HIVE-Build
Stack trace:
Caused by: java.io.IOException: java.lang.NullPointerException
    at org.apache.hadoop.hive.druid.serde.DruidQueryRecordReader.initialize(DruidQueryRecordReader.java:101)
    at org.apache.hadoop.hive.druid.serde.DruidGroupByQueryRecordReader.initialize(DruidGroupByQueryRecordReader.java:55)
    at org.apache.hadoop.hive.druid.io.DruidQueryBasedInputFormat.getRecordReader(DruidQueryBasedInputFormat.java:484)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:365)
    at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.initNextRecordReader(TezGroupedSplitsInputFormat.java:203)
    at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.<init>(TezGroupedSplitsInputFormat.java:145)
    at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat.getRecordReader(TezGroupedSplitsInputFormat.java:111)
    at org.apache.tez.mapreduce.lib.MRReaderMapred.setupOldRecordReader(MRReaderMapred.java:157)
    at org.apache.tez.mapreduce.lib.MRReaderMapred.setSplit(MRReaderMapred.java:83)
    at org.apache.tez.mapreduce.input.MRInput.initFromEventInternal(MRInput.java:700)
    at org.apache.tez.mapreduce.input.MRInput.initFromEvent(MRInput.java:659)
    at org.apache.tez.mapreduce.input.MRInputLegacy.checkAndAwaitRecordReaderInitialization(MRInputLegacy.java:150)
    at org.apache.tez.mapreduce.input.MRInputLegacy.init(MRInputLegacy.java:114)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.getMRInput(MapRecordProcessor.java:529)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:173)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:184)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:168)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
    at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
    at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
    at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
    at org.apache.hadoop.hive.druid.serde.DruidQueryRecordReader.initialize(DruidQueryRecordReader.java:104)
    at org.apache.hadoop.hive.druid.serde.DruidGroupByQueryRecordReader.initialize(DruidGroupByQueryRecordReader.java:55)
    at org.apache.hadoop.hive.druid.io.DruidQueryBasedInputFormat.getRecordReader(DruidQueryBasedInputFormat.java:484)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:365)
jcamachorodriguez, I have added the stack trace, and it seems like the issue is still present.
bslim, I think the problem is in the read and write methods in HiveDruidSplit; I forgot to update them accordingly when I added the hosts field to the object, which should be read/written.
Or maybe not, as the super method should take care of that...
I think hosts itself is empty; it is missing the address of the broker when the query is not a select query.
jcamachorodriguez, I don't think the super method is doing the correct read/write of hosts.
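A minimal sketch of the failure mode being described here, using hypothetical class and field names (not the actual HiveDruidSplit source): Hadoop re-creates a split on the task side via its no-arg constructor plus readFields(), so a hosts array passed only to a constructor is lost unless write()/readFields() also serialize it.

```java
import java.io.*;

// Hypothetical sketch: state set only in the constructor does not survive
// the Writable-style round trip the framework performs when shipping splits.
class SketchSplit {
    private String queryId = "";
    private String[] hosts;  // set by the constructor, never serialized

    SketchSplit() { }        // the framework uses this + readFields()

    SketchSplit(String queryId, String[] hosts) {
        this.queryId = queryId;
        this.hosts = hosts;
    }

    // Bug being discussed: hosts is not written...
    void write(DataOutput out) throws IOException {
        out.writeUTF(queryId);
    }

    // ...and therefore never read back; hosts stays null on the task side.
    void readFields(DataInput in) throws IOException {
        queryId = in.readUTF();
    }

    String[] getLocations() {
        return hosts == null ? new String[0] : hosts;
    }

    public static void main(String[] args) throws IOException {
        SketchSplit original = new SketchSplit("q1", new String[]{"broker-host:8082"});

        // Simulate the framework shipping the split to a task.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        original.write(new DataOutputStream(buf));
        SketchSplit onTask = new SketchSplit();
        onTask.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));

        // The deserialized copy has lost its locations.
        System.out.println(original.getLocations().length); // 1
        System.out.println(onTask.getLocations().length);   // 0
    }
}
```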
https://gist.github.com/b-slim/cb52c9bd4ebf7154ef20ac7d39a3b410
The new code in read/write looks good.
However, about this:
public String[] getLocations() throws IOException {
  return hosts == null ? new String[0] : hosts;
}
We should not hit hosts == null anymore, right? I am afraid something like this will mask possible errors.
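A hedged sketch of the alternative being suggested, with illustrative names that are not the actual patch: serialize the hosts array explicitly in write()/readFields(), and have getLocations() fail fast on a null rather than masking it with an empty array.

```java
import java.io.*;
import java.util.Objects;

// Hypothetical sketch: explicit serialization of hosts plus a fail-fast
// getLocations(). Class and method shapes are illustrative only.
class FixedSketchSplit {
    private String[] hosts;

    FixedSketchSplit() { }   // framework-style no-arg constructor

    FixedSketchSplit(String[] hosts) {
        this.hosts = Objects.requireNonNull(hosts, "hosts");
    }

    // Write the array length, then each host.
    void write(DataOutput out) throws IOException {
        out.writeInt(hosts.length);
        for (String host : hosts) {
            out.writeUTF(host);
        }
    }

    // Read back exactly what write() produced.
    void readFields(DataInput in) throws IOException {
        int n = in.readInt();
        hosts = new String[n];
        for (int i = 0; i < n; i++) {
            hosts[i] = in.readUTF();
        }
    }

    // Fail fast: a null here indicates a serialization bug, so surface it
    // instead of silently returning an empty array that hides the error.
    String[] getLocations() throws IOException {
        if (hosts == null) {
            throw new IOException("hosts was not initialized; split not deserialized?");
        }
        return hosts;
    }

    public static void main(String[] args) throws IOException {
        FixedSketchSplit original = new FixedSketchSplit(new String[]{"broker-host:8082"});
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        original.write(new DataOutputStream(buf));
        FixedSketchSplit onTask = new FixedSketchSplit();
        onTask.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
        System.out.println(onTask.getLocations()[0]); // broker-host:8082
    }
}
```

With this shape, the hosts survive the round trip, and the null branch never hides a missing value the way the empty-array fallback could.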
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12856652/HIVE-16112.5.patch
SUCCESS: +1 due to 1 test(s) being added or modified.
ERROR: -1 due to 2 failed/errored test(s), 10330 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_text_vec_table] (batchId=147)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[vector_between_in] (batchId=119)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/4000/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/4000/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-4000/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 2 tests failed
ATTACHMENT ID: 12856652 - PreCommit-HIVE-Build
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12856652/HIVE-16112.5.patch
SUCCESS: +1 due to 1 test(s) being added or modified.
ERROR: -1 due to 6 failed/errored test(s), 10330 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_text_vec_table] (batchId=147)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=224)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23] (batchId=224)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[vector_between_in] (batchId=119)
org.apache.hive.hcatalog.pig.TestRCFileHCatStorer.testWriteDecimalXY (batchId=173)
org.apache.hive.hcatalog.pig.TestTextFileHCatStorer.testWriteSmallint (batchId=173)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/4002/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/4002/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-4002/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 6 tests failed
ATTACHMENT ID: 12856652 - PreCommit-HIVE-Build
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12856652/HIVE-16112.5.patch
SUCCESS: +1 due to 1 test(s) being added or modified.
ERROR: -1 due to 4 failed/errored test(s), 10330 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_text_vec_table] (batchId=147)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=224)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[vector_between_in] (batchId=119)
org.apache.hive.service.server.TestHS2HttpServer.testContextRootUrlRewrite (batchId=186)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/4003/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/4003/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-4003/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 4 tests failed
ATTACHMENT ID: 12856652 - PreCommit-HIVE-Build
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12856652/HIVE-16112.5.patch
SUCCESS: +1 due to 1 test(s) being added or modified.
ERROR: -1 due to 2 failed/errored test(s), 10331 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_text_vec_table] (batchId=147)
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[vector_between_in] (batchId=119)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/4004/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/4004/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-4004/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 2 tests failed
ATTACHMENT ID: 12856652 - PreCommit-HIVE-Build
ashutoshc and jcamachorodriguez, can you please take a look at this?