Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
Description
Running the following on MiniTez
set hive.mapred.mode=nonstrict;
SET hive.vectorized.execution.enabled=true;
SET hive.exec.orc.default.buffer.size=32768;
SET hive.exec.orc.default.row.index.stride=1000;
SET hive.optimize.index.filter=true;
set hive.fetch.task.conversion=none;
set hive.exec.dynamic.partition.mode=nonstrict;

DROP TABLE orc_a;
DROP TABLE orc_b;

CREATE TABLE orc_a (id bigint, cdouble double) partitioned by (y int, q smallint)
  CLUSTERED BY (id) SORTED BY (id) INTO 2 BUCKETS stored as orc;
CREATE TABLE orc_b (id bigint, cfloat float)
  CLUSTERED BY (id) SORTED BY (id) INTO 2 BUCKETS stored as orc;

insert into table orc_a partition (y=2000, q)
  select cbigint, cdouble, csmallint % 10 from alltypesorc
  where cbigint is not null and csmallint > 0 order by cbigint asc;
insert into table orc_a partition (y=2001, q)
  select cbigint, cdouble, csmallint % 10 from alltypesorc
  where cbigint is not null and csmallint > 0 order by cbigint asc;
insert into table orc_b
  select cbigint, cfloat from alltypesorc
  where cbigint is not null and csmallint > 0 order by cbigint asc limit 200;

set hive.cbo.enable=false;
select y,q,count(*) from orc_a a join orc_b b on a.id=b.id group by y,q;

set hive.enforce.sortmergebucketmapjoin=false;
set hive.optimize.bucketmapjoin=true;
set hive.optimize.bucketmapjoin.sortedmerge=true;
set hive.auto.convert.sortmerge.join=true;
set hive.auto.convert.join=true;
set hive.auto.convert.join.noconditionaltask.size=10;

explain select y,q,count(*) from orc_a a join orc_b b on a.id=b.id group by y,q;
select y,q,count(*) from orc_a a join orc_b b on a.id=b.id group by y,q;

DROP TABLE orc_a;
DROP TABLE orc_b;
The two selects produce different results; the SMB one looks incorrect. cc djaiswal hagleitn
Attachments
- HIVE-16965.8.patch - 7 kB - Deepak Jaiswal
- HIVE-16965.7.patch - 7 kB - Deepak Jaiswal
- HIVE-16965.6.patch - 7 kB - Deepak Jaiswal
- HIVE-16965.5.patch - 6 kB - Deepak Jaiswal
- HIVE-16965.4.patch - 6 kB - Deepak Jaiswal
- HIVE-16965.3.patch - 36 kB - Deepak Jaiswal
- HIVE-16965.2.patch - 50 kB - Deepak Jaiswal
- HIVE-16965.1.patch - 4 kB - Deepak Jaiswal
Issue Links
- is duplicated by
  - HIVE-16791 Tez engine giving inaccurate results on SMB Map joins while map-join and shuffle join gets correct results (Resolved)
- relates to
  - HIVE-16981 hive.optimize.bucketingsorting should compare the schema before removing RS (Closed)
  - HIVE-16761 LLAP IO: SMB joins fail elevator (Closed)
Activity
The map keyed by KV reader looks a little suspicious: what are the hashCode/equals of that key? Are they valid, and acceptable in terms of perf? Should it be an identity hash map?
(HiveInputFormat.HiveInputSplit) splits.get(0) assumes a single element; add an assert?
Why is the path updated in the IO context if we already set a specific one per input? Perhaps a more detailed comment could be added.
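For illustration, here is a minimal sketch of the identity-map idea raised above: key the per-reader state by reference identity instead of by hashCode()/equals(), so two readers that happen to look "equal" never collide. ReaderState, stateFor, and the example paths are hypothetical stand-ins, not the actual Hive classes.
{code:java}
import java.util.IdentityHashMap;
import java.util.Map;

public class ReaderStateRegistry {

  /** Hypothetical per-reader state, standing in for the per-input IO bookkeeping. */
  static final class ReaderState {
    final String inputPath;
    ReaderState(String inputPath) { this.inputPath = inputPath; }
  }

  // IdentityHashMap compares keys with ==, so correctness does not depend on the
  // reader's hashCode()/equals(), and lookups stay O(1) no matter how the readers
  // compare to each other.
  private final Map<Object, ReaderState> stateByReader = new IdentityHashMap<>();

  ReaderState stateFor(Object reader, String inputPath) {
    return stateByReader.computeIfAbsent(reader, r -> new ReaderState(inputPath));
  }

  public static void main(String[] args) {
    ReaderStateRegistry registry = new ReaderStateRegistry();
    Object readerA = new Object(); // stands in for a KV reader over one input
    Object readerB = new Object(); // stands in for a KV reader over the other input
    System.out.println(registry.stateFor(readerA, "/warehouse/orc_a/y=2000/q=1").inputPath);
    System.out.println(registry.stateFor(readerB, "/warehouse/orc_b").inputPath);
  }
}
{code}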
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12878673/HIVE-16965.2.patch
SUCCESS: +1 due to 1 test(s) being added or modified.
ERROR: -1 due to 10 failed/errored test(s), 11093 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[llap_smb] (batchId=144)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[smb_join1] (batchId=157)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[subquery_scalar] (batchId=153)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_vectorized_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=235)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23] (batchId=235)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=179)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/6123/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/6123/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-6123/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 10 tests failed
This message is automatically generated.
ATTACHMENT ID: 12878673 - PreCommit-HIVE-Build
sershe Thanks for the comments.
Somehow lost the assert while making the code pretty. A patch applying all your comments is coming shortly.
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12878695/HIVE-16965.3.patch
SUCCESS: +1 due to 1 test(s) being added or modified.
ERROR: -1 due to 10 failed/errored test(s), 11098 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[llap_smb] (batchId=144)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[auto_smb_mapjoin_14] (batchId=157)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[smb_mapjoin_17] (batchId=145)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[subquery_scalar] (batchId=153)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_vectorized_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=235)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=179)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/6126/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/6126/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-6126/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 10 tests failed
This message is automatically generated.
ATTACHMENT ID: 12878695 - PreCommit-HIVE-Build
Use llap_smb.q as the main test instead of smb_join1.q.
Remove the assert, based on the failing tests.
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12879247/HIVE-16965.7.patch
ERROR: -1 due to build exiting with an error
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/6159/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/6159/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-6159/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ date '+%Y-%m-%d %T.%3N'
2017-07-27 23:55:16.127
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-6159/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-07-27 23:55:16.129
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   e15b2de..61d8b7c  master -> origin/master
+ git reset --hard HEAD
HEAD is now at e15b2de HIVE-17168 Create separate module for stand alone metastore (Alan Gates, reviewed by Vihang Karajgaonkar)
+ git clean -f -d
Removing ql/src/java/org/apache/hadoop/hive/ql/optimizer/SparkRemoveDynamicPruning.java
Removing ql/src/test/queries/clientpositive/spark_dynamic_partition_pruning_mapjoin_only.q
Removing ql/src/test/results/clientpositive/spark/spark_dynamic_partition_pruning_mapjoin_only.q.out
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 1 commit, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at 61d8b7c HIVE-17087: Remove unnecessary HoS DPP trees during map-join conversion (Sahil Takiar, reviewed by Liyun Zhang, Rui Li)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-07-27 23:55:21.989
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/MapRecordSource.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/tools/KeyValueInputMerger.java: No such file or directory
error: a/ql/src/test/results/clientpositive/llap/llap_smb.q.out: No such file or directory
The patch does not appear to apply with p0, p1, or p2
+ exit 1
This message is automatically generated.
ATTACHMENT ID: 12879247 - PreCommit-HIVE-Build
The last patch mysteriously failed in the build. Recreated it after a code refresh.
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12879285/HIVE-16965.8.patch
ERROR: -1 due to no test(s) being added or modified.
ERROR: -1 due to 10 failed/errored test(s), 11013 tests executed
Failed tests:
TestPerfCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=235)
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[llap_smb] (batchId=144)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_vectorized_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2] (batchId=100)
org.apache.hadoop.hive.metastore.TestHiveMetaStoreStatsMerge.testStatsMerge (batchId=206)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=179)
org.apache.hive.spark.client.TestSparkClient.testJobSubmission (batchId=288)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/6165/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/6165/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-6165/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 10 tests failed
This message is automatically generated.
ATTACHMENT ID: 12879285 - PreCommit-HIVE-Build
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12879285/HIVE-16965.8.patch
ERROR: -1 due to no test(s) being added or modified.
ERROR: -1 due to 8 failed/errored test(s), 11013 tests executed
Failed tests:
TestPerfCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=235)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_vectorized_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3] (batchId=99)
org.apache.hadoop.hive.metastore.TestHiveMetaStoreStatsMerge.testStatsMerge (batchId=206)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=179)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/6167/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/6167/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-6167/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 8 tests failed
This message is automatically generated.
ATTACHMENT ID: 12879285 - PreCommit-HIVE-Build
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12879285/HIVE-16965.8.patch
ERROR: -1 due to no test(s) being added or modified.
ERROR: -1 due to 9 failed/errored test(s), 11013 tests executed
Failed tests:
TestPerfCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=235)
org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[materialized_view_create_rewrite] (batchId=240)
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_vectorized_dynamic_partition_pruning] (batchId=168)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2] (batchId=100)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3] (batchId=99)
org.apache.hadoop.hive.metastore.TestHiveMetaStoreStatsMerge.testStatsMerge (batchId=206)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=179)
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=179)
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/6172/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/6172/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-6172/
Messages:
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 9 tests failed
This message is automatically generated.
ATTACHMENT ID: 12879285 - PreCommit-HIVE-Build
Initial patch.
Fixes the algorithm to provide the correct IOContext for a given input.
In SMB joins the inputs keep switching, unlike traditional joins where inputs are read sequentially.
gopalv jdere hagleitn sershe can you please review?
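To make the per-input idea above concrete, here is a minimal, hypothetical sketch: each input keeps its own context, and the active one is swapped in whenever the merge join switches sides, instead of a single shared context that a sequential reader would keep overwriting. SmbJoinDriver, InputContext, and the example paths are illustrative names, not the actual Hive classes.
{code:java}
import java.util.HashMap;
import java.util.Map;

public class SmbJoinDriver {

  /** Hypothetical stand-in for the per-input IOContext (current path plus bookkeeping). */
  static final class InputContext {
    final String inputPath;
    long rowsRead;
    InputContext(String inputPath) { this.inputPath = inputPath; }
  }

  private final Map<String, InputContext> contextsByInput = new HashMap<>();
  private InputContext active;

  /** Called whenever the merge join switches to reading the other input. */
  void switchTo(String inputName, String inputPath) {
    active = contextsByInput.computeIfAbsent(inputName, n -> new InputContext(inputPath));
  }

  void readOneRow() {
    // Bookkeeping lands in the context of the input actually being read,
    // not in whichever context happened to be set up last.
    active.rowsRead++;
  }

  public static void main(String[] args) {
    SmbJoinDriver join = new SmbJoinDriver();
    // An SMB merge join alternates between its inputs depending on key order,
    // unlike a shuffle join that drains each input sequentially.
    join.switchTo("a", "/warehouse/orc_a/y=2000/q=1");
    join.readOneRow();
    join.switchTo("b", "/warehouse/orc_b");
    join.readOneRow();
    join.switchTo("a", "/warehouse/orc_a/y=2000/q=1");
    join.readOneRow();
    join.contextsByInput.forEach((name, ctx) ->
        System.out.println(name + ": " + ctx.rowsRead + " rows from " + ctx.inputPath));
  }
}
{code}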