Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: Spark
    • Labels: None

    Description

      For the time being, at least, we've decided to build the Spark client (see SPARK-3215) inside Hive. This task tracks merging the ongoing work into the Spark branch.

      Attachments

        1. HIVE-8528.1-spark.patch
          105 kB
          Xuefu Zhang
        2. HIVE-8528.1-spark-client.patch
          105 kB
          Marcelo Masiero Vanzin
        3. HIVE-8528.2-spark.patch
          105 kB
          Xuefu Zhang
        4. HIVE-8528.2-spark.patch
          105 kB
          Marcelo Masiero Vanzin
        5. HIVE-8528.3-spark.patch
          101 kB
          Marcelo Masiero Vanzin

        Issue Links

          • is depended upon by HIVE-8548
          • is related to HIVE-8574
          • relates to SPARK-3215

        Activity

          vanzin Marcelo Masiero Vanzin created issue -
          vanzin Marcelo Masiero Vanzin made changes -
          Field Original Value New Value
          Attachment 0001-HIVE-8528-Add-Spark-Client.patch [ 12675978 ]
          xuefuz Xuefu Zhang added a comment -

          Hi Marcelo Masiero Vanzin, thanks for working on this. For a patch to trigger the tests, it needs to be named something like HIVE-8528.1-spark.patch. Thanks.

          vanzin Marcelo Masiero Vanzin made changes -
          Attachment 0001-HIVE-8528-Add-Spark-Client.patch [ 12675978 ]
          vanzin Marcelo Masiero Vanzin made changes -
          Attachment HIVE-8528-spark-client.patch [ 12675984 ]
          xuefuz Xuefu Zhang made changes -
          Assignee Marcelo Vanzin [ vanzin ]
          vanzin Marcelo Masiero Vanzin made changes -
          Attachment HIVE-8528-spark-client.patch [ 12675984 ]
          vanzin Marcelo Masiero Vanzin made changes -
          Attachment HIVE-8528.1-spark-client.patch [ 12675990 ]
          xuefuz Xuefu Zhang added a comment -

          Uploading the same patch with a different name to trigger the test run. For information only: the patch name needs to end with spark.patch in order to make this happen.

          xuefuz Xuefu Zhang made changes -
          Attachment HIVE-8528.1-spark.patch [ 12676014 ]
          xuefuz Xuefu Zhang made changes -
          Status Open [ 1 ] Patch Available [ 10002 ]
          hiveqa Hive QA added a comment -

          Overall: -1 at least one test failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12676014/HIVE-8528.1-spark.patch

          ERROR: -1 due to 13 failed/errored test(s), 6782 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample_islocalmode_hook
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_tez_smb_1
          org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_vectorization_12
          org.apache.hadoop.hive.ql.security.TestExtendedAcls.testAlterPartition
          org.apache.hadoop.hive.ql.security.TestExtendedAcls.testInsert
          org.apache.hadoop.hive.ql.security.TestExtendedAcls.testLoad
          org.apache.hadoop.hive.ql.security.TestExtendedAcls.testLoadLocal
          org.apache.hadoop.hive.ql.security.TestExtendedAcls.testStaticPartition
          org.apache.hadoop.hive.ql.security.TestFolderPermissions.testAlterPartition
          org.apache.hadoop.hive.ql.security.TestFolderPermissions.testInsert
          org.apache.hadoop.hive.ql.security.TestFolderPermissions.testLoad
          org.apache.hadoop.hive.ql.security.TestFolderPermissions.testLoadLocal
          org.apache.hadoop.hive.ql.security.TestFolderPermissions.testStaticPartition
          

          Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/244/testReport
          Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/244/console
          Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-SPARK-Build-244/

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 13 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12676014 - PreCommit-HIVE-SPARK-Build

          xuefuz Xuefu Zhang added a comment -

          Marcelo Masiero Vanzin, I'm wondering if the test failures are caused by the JUnit version change introduced in the patch. Could you please take a look? I'm not sure it's necessary to upgrade JUnit. Please comment.

          vanzin Marcelo Masiero Vanzin added a comment -

          I upgraded because my tests use assertNotEquals, which was added in JUnit 4.11. I'll revert that change and update the test to see if it fixes the issues.
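
          As context for the comment above, a minimal sketch of the JUnit compatibility point: assertNotEquals only exists in JUnit 4.11 and later, so on an older JUnit the same check can be written with assertFalse. The class and values below are illustrative, not taken from the patch.

            import static org.junit.Assert.assertFalse;

            import org.junit.Test;

            public class NotEqualsCompatTest {
              @Test
              public void valuesShouldDiffer() {
                String first = "client-1";   // illustrative values only
                String second = "client-2";
                // JUnit 4.11+: assertNotEquals(first, second);
                // Pre-4.11 equivalent that avoids the dependency upgrade:
                assertFalse("values should differ", first.equals(second));
              }
            }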

          vanzin Marcelo Masiero Vanzin added a comment -

          Ah, I'll also have to update the code to match changes in the Spark API, so it will take a little longer...
          vanzin Marcelo Masiero Vanzin made changes -
          Attachment HIVE-8528.2-spark.patch [ 12676117 ]
          hiveqa Hive QA added a comment -

          Overall: -1 no tests executed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12676117/HIVE-8528.2-spark.patch

          Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/246/testReport
          Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/246/console
          Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-SPARK-Build-246/

          Messages:

          **** This message was trimmed, see log for full details ****
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:179:5: 
          Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:179:5: 
          Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:261:5: 
          Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:261:5: 
          Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:261:5: 
          Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:261:5: 
          Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
          
          As a result, alternative(s) 3 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_UNION KW_ALL" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:393:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:518:5: 
          Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
          
          As a result, alternative(s) 3 were disabled for that input
          [INFO] 
          [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-exec ---
          [INFO] 
          [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-exec ---
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 2 resources
          [INFO] Copying 3 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
          [INFO] Compiling 1967 source files to /data/hive-ptest/working/apache-svn-spark-source/ql/target/classes
          [INFO] -------------------------------------------------------------
          [WARNING] COMPILATION WARNING : 
          [INFO] -------------------------------------------------------------
          [WARNING] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java: Some input files use or override a deprecated API.
          [WARNING] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java: Recompile with -Xlint:deprecation for details.
          [WARNING] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java: Some input files use unchecked or unsafe operations.
          [WARNING] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java: Recompile with -Xlint:unchecked for details.
          [INFO] 4 warnings 
          [INFO] -------------------------------------------------------------
          [INFO] -------------------------------------------------------------
          [ERROR] COMPILATION ERROR : 
          [INFO] -------------------------------------------------------------
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkClient.java:[190,50] incompatible types
            required: org.apache.spark.FutureAction
            found:    org.apache.spark.api.java.JavaFutureAction<java.lang.Void>
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java:[46,53] cannot find symbol
            symbol:   method getStageId()
            location: class org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java:[57,70] cannot find symbol
            symbol:   method getPartitionId()
            location: class org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java:[63,62] cannot find symbol
            symbol:   method getPartitionId()
            location: class org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java:[30,56] cannot find symbol
            symbol:   method getPartitionId()
            location: variable taskContext of type org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java:[35,47] cannot find symbol
            symbol:   method getPartitionId()
            location: variable taskContext of type org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java:[35,77] cannot find symbol
            symbol:   method getAttemptId()
            location: variable taskContext of type org.apache.spark.TaskContext
          [INFO] 7 errors 
          [INFO] -------------------------------------------------------------
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive .............................................. SUCCESS [2.860s]
          [INFO] Hive Shims Common ................................. SUCCESS [2.529s]
          [INFO] Hive Shims 0.20 ................................... SUCCESS [0.913s]
          [INFO] Hive Shims Secure Common .......................... SUCCESS [1.574s]
          [INFO] Hive Shims 0.20S .................................. SUCCESS [0.819s]
          [INFO] Hive Shims 0.23 ................................... SUCCESS [2.144s]
          [INFO] Hive Shims ........................................ SUCCESS [0.321s]
          [INFO] Hive Common ....................................... SUCCESS [4.172s]
          [INFO] Hive Serde ........................................ SUCCESS [3.852s]
          [INFO] Hive Metastore .................................... SUCCESS [12.316s]
          [INFO] Hive Ant Utilities ................................ SUCCESS [0.538s]
          [INFO] Hive Query Language ............................... FAILURE [19.009s]
          [INFO] Hive Service ...................................... SKIPPED
          [INFO] Hive Accumulo Handler ............................. SKIPPED
          [INFO] Hive JDBC ......................................... SKIPPED
          [INFO] Hive Beeline ...................................... SKIPPED
          [INFO] Hive CLI .......................................... SKIPPED
          [INFO] Hive Contrib ...................................... SKIPPED
          [INFO] Hive HBase Handler ................................ SKIPPED
          [INFO] Hive HCatalog ..................................... SKIPPED
          [INFO] Hive HCatalog Core ................................ SKIPPED
          [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
          [INFO] Hive HCatalog Server Extensions ................... SKIPPED
          [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
          [INFO] Hive HCatalog Webhcat ............................. SKIPPED
          [INFO] Hive HCatalog Streaming ........................... SKIPPED
          [INFO] Hive HWI .......................................... SKIPPED
          [INFO] Hive ODBC ......................................... SKIPPED
          [INFO] Hive Shims Aggregator ............................. SKIPPED
          [INFO] Spark Remote Client ............................... SKIPPED
          [INFO] Hive TestUtils .................................... SKIPPED
          [INFO] Hive Packaging .................................... SKIPPED
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 51.985s
          [INFO] Finished at: Tue Oct 21 13:15:36 EDT 2014
          [INFO] Final Memory: 90M/1067M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure: Compilation failure:
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkClient.java:[190,50] incompatible types
          [ERROR] required: org.apache.spark.FutureAction
          [ERROR] found:    org.apache.spark.api.java.JavaFutureAction<java.lang.Void>
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java:[46,53] cannot find symbol
          [ERROR] symbol:   method getStageId()
          [ERROR] location: class org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java:[57,70] cannot find symbol
          [ERROR] symbol:   method getPartitionId()
          [ERROR] location: class org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java:[63,62] cannot find symbol
          [ERROR] symbol:   method getPartitionId()
          [ERROR] location: class org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java:[30,56] cannot find symbol
          [ERROR] symbol:   method getPartitionId()
          [ERROR] location: variable taskContext of type org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java:[35,47] cannot find symbol
          [ERROR] symbol:   method getPartitionId()
          [ERROR] location: variable taskContext of type org.apache.spark.TaskContext
          [ERROR] /data/hive-ptest/working/apache-svn-spark-source/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java:[35,77] cannot find symbol
          [ERROR] symbol:   method getAttemptId()
          [ERROR] location: variable taskContext of type org.apache.spark.TaskContext
          [ERROR] -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-exec
          

          This message is automatically generated.

          ATTACHMENT ID: 12676117 - PreCommit-HIVE-SPARK-Build


          vanzin Marcelo Masiero Vanzin added a comment -

          Hmmm, it seems the rest of the Spark-related code in Hive needs to be updated to match the recent changes in Spark...
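
          As context for the comment above, a hedged sketch of the kind of update the compile errors call for: the old code used getStageId()/getPartitionId()/getAttemptId() on org.apache.spark.TaskContext, which the newer Spark API no longer provides; the accessors below are the likely substitutes, and the async result type would likewise change to org.apache.spark.api.java.JavaFutureAction<Void>. The exact names should be confirmed against the Spark version in use.

            import org.apache.spark.TaskContext;

            final class TaskInfoSketch {
              private TaskInfoSketch() {}

              // Replaces calls such as taskContext.getStageId() and
              // taskContext.getPartitionId(), which no longer compile.
              static String describe(TaskContext taskContext) {
                return "stage " + taskContext.stageId()
                    + ", partition " + taskContext.partitionId()
                    + ", attempt " + taskContext.attemptId();
              }
            }
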
          xuefuz Xuefu Zhang added a comment -

          Attaching the same patch (#2) to retrigger the test run.

          xuefuz Xuefu Zhang made changes -
          Attachment HIVE-8528.2-spark.patch [ 12676126 ]
          hiveqa Hive QA added a comment -

          Overall: -1 at least one test failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12676126/HIVE-8528.2-spark.patch

          ERROR: -1 due to 2 failed/errored test(s), 6782 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample_islocalmode_hook
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_tez_smb_1
          

          Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/247/testReport
          Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/247/console
          Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-SPARK-Build-247/

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 2 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12676126 - PreCommit-HIVE-SPARK-Build

          xuefuz Xuefu Zhang added a comment -

          Marcelo Masiero Vanzin, it seems the tests passed (the remaining failures are known). Awesome! Could you please create a Review Board entry for easier code review? Thanks.

          vanzin Marcelo Masiero Vanzin added a comment -

          https://reviews.apache.org/r/26993/
          xuefuz Xuefu Zhang added a comment -

          With my limited knowledge of Spark, Akka, etc., the patch looks good. I only have a few comments about code styling, logging, etc.

          Chengxiang Li and Rui Li, could you please take a look at the patch? In particular, Chengxiang Li will do the integration, so please make sure it meets our needs.

          For any deficiencies found in functionality, we can create follow-up JIRAs to track them.

          chengxiang li Chengxiang Li added a comment -

          Yes, Xuefu Zhang, I would like to do that. Thanks to Marcelo Masiero Vanzin for the great work.

          xuefuz Xuefu Zhang made changes -
          Link This issue is depended upon by HIVE-8548 [ HIVE-8548 ]
          lirui Rui Li added a comment -

          Forgive my ignorance, just some high-level questions:

          • Why would we want this remote Spark client?
          • With the change, it seems that instead of creating a SparkContext locally, the user now creates a SparkClient, which creates the SparkContext in a separate process (within RemoteDriver). Would this remote SparkContext be shared somehow?
          • Do we plan to make this the default mode for Hive on Spark?
          xuefuz Xuefu Zhang added a comment -

          Hi Rui Li, SPARK-3215 has the relevant info and a document. It should help answer some, if not all, of your questions. Please let us know if you have additional questions after reading those. Thanks.

          lirui Rui Li added a comment -

          Xuefu Zhang - thanks for pointing me to the detailed info. I guess the main use case for Hive is in HiveServer, or when the client has limited resources?

          xuefuz Xuefu Zhang added a comment -

          It's mainly for HiveServer2, where there are concurrent client sessions, each with its own SparkContext object. SparkContext is heavyweight, putting memory pressure on HiveServer2 in proportion to the number of active client sessions. With a remote SparkContext, that pressure is transferred to a remote process.

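          As context for the explanation above, a purely conceptual sketch of the design; RemoteSparkSession and RemoteDriver are hypothetical names, not the actual spark-client API.

            import java.io.IOException;

            // Hypothetical per-session handle kept inside HiveServer2. The
            // heavyweight SparkContext lives in the separate driver process,
            // so per-session memory in HS2 stays small.
            public class RemoteSparkSession implements AutoCloseable {
              private final Process driver;

              private RemoteSparkSession(Process driver) {
                this.driver = driver;
              }

              public static RemoteSparkSession open() throws IOException {
                // Launch a separate JVM (the "RemoteDriver") that owns the SparkContext.
                Process p = new ProcessBuilder("java", "RemoteDriver").start();
                return new RemoteSparkSession(p);
              }

              @Override
              public void close() {
                driver.destroy(); // the heavy state dies with the process
              }
            }
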
          lirui Rui Li added a comment -

          Yep, I see. Thanks for explaining.

          chengxiang li Chengxiang Li added a comment -

          I'm +1 on this patch on RB (it doesn't seem to be updated here yet), with one addition: we need flow control for Metrics, otherwise it would eventually lead to a memory leak, although I think that could be tracked in an independent ticket.

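          As context for the flow-control concern above, a minimal sketch of one way to bound metrics buffering so a backlog cannot grow without limit; the class and drop policy are illustrative assumptions, not the patch's design.

            import java.util.concurrent.ArrayBlockingQueue;
            import java.util.concurrent.BlockingQueue;

            final class BoundedMetricsBuffer<T> {
              private final BlockingQueue<T> queue = new ArrayBlockingQueue<>(1024);

              // Offer a metrics event; shed the oldest entry when the buffer is full.
              void submit(T event) {
                while (!queue.offer(event)) {
                  queue.poll(); // drop old data instead of growing without bound
                }
              }

              T take() throws InterruptedException {
                return queue.take();
              }
            }
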
          vanzin Marcelo Masiero Vanzin made changes -
          Attachment HIVE-8528.3-spark.patch [ 12676621 ]

          vanzin Marcelo Masiero Vanzin added a comment -

          Yeah, the metrics code in general is a little hacky and sort of ugly to use. I need to spend more time thinking about it.
          xuefuz Xuefu Zhang added a comment -

          Thanks to Chengxiang Li for the review. Marcelo Masiero Vanzin, would you mind creating a JIRA to track the metrics flow control issue? Thanks.

          hiveqa Hive QA added a comment -

          Overall: -1 at least one test failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12676621/HIVE-8528.3-spark.patch

          ERROR: -1 due to 2 failed/errored test(s), 6782 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample_islocalmode_hook
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_tez_smb_1
          

          Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/255/testReport
          Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/255/console
          Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-SPARK-Build-255/

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 2 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12676621 - PreCommit-HIVE-SPARK-Build

          xuefuz Xuefu Zhang added a comment -

          +1

          xuefuz Xuefu Zhang added a comment -

          Patch committed to the Spark branch. Thanks to Marcelo for the contribution.

          xuefuz Xuefu Zhang made changes -
          Fix Version/s spark-branch [ 12327352 ]
          Resolution Fixed [ 1 ]
          Status Patch Available [ 10002 ] Resolved [ 5 ]
          xuefuz Xuefu Zhang made changes -
          Link This issue is related to HIVE-8574 [ HIVE-8574 ]
          leftyl Lefty Leverenz made changes -
          Link This issue relates to SPARK-3215 [ SPARK-3215 ]
          leftyl Lefty Leverenz added a comment -

          Does this need any documentation, or is it already covered in the PDF attached to SPARK-3215?


          vanzin Marcelo Masiero Vanzin added a comment -

          Hi Lefty, what kind of documentation are you looking for? This is, at the moment, targeted at internal Hive use only, so having nice end-user documentation is not currently a goal. (In fact, I should probably go and add those annotations to the classes.)
          leftyl Lefty Leverenz made changes -
          Labels TODOC
          leftyl Lefty Leverenz made changes -
          Labels TODOC
          xuefuz Xuefu Zhang added a comment -

          Hi [~leftylev], as explained by Marcelo, it's our goal to hide this completely from the user, and thus the user doc doesn't need to cover it. Thanks for asking though.

          leftyl Lefty Leverenz added a comment -

          Well, if it needs any setup or configuration, you could put that in the "Hive on Spark: Getting Started" doc. Usage notes too. If it will need documentation later on, you can add the label TODOC-SPARK. (Oops, I added a TODOC label by accident while checking the name of the Spark label. Gone now.) A release note could hold some doc notes, but it won't get published from the branch. Of course these comments can hold doc notes too.

          I just want to make sure docs don't get forgotten, but maybe this doesn't need any docs. Thanks.

          leftyl Lefty Leverenz added a comment -

          Oh, messages crossing paths – thanks Xuefu Zhang & Marcelo Masiero Vanzin.


          vanzin Marcelo Masiero Vanzin added a comment -

          Actually, Lefty, that's a good point; this might need some end-user documentation, since the recommended setup is to have a full Spark installation available on the HS2 node. I don't know if the plan is to somehow package that with HS2 or leave it as a configuration step.
          xuefuz Xuefu Zhang added a comment -

          Marcelo Masiero Vanzin, I thought the Spark installation on the HS2 host was optional. Let me know if this has changed.


          vanzin Marcelo Masiero Vanzin added a comment -

          It is optional, but I don't really think we should encourage that. A full install should be the recommended setup.
          xuefuz Xuefu Zhang added a comment -

          Got it. Thanks for the clarification.

          xuefuz Xuefu Zhang made changes -
          Labels TODOC-SPARK
          leftyl Lefty Leverenz added a comment -

          Doc note: Szehon Ho added a "Remote Spark Driver" section to Configuration Properties with a nice overview. (Thanks, Szehon.)

          xuefuz Xuefu Zhang made changes -
          Fix Version/s 1.1.0 [ 12329363 ]
          Fix Version/s spark-branch [ 12327352 ]
          leftyl Lefty Leverenz made changes -
          Labels TODOC-SPARK
          dfoulks Drew Foulks made changes -
          Workflow no-reopen-closed, patch-avail [ 12898007 ] Hive - no-reopen-closed, patch-avail [ 14127369 ]

          People

            Assignee: vanzin Marcelo Masiero Vanzin
            Reporter: vanzin Marcelo Masiero Vanzin
            Votes: 0
