Spark / SPARK-3062

ShutdownHookManager is only available in Hadoop 2.x


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.0.2
    • Fix Version/s: 1.1.0
    • Component/s: SQL
    • Labels: None
    • Target Version/s:

Description

PR #1891 leverages ShutdownHookManager to avoid an IOException when event logging is enabled, but unfortunately ShutdownHookManager is only available in Hadoop 2.x, so compilation fails when Spark is built against Hadoop 1. A Hadoop-1-compatible alternative is sketched after the build output below.

      $ ./sbt/sbt -Phive-thriftserver
      ...
      [ERROR] /home/spark/software/source/compile/spark/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:30: object ShutdownHookManager is not a member of package org.apache.hadoop.util
      [ERROR] import org.apache.hadoop.util.ShutdownHookManager
      [ERROR]        ^
      [ERROR] /home/spark/software/source/compile/spark/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:125: not found: value ShutdownHookManager
      [ERROR]     ShutdownHookManager.get.addShutdownHook(
      [ERROR]     ^
      [WARNING] one warning found
      [ERROR] two errors found
      
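Since org.apache.hadoop.util.ShutdownHookManager does not exist in Hadoop 1.x, one way to keep this code compiling against both Hadoop lines is to register the hook through the plain JVM API instead. The following is only a minimal sketch of that approach; the ShutdownHookSketch object, the installShutdownHook helper, and the cleanup closure are hypothetical stand-ins for whatever the real hook at SparkSQLCLIDriver.scala:125 does, and this is not necessarily the fix that was merged.

      // Hypothetical sketch: replace the Hadoop 2.x-only ShutdownHookManager call
      // with java.lang.Runtime#addShutdownHook, which is available no matter which
      // Hadoop version is on the classpath.
      object ShutdownHookSketch {
        // Registers `cleanup` to run when the JVM shuts down.
        def installShutdownHook(cleanup: () => Unit): Unit = {
          Runtime.getRuntime.addShutdownHook(new Thread() {
            override def run(): Unit = cleanup()
          })
        }

        def main(args: Array[String]): Unit = {
          // Stand-in for the real cleanup, e.g. stopping the SQL CLI environment.
          installShutdownHook(() => println("Shutting down SQL CLI environment"))
          println("Doing work; the hook runs on JVM exit")
        }
      }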

People

    • Assignee: Unassigned
    • Reporter: Cheng Lian
    • Votes: 0
    • Watchers: 2
