SPARK-2459: The user should be able to configure the resources used by the JDBC server

Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.1
    • Fix Version/s: 1.1.0
    • Component/s: SQL
    • Labels: None

    Description

      I'm trying out the JDBC server.

      I found that the JDBC server always occupies all cores in the cluster.

      The reason is that when creating the HiveContext, it doesn't set anything related to spark.cores.max or spark.executor.memory.

      SparkSQLEnv.scala, L41-L43: https://github.com/apache/spark/blob/8032fe2fae3ac40a02c6018c52e76584a14b3438/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala

      liancheng
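      To illustrate the problem, here is a minimal sketch (not a quote of the actual SparkSQLEnv.scala, and not the committed fix) of what a change could look like: forward resource-related settings such as spark.cores.max and spark.executor.memory into the SparkConf before the SparkContext is created, instead of building the conf with nothing but an application name. The SparkSQLEnvSketch object and the property-forwarding loop below are illustrative assumptions.

      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.sql.hive.HiveContext

      object SparkSQLEnvSketch {
        // Hypothetical sketch: let users cap the Thrift/JDBC server's resources by
        // forwarding resource settings from system properties into the SparkConf,
        // rather than creating the conf with only an application name (which, on a
        // standalone cluster, lets the server grab every available core).
        def init(): HiveContext = {
          val conf = new SparkConf()
            .setAppName(s"SparkSQL::${java.net.InetAddress.getLocalHost.getHostName}")

          // Forward user-provided resource limits if they are set.
          Seq("spark.cores.max", "spark.executor.memory").foreach { key =>
            sys.props.get(key).foreach(value => conf.set(key, value))
          }

          val sparkContext = new SparkContext(conf)
          new HiveContext(sparkContext)
        }
      }

      With a change along these lines, a user could cap the server at, say, spark.cores.max=4 and spark.executor.memory=2g instead of having it claim the whole cluster.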


            People

              Assignee: Cheng Lian
              Reporter: Nan Zhu
              Votes: 0
              Watchers: 3
