SPARK-32108: Silent mode of spark-sql is broken


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 3.0.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      1. I downloaded the recent Spark 3.0 release from http://spark.apache.org/downloads.html
      2. Ran bin/spark-sql -S; it printed a lot of INFO messages:

      ➜  ~ ./spark-3.0/bin/spark-sql -S
      20/06/26 20:43:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      log4j:WARN No appenders could be found for logger (org.apache.hadoop.hive.conf.HiveConf).
      log4j:WARN Please initialize the log4j system properly.
      log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
      Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
      20/06/26 20:43:39 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
      20/06/26 20:43:39 INFO SharedState: Warehouse path is '/user/hive/warehouse'.
      20/06/26 20:43:39 INFO SessionState: Created HDFS directory: /tmp/hive/maximgekk/a47e882c-86a3-42b9-b43f-9dab0dd8492a
      20/06/26 20:43:39 INFO SessionState: Created local directory: /var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/maximgekk/a47e882c-86a3-42b9-b43f-9dab0dd8492a
      20/06/26 20:43:39 INFO SessionState: Created HDFS directory: /tmp/hive/maximgekk/a47e882c-86a3-42b9-b43f-9dab0dd8492a/_tmp_space.db
      20/06/26 20:43:39 INFO SparkContext: Running Spark version 3.0.0
      20/06/26 20:43:39 INFO ResourceUtils: ==============================================================
      20/06/26 20:43:39 INFO ResourceUtils: Resources for spark.driver:
      
      20/06/26 20:43:39 INFO ResourceUtils: ==============================================================
      20/06/26 20:43:39 INFO SparkContext: Submitted application: SparkSQL::192.168.1.78
      20/06/26 20:43:39 INFO SecurityManager: Changing view acls to: maximgekk
      20/06/26 20:43:39 INFO SecurityManager: Changing modify acls to: maximgekk
      20/06/26 20:43:39 INFO SecurityManager: Changing view acls groups to:
      20/06/26 20:43:39 INFO SecurityManager: Changing modify acls groups to:
      20/06/26 20:43:39 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(maximgekk); groups with view permissions: Set(); users  with modify permissions: Set(maximgekk); groups with modify permissions: Set()
      20/06/26 20:43:39 INFO Utils: Successfully started service 'sparkDriver' on port 59414.
      20/06/26 20:43:39 INFO SparkEnv: Registering MapOutputTracker
      20/06/26 20:43:39 INFO SparkEnv: Registering BlockManagerMaster
      20/06/26 20:43:39 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
      20/06/26 20:43:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
      20/06/26 20:43:39 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
      20/06/26 20:43:39 INFO DiskBlockManager: Created local directory at /private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/blockmgr-c1d041ad-dd46-4d11-bbd0-e8ba27d3bf69
      20/06/26 20:43:39 INFO MemoryStore: MemoryStore started with capacity 408.9 MiB
      20/06/26 20:43:39 INFO SparkEnv: Registering OutputCommitCoordinator
      20/06/26 20:43:40 INFO Utils: Successfully started service 'SparkUI' on port 4040.
      20/06/26 20:43:40 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.78:4040
      20/06/26 20:43:40 INFO Executor: Starting executor ID driver on host 192.168.1.78
      20/06/26 20:43:40 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 59415.
      20/06/26 20:43:40 INFO NettyBlockTransferService: Server created on 192.168.1.78:59415
      20/06/26 20:43:40 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
      20/06/26 20:43:40 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.78, 59415, None)
      20/06/26 20:43:40 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.78:59415 with 408.9 MiB RAM, BlockManagerId(driver, 192.168.1.78, 59415, None)
      20/06/26 20:43:40 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.78, 59415, None)
      20/06/26 20:43:40 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.78, 59415, None)
      20/06/26 20:43:40 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/maximgekk/spark-warehouse/').
      20/06/26 20:43:40 INFO SharedState: Warehouse path is 'file:/Users/maximgekk/spark-warehouse/'.
      20/06/26 20:43:40 INFO HiveUtils: Initializing HiveMetastoreConnection version 2.3.7 using Spark classes.
      20/06/26 20:43:40 INFO HiveClientImpl: Warehouse location for Hive client (version 2.3.7) is file:/Users/maximgekk/spark-warehouse/
      20/06/26 20:43:41 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
      20/06/26 20:43:41 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
      20/06/26 20:43:41 INFO HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
      20/06/26 20:43:41 INFO ObjectStore: ObjectStore, initialize called
      20/06/26 20:43:41 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
      20/06/26 20:43:41 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
      20/06/26 20:43:42 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
      20/06/26 20:43:42 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
      20/06/26 20:43:42 INFO ObjectStore: Initialized ObjectStore
      20/06/26 20:43:42 ERROR ObjectStore: Version information found in metastore differs 1.2.0 from expected schema version 2.3.0. Schema verififcation is disabled hive.metastore.schema.verification
      20/06/26 20:43:42 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore maximgekk@192.168.1.78
      20/06/26 20:43:42 INFO HiveMetaStore: Added admin role in metastore
      20/06/26 20:43:43 INFO HiveMetaStore: Added public role in metastore
      20/06/26 20:43:43 INFO HiveMetaStore: No user is added in admin role, since config is empty
      20/06/26 20:43:43 INFO HiveMetaStore: 0: get_all_functions
      20/06/26 20:43:43 INFO audit: ugi=maximgekk	ip=unknown-ip-addr	cmd=get_all_functions
      20/06/26 20:43:43 INFO HiveMetaStore: 0: get_database: default
      20/06/26 20:43:43 INFO audit: ugi=maximgekk	ip=unknown-ip-addr	cmd=get_database: default
      

      but in silent mode it should emit only logs at WARN level and above, as set here:
      https://github.com/apache/spark/blob/178ca961fe2b27a821b55d2231532804f1d8b68f/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala#L323
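      Since the issue was resolved as Not A Problem, the early INFO lines appear to be expected: they are emitted before the CLI's silent-mode switch takes effect. A possible workaround (a sketch, assuming the stock Spark 3.0 distribution layout, where the shipped template's root logger line is `log4j.rootCategory=INFO, console`) is to raise the root log level in conf/log4j.properties:

```shell
# Sketch of a workaround, assuming the stock Spark 3.0 layout:
# create conf/log4j.properties from the shipped template and raise the
# root logger from INFO to WARN so startup INFO lines are suppressed.
cd spark-3.0
cp conf/log4j.properties.template conf/log4j.properties
sed -i.bak 's/^log4j.rootCategory=INFO, console/log4j.rootCategory=WARN, console/' conf/log4j.properties
./bin/spark-sql -S
```

      This silences the driver's own INFO output regardless of when the -S flag is parsed, since log4j reads the file at initialization.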


People

    Assignee: Unassigned
    Reporter: Max Gekk (maxgekk)
    Votes: 0
    Watchers: 3
