2022-11-28 14:14:48,832 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] job.NSparkExecutable:41 : log4j: Category org.apache.kylin set to DEBUG
2022-11-28 14:14:48,832 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] job.NSparkExecutable:41 : log4j: Handling log4j.additivity.org.apache.kylin=[null]
2022-11-28 14:14:48,832 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] job.NSparkExecutable:41 : log4j: Finished configuring.
2022-11-28 14:15:09,809 INFO [FetcherRunner 215052362-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 1 should running, 1 actual running, 0 stopped, 0 ready, 145 already succeed, 1 error, 0 discarded, 0 others
2022-11-28 14:15:23,223 DEBUG [http-bio-7070-exec-5] common.KylinConfig:363 : KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
2022-11-28 14:15:23,224 INFO [http-bio-7070-exec-5] common.KylinConfig:369 : Use KYLIN_HOME=/data/softwares/kylin/kylin-4.0.1
2022-11-28 14:15:23,400 DEBUG [http-bio-7070-exec-4] badquery.BadQueryHistoryManager:65 : Loaded 0 Bad Query(s)
2022-11-28 14:15:25,692 DEBUG [http-bio-7070-exec-4] common.KylinConfig:363 : KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
2022-11-28 14:15:25,692 INFO [http-bio-7070-exec-4] common.KylinConfig:369 : Use KYLIN_HOME=/data/softwares/kylin/kylin-4.0.1
2022-11-28 14:15:25,918 DEBUG [http-bio-7070-exec-6] badquery.BadQueryHistoryManager:65 : Loaded 0 Bad Query(s)
2022-11-28 14:15:36,842 DEBUG [BadQueryDetector] service.BadQueryDetector:148 : Detect bad query.
2022-11-28 14:15:39,809 INFO [FetcherRunner 215052362-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 1 should running, 1 actual running, 0 stopped, 0 ready, 145 already succeed, 1 error, 0 discarded, 0 others
2022-11-28 14:16:09,809 INFO [FetcherRunner 215052362-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 1 should running, 1 actual running, 0 stopped, 0 ready, 145 already succeed, 1 error, 0 discarded, 0 others
2022-11-28 14:16:20,707 DEBUG [http-bio-7070-exec-1] common.KylinConfig:363 : KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
2022-11-28 14:16:20,708 INFO [http-bio-7070-exec-1] common.KylinConfig:369 : Use KYLIN_HOME=/data/softwares/kylin/kylin-4.0.1
2022-11-28 14:16:20,906 DEBUG [http-bio-7070-exec-8] badquery.BadQueryHistoryManager:65 : Loaded 0 Bad Query(s)
2022-11-28 14:16:36,842 DEBUG [BadQueryDetector] service.BadQueryDetector:148 : Detect bad query.
2022-11-28 14:16:39,809 INFO [FetcherRunner 215052362-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 1 should running, 1 actual running, 0 stopped, 0 ready, 145 already succeed, 1 error, 0 discarded, 0 others
2022-11-28 14:17:09,809 INFO [FetcherRunner 215052362-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 1 should running, 1 actual running, 0 stopped, 0 ready, 145 already succeed, 1 error, 0 discarded, 0 others
2022-11-28 14:17:10,029 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] job.NSparkExecutable:41 : log4j:WARN SparkDriverHdfsLogAppender flush log when shutdown ...
2022-11-28 14:17:10,186 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.AbstractExecutable:417 : The state of job is:RUNNING
2022-11-28 14:17:10,191 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.ExecutableManager:676 : job id:92382969-ff7a-4938-8812-2e18d4c91a60-01 from RUNNING to ERROR
2022-11-28 14:17:10,193 DEBUG [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.ExecutableManager:698 : need kill 92382969-ff7a-4938-8812-2e18d4c91a60-01, from RUNNING to ERROR
2022-11-28 14:17:10,194 DEBUG [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.ExecutableManager:721 : Update JobOutput To HDFS for 92382969-ff7a-4938-8812-2e18d4c91a60-01 to hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json [5727]
2022-11-28 14:17:10,211 ERROR [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.AbstractExecutable:210 : error running Executable: NSparkCubingJob{id=92382969-ff7a-4938-8812-2e18d4c91a60, name=BUILD CUBE - store_sku_sales_cube_demo - 20221101000000_20221102000000 - CST 2022-11-28 14:14:18, state=RUNNING}
2022-11-28 14:17:10,211 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.AbstractExecutable:417 : The state of job is:RUNNING
2022-11-28 14:17:10,214 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070]
2022-11-28 14:17:10,214 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=92382969-ff7a-4938-8812-2e18d4c91a60}
2022-11-28 14:17:10,215 INFO [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.ExecutableManager:676 : job id:92382969-ff7a-4938-8812-2e18d4c91a60 from RUNNING to ERROR
2022-11-28 14:17:10,216 DEBUG [http-bio-7070-exec-5] cachesync.Broadcaster:267 : Broadcasting UPDATE, execute_output, 92382969-ff7a-4938-8812-2e18d4c91a60
2022-11-28 14:17:10,217 DEBUG [http-bio-7070-exec-5] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, 92382969-ff7a-4938-8812-2e18d4c91a60
2022-11-28 14:17:10,218 DEBUG [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.ExecutableManager:698 : need kill 92382969-ff7a-4938-8812-2e18d4c91a60, from RUNNING to ERROR
2022-11-28 14:17:10,218 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070]
2022-11-28 14:17:10,218 DEBUG [Scheduler 951526904 Job 92382969-ff7a-4938-8812-2e18d4c91a60-388] execution.AbstractExecutable:365 : no need to send email, user list is empty
2022-11-28 14:17:10,218 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=92382969-ff7a-4938-8812-2e18d4c91a60}
2022-11-28 14:17:10,218 ERROR [pool-7-thread-8] threadpool.DefaultScheduler:115 : ExecuteException job:92382969-ff7a-4938-8812-2e18d4c91a60
org.apache.kylin.job.exception.ExecuteException: org.apache.kylin.job.exception.ExecuteException: java.io.IOException: OS command error exit with return code: 1, error message:
log4j: Using URL [file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties
log4j: Parsing for [root] with value=[INFO,hdfs].
log4j: Level token is [INFO].
log4j: Category root set to INFO
log4j: Parsing appender named "hdfs".
log4j: Parsing layout options for "hdfs".
log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t] %c{2} : %m%n].
log4j: End of parsing for "hdfs".
log4j: Setting property [hdfsWorkingDir] to [hdfs://data-platform-server-01:8020/kylin/kylin_metadata/].
log4j: Setting property [kerberosPrincipal] to [].
log4j: Setting property [logPath] to [hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log].
log4j: Setting property [kerberosEnable] to [false].
log4j: Setting property [kerberosKeytab] to [].
log4j: Setting property [logQueueCapacity] to [5000].
log4j: Setting property [flushInterval] to [5000].
log4j:WARN SparkDriverHdfsLogAppender starting ...
log4j:WARN hdfsWorkingDir -> hdfs://data-platform-server-01:8020/kylin/kylin_metadata/
log4j:WARN spark.driver.log4j.appender.hdfs.File -> hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log
log4j:WARN kerberosEnable -> false
log4j:WARN SparkDriverHdfsLogAppender started ...
log4j: Parsed "hdfs" options.
log4j: Parsing for [org.springframework] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.springframework set to WARN
log4j: Handling log4j.additivity.org.springframework=[null]
log4j: Parsing for [org.apache.spark] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.apache.spark set to WARN
log4j: Handling log4j.additivity.org.apache.spark=[null]
log4j: Parsing for [org.apache.kylin] with value=[DEBUG].
log4j: Level token is [DEBUG].
log4j: Category org.apache.kylin set to DEBUG
log4j: Handling log4j.additivity.org.apache.kylin=[null]
log4j: Finished configuring.
log4j:WARN SparkDriverHdfsLogAppender flush log when shutdown ...
The command is: export HADOOP_CONF_DIR=/data/softwares/kylin/kylin-4.0.1/hadoop_conf && /data/softwares/kylin/kylin-4.0.1/spark/bin/spark-submit --class org.apache.kylin.engine.spark.application.SparkEntry --conf 'spark.sql.hive.metastore.version=2.1.1' --conf 'spark.yarn.queue=default' --conf 'spark.history.fs.logDirectory=hdfs:///kylin/spark-history' --conf 'spark.driver.extraJavaOptions=-XX:+CrashOnOutOfMemoryError -Dlog4j.configuration=file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties -Dkylin.kerberos.enabled=false -Dkylin.hdfs.working.dir=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/ -Dspark.driver.log4j.appender.hdfs.File=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log -Dlog4j.debug=true -Dspark.driver.rest.server.address=data-platform-server-01:7070 -Dspark.driver.param.taskId=92382969-ff7a-4938-8812-2e18d4c91a60-01 -Dspark.driver.local.logDir=/data/softwares/kylin/kylin-4.0.1/logs/spark' --conf 'spark.master=yarn' --conf 'spark.executor.extraJavaOptions=-Dfile.encoding=UTF-8 -Dhdp.version=current -Dlog4j.configuration=spark-executor-log4j.properties -Dlog4j.debug -Dkylin.hdfs.working.dir=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/ -Dkylin.metadata.identifier=kylin_metadata -Dkylin.spark.category=job -Dkylin.spark.project=demo -Dkylin.spark.identifier=92382969-ff7a-4938-8812-2e18d4c91a60 -Dkylin.spark.jobName=92382969-ff7a-4938-8812-2e18d4c91a60-01 -Duser.timezone=Asia/Shanghai' --conf 'spark.hadoop.yarn.timeline-service.enabled=false' --conf 'spark.eventLog.enabled=true' --conf 'spark.eventLog.dir=hdfs:///kylin/spark-history' --conf 'spark.sql.hive.metastore.jars=/opt/cloudera/parcels/CDH/lib/hive/lib/*' --conf 'spark.submit.deployMode=client' --conf 'spark.driver.memory=3072m' --conf 'spark.executor.extraClassPath=kylin-parquet-job-4.0.1.jar' --conf 'spark.driver.extraClassPath=/data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar' --files /data/softwares/kylin/apache-kylin-4.0.1-bin-spark3/conf/spark-executor-log4j.properties --name job_step_92382969-ff7a-4938-8812-2e18d4c91a60-01 --jars /data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar /data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar -className org.apache.kylin.engine.spark.job.CubeBuildJob hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/job_tmp/92382969-ff7a-4938-8812-2e18d4c91a60-01_jobId
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:225)
at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:113)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.kylin.job.exception.ExecuteException: java.io.IOException: OS command error exit with return code: 1, error message:
log4j: Using URL [file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties
log4j: Parsing for [root] with value=[INFO,hdfs].
log4j: Level token is [INFO].
log4j: Category root set to INFO
log4j: Parsing appender named "hdfs".
log4j: Parsing layout options for "hdfs".
log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t] %c{2} : %m%n].
log4j: End of parsing for "hdfs".
log4j: Setting property [hdfsWorkingDir] to [hdfs://data-platform-server-01:8020/kylin/kylin_metadata/].
log4j: Setting property [kerberosPrincipal] to [].
log4j: Setting property [logPath] to [hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log].
log4j: Setting property [kerberosEnable] to [false].
log4j: Setting property [kerberosKeytab] to [].
log4j: Setting property [logQueueCapacity] to [5000].
log4j: Setting property [flushInterval] to [5000].
log4j:WARN SparkDriverHdfsLogAppender starting ...
log4j:WARN hdfsWorkingDir -> hdfs://data-platform-server-01:8020/kylin/kylin_metadata/
log4j:WARN spark.driver.log4j.appender.hdfs.File -> hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log
log4j:WARN kerberosEnable -> false
log4j:WARN SparkDriverHdfsLogAppender started ...
log4j: Parsed "hdfs" options.
log4j: Parsing for [org.springframework] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.springframework set to WARN
log4j: Handling log4j.additivity.org.springframework=[null]
log4j: Parsing for [org.apache.spark] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.apache.spark set to WARN
log4j: Handling log4j.additivity.org.apache.spark=[null]
log4j: Parsing for [org.apache.kylin] with value=[DEBUG].
log4j: Level token is [DEBUG].
log4j: Category org.apache.kylin set to DEBUG
log4j: Handling log4j.additivity.org.apache.kylin=[null]
log4j: Finished configuring.
log4j:WARN SparkDriverHdfsLogAppender flush log when shutdown ...
The command is: export HADOOP_CONF_DIR=/data/softwares/kylin/kylin-4.0.1/hadoop_conf && /data/softwares/kylin/kylin-4.0.1/spark/bin/spark-submit --class org.apache.kylin.engine.spark.application.SparkEntry --conf 'spark.sql.hive.metastore.version=2.1.1' --conf 'spark.yarn.queue=default' --conf 'spark.history.fs.logDirectory=hdfs:///kylin/spark-history' --conf 'spark.driver.extraJavaOptions=-XX:+CrashOnOutOfMemoryError -Dlog4j.configuration=file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties -Dkylin.kerberos.enabled=false -Dkylin.hdfs.working.dir=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/ -Dspark.driver.log4j.appender.hdfs.File=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log -Dlog4j.debug=true -Dspark.driver.rest.server.address=data-platform-server-01:7070 -Dspark.driver.param.taskId=92382969-ff7a-4938-8812-2e18d4c91a60-01 -Dspark.driver.local.logDir=/data/softwares/kylin/kylin-4.0.1/logs/spark' --conf 'spark.master=yarn' --conf 'spark.executor.extraJavaOptions=-Dfile.encoding=UTF-8 -Dhdp.version=current -Dlog4j.configuration=spark-executor-log4j.properties -Dlog4j.debug -Dkylin.hdfs.working.dir=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/ -Dkylin.metadata.identifier=kylin_metadata -Dkylin.spark.category=job -Dkylin.spark.project=demo -Dkylin.spark.identifier=92382969-ff7a-4938-8812-2e18d4c91a60 -Dkylin.spark.jobName=92382969-ff7a-4938-8812-2e18d4c91a60-01 -Duser.timezone=Asia/Shanghai' --conf 'spark.hadoop.yarn.timeline-service.enabled=false' --conf 'spark.eventLog.enabled=true' --conf 'spark.eventLog.dir=hdfs:///kylin/spark-history' --conf 'spark.sql.hive.metastore.jars=/opt/cloudera/parcels/CDH/lib/hive/lib/*' --conf 'spark.submit.deployMode=client' --conf 'spark.driver.memory=3072m' --conf 'spark.executor.extraClassPath=kylin-parquet-job-4.0.1.jar' --conf 'spark.driver.extraClassPath=/data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar' --files /data/softwares/kylin/apache-kylin-4.0.1-bin-spark3/conf/spark-executor-log4j.properties --name job_step_92382969-ff7a-4938-8812-2e18d4c91a60-01 --jars /data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar /data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar -className org.apache.kylin.engine.spark.job.CubeBuildJob hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/job_tmp/92382969-ff7a-4938-8812-2e18d4c91a60-01_jobId
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:225)
at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:94)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
... 4 more
Caused by: java.io.IOException: OS command error exit with return code: 1, error message:
log4j: Using URL [file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties
log4j: Parsing for [root] with value=[INFO,hdfs].
log4j: Level token is [INFO].
log4j: Category root set to INFO
log4j: Parsing appender named "hdfs".
log4j: Parsing layout options for "hdfs".
log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t] %c{2} : %m%n].
log4j: End of parsing for "hdfs".
log4j: Setting property [hdfsWorkingDir] to [hdfs://data-platform-server-01:8020/kylin/kylin_metadata/].
log4j: Setting property [kerberosPrincipal] to [].
log4j: Setting property [logPath] to [hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log].
log4j: Setting property [kerberosEnable] to [false].
log4j: Setting property [kerberosKeytab] to [].
log4j: Setting property [logQueueCapacity] to [5000].
log4j: Setting property [flushInterval] to [5000].
log4j:WARN SparkDriverHdfsLogAppender starting ...
log4j:WARN hdfsWorkingDir -> hdfs://data-platform-server-01:8020/kylin/kylin_metadata/
log4j:WARN spark.driver.log4j.appender.hdfs.File -> hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log
log4j:WARN kerberosEnable -> false
log4j:WARN SparkDriverHdfsLogAppender started ...
log4j: Parsed "hdfs" options.
log4j: Parsing for [org.springframework] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.springframework set to WARN
log4j: Handling log4j.additivity.org.springframework=[null]
log4j: Parsing for [org.apache.spark] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.apache.spark set to WARN
log4j: Handling log4j.additivity.org.apache.spark=[null]
log4j: Parsing for [org.apache.kylin] with value=[DEBUG].
log4j: Level token is [DEBUG].
log4j: Category org.apache.kylin set to DEBUG
log4j: Handling log4j.additivity.org.apache.kylin=[null]
log4j: Finished configuring.
log4j:WARN SparkDriverHdfsLogAppender flush log when shutdown ...
The command is: export HADOOP_CONF_DIR=/data/softwares/kylin/kylin-4.0.1/hadoop_conf && /data/softwares/kylin/kylin-4.0.1/spark/bin/spark-submit --class org.apache.kylin.engine.spark.application.SparkEntry --conf 'spark.sql.hive.metastore.version=2.1.1' --conf 'spark.yarn.queue=default' --conf 'spark.history.fs.logDirectory=hdfs:///kylin/spark-history' --conf 'spark.driver.extraJavaOptions=-XX:+CrashOnOutOfMemoryError -Dlog4j.configuration=file:/data/softwares/kylin/kylin-4.0.1/conf/spark-driver-log4j.properties -Dkylin.kerberos.enabled=false -Dkylin.hdfs.working.dir=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/ -Dspark.driver.log4j.appender.hdfs.File=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/spark_logs/driver/92382969-ff7a-4938-8812-2e18d4c91a60-01/execute_output.json.1669616088071.log -Dlog4j.debug=true -Dspark.driver.rest.server.address=data-platform-server-01:7070 -Dspark.driver.param.taskId=92382969-ff7a-4938-8812-2e18d4c91a60-01 -Dspark.driver.local.logDir=/data/softwares/kylin/kylin-4.0.1/logs/spark' --conf 'spark.master=yarn' --conf 'spark.executor.extraJavaOptions=-Dfile.encoding=UTF-8 -Dhdp.version=current -Dlog4j.configuration=spark-executor-log4j.properties -Dlog4j.debug -Dkylin.hdfs.working.dir=hdfs://data-platform-server-01:8020/kylin/kylin_metadata/ -Dkylin.metadata.identifier=kylin_metadata -Dkylin.spark.category=job -Dkylin.spark.project=demo -Dkylin.spark.identifier=92382969-ff7a-4938-8812-2e18d4c91a60 -Dkylin.spark.jobName=92382969-ff7a-4938-8812-2e18d4c91a60-01 -Duser.timezone=Asia/Shanghai' --conf 'spark.hadoop.yarn.timeline-service.enabled=false' --conf 'spark.eventLog.enabled=true' --conf 'spark.eventLog.dir=hdfs:///kylin/spark-history' --conf 'spark.sql.hive.metastore.jars=/opt/cloudera/parcels/CDH/lib/hive/lib/*' --conf 'spark.submit.deployMode=client' --conf 'spark.driver.memory=3072m' --conf 'spark.executor.extraClassPath=kylin-parquet-job-4.0.1.jar' --conf 'spark.driver.extraClassPath=/data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar' --files /data/softwares/kylin/apache-kylin-4.0.1-bin-spark3/conf/spark-executor-log4j.properties --name job_step_92382969-ff7a-4938-8812-2e18d4c91a60-01 --jars /data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar /data/softwares/kylin/kylin-4.0.1/lib/kylin-parquet-job-4.0.1.jar -className org.apache.kylin.engine.spark.job.CubeBuildJob hdfs://data-platform-server-01:8020/kylin/kylin_metadata/demo/job_tmp/92382969-ff7a-4938-8812-2e18d4c91a60-01_jobId
at org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:98)
at org.apache.kylin.engine.spark.job.NSparkExecutable.runSparkSubmit(NSparkExecutable.java:282)
at org.apache.kylin.engine.spark.job.NSparkExecutable.doWork(NSparkExecutable.java:168)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
... 6 more
2022-11-28 14:17:10,219 INFO [FetcherRunner 215052362-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 145 already succeed, 2 error, 0 discarded, 0 others
2022-11-28 14:17:10,220 DEBUG [http-bio-7070-exec-5] cachesync.Broadcaster:267 : Broadcasting UPDATE, execute_output, 92382969-ff7a-4938-8812-2e18d4c91a60
2022-11-28 14:17:10,220 DEBUG [http-bio-7070-exec-5] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, 92382969-ff7a-4938-8812-2e18d4c91a60
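Editor's note (not part of the log): the innermost "Caused by: java.io.IOException: OS command error exit with return code: 1" is the real failure here; everything above it is the log4j bootstrap chatter of the spawned spark-submit process, and the Spark driver's own error was redirected to the HDFS log path shown in the output (`.../execute_output.json.1669616088071.log`). When triaging nested traces like this, a small helper that walks the "Caused by:" chain and reports the innermost exception class can save scrolling. This is a minimal sketch; `root_cause` is a hypothetical helper name, not a Kylin API.

```python
import re

def root_cause(log_text: str) -> str:
    """Return the exception class named by the innermost 'Caused by:' clause.

    Falls back to the first exception mentioned when there is no
    'Caused by:' chain, and to an empty string when none is found.
    """
    # Each nested cause in a Java stack trace starts with 'Caused by: <class>:'
    causes = re.findall(r"Caused by: ([\w.]+(?:Exception|Error))", log_text)
    if causes:
        return causes[-1]  # innermost cause is listed last
    head = re.search(r"([\w.]+(?:Exception|Error)):", log_text)
    return head.group(1) if head else ""
```

Applied to the trace above it would report `java.io.IOException`, pointing you at the failed OS command (the spark-submit invocation) rather than the wrapping ExecuteExceptions.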