java.io.IOException: OS command error exit with return code: 1, error message:
log4j: Using URL [file:/opt/kylin/conf/spark-driver-log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/opt/kylin/conf/spark-driver-log4j.properties
log4j: Parsing for [root] with value=[INFO,logFile].
log4j: Level token is [INFO].
log4j: Category root set to INFO
log4j: Parsing appender named "logFile".
log4j: Parsing layout options for "logFile".
log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t] %c{2} : %m%n].
log4j: End of parsing for "logFile".
log4j: Setting property [file] to [/opt/kylin/logs/spark/05799056-b559-4f8c-8d7c-5db030229392-00.log].
log4j: Setting property [threshold] to [DEBUG].
log4j: setFile called: /opt/kylin/logs/spark/05799056-b559-4f8c-8d7c-5db030229392-00.log, true
log4j: setFile ended
log4j: Parsed "logFile" options.
log4j: Parsing for [org.springframework] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.springframework set to WARN
log4j: Handling log4j.additivity.org.springframework=[null]
log4j: Parsing for [org.apache.spark] with value=[WARN].
log4j: Level token is [WARN].
log4j: Category org.apache.spark set to WARN
log4j: Handling log4j.additivity.org.apache.spark=[null]
log4j: Parsing for [org.apache.kylin] with value=[DEBUG].
log4j: Level token is [DEBUG].
log4j: Category org.apache.kylin set to DEBUG
log4j: Handling log4j.additivity.org.apache.kylin=[null]
log4j: Finished configuring.
The command is: export HADOOP_CONF_DIR=/opt/kylin/hadoop_conf && /opt/kylin/spark/bin/spark-submit --class org.apache.kylin.engine.spark.application.SparkEntry --conf 'spark.yarn.queue=default' --conf 'spark.history.fs.logDirectory=hdfs:///kylin/spark-history' --conf 'spark.driver.extraJavaOptions=-XX:+CrashOnOutOfMemoryError -Dlog4j.configuration=file:/opt/kylin/conf/spark-driver-log4j.properties -Dkylin.kerberos.enabled=false -Dkylin.hdfs.working.dir=hdfs://master:8020/kylin/kylin_metadata/ -Dspark.driver.log4j.appender.hdfs.File=hdfs://master:8020/kylin/kylin_metadata/PM01/spark_logs/driver/05799056-b559-4f8c-8d7c-5db030229392-00/execute_output.json.1641449376521.log -Dlog4j.debug=true -Dspark.driver.rest.server.address=master:7070 -Dspark.driver.param.taskId=05799056-b559-4f8c-8d7c-5db030229392-00 -Dspark.driver.local.logDir=/opt/kylin/logs/spark' --conf 'spark.master=local' --conf 'spark.hadoop.yarn.timeline-service.enabled=false' --conf 'spark.eventLog.enabled=true' --conf 'spark.eventLog.dir=hdfs:///kylin/spark-history' --conf 'spark.driver.memory=1G' --conf 'spark.sql.adaptive.enabled=false' --conf 'spark.sql.autoBroadcastJoinThreshold=-1' --conf 'spark.driver.extraClassPath=/opt/kylin/lib/kylin-parquet-job-4.0.1.jar' --name job_step_05799056-b559-4f8c-8d7c-5db030229392-00 --jars /opt/kylin/lib/kylin-parquet-job-4.0.1.jar /opt/kylin/lib/kylin-parquet-job-4.0.1.jar -className org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob hdfs://master:8020/kylin/kylin_metadata/PM01/job_tmp/05799056-b559-4f8c-8d7c-5db030229392-00_jobId
	at org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:98)
	at org.apache.kylin.engine.spark.job.NSparkExecutable.runSparkSubmit(NSparkExecutable.java:282)
	at org.apache.kylin.engine.spark.job.NSparkExecutable.doWork(NSparkExecutable.java:168)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
	at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:94)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
	at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:113)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)