Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Affects Version/s: v4.0.1
Fix Version/s: None
Component/s: None
Environment: CDH 6.3.2; Hadoop 3.0.0; Hive 2.1.1; Spark 2.4.0
Description
I tried this twice in total.

The first time, I set SPARK_HOME in /etc/profile, and when I started Kylin it prompted:

"Skip spark which not owned by kylin. SPARK_HOME is /opt/cloudera/parcels/CDH/lib/spark and KYLIN_HOME is /opt/kylin. Please download the correct version of Apache Spark, unzip it, rename it to 'spark' and put it in /opt/kylin directory. Do not use the spark that comes with your hadoop environment."
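For clarity, that first attempt amounted to roughly the following shell setup (a sketch reconstructed from the message above; the exact /etc/profile wording is mine, the paths are the ones reported):

# /etc/profile (first attempt): SPARK_HOME points at the CDH-bundled Spark
export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
export KYLIN_HOME=/opt/kylin

# Starting Kylin then printed the "Skip spark which not owned by kylin" message
$KYLIN_HOME/bin/kylin.sh start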
So I removed SPARK_HOME from the environment variables, used "$KYLIN_HOME/bin/download-spark.sh" to download "spark-2.4.7-bin-hadoop2.7.tgz" (which is unpacked automatically under $KYLIN_HOME/spark), and started Kylin again. This time it prompted "Failed to take action" after I used Load Table From Tree, and the log contains the following error:

"ERROR [http-bio-7070-exec-6] controller.TableController:199: HIVE_STATS_JDBC_TIMEOUT
java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT"

So what should I do now? Does the Spark under $KYLIN_HOME need a configuration change, and if so, what is the right way to do it?
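In shell terms, the second attempt was roughly this (commands reconstructed from the steps above; the unset mirrors removing the export from /etc/profile):

# Second attempt: drop the external SPARK_HOME and use Kylin's own downloader
unset SPARK_HOME
$KYLIN_HOME/bin/download-spark.sh   # fetched spark-2.4.7-bin-hadoop2.7.tgz into $KYLIN_HOME/spark
$KYLIN_HOME/bin/kylin.sh start      # this is the run that produced the error above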
The error part of the log is attached.
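One more check that may help diagnose this (a sketch; my assumption is that the NoSuchFieldError comes from a Hive version mismatch, since HIVE_STATS_JDBC_TIMEOUT exists in the Hive 1.2.1 jars that Spark 2.x bundles but was removed in Hive 2.x, which is what CDH 6 ships):

# List the Hive jars bundled with the downloaded Spark, to see which Hive
# version ends up on Kylin's classpath
ls $KYLIN_HOME/spark/jars | grep -i hive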