Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
Ambari-2.1.0 for Dal is putting many more properties into /etc/spark/conf/hive-site.xml
than desired. This leads to unnecessary exceptions when loading HiveContext in the
Spark shell. Here is the error:
15/04/21 08:37:44 INFO ParseDriver: Parsing command: show tables
15/04/21 08:37:44 INFO ParseDriver: Parse Completed
java.lang.RuntimeException: java.lang.NumberFormatException: For input string: "5s"
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:237)
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:233)
at scala.Option.orElse(Option.scala:257)
at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:233)
at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:231)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:56)
at org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:255)
at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:255)
at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:255)
at org.apache.spark.sql.hive.HiveContext$$anon$4.<init>(HiveContext.scala:265)
....
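The `NumberFormatException: For input string: "5s"` is the usual symptom of this mismatch: newer Hive releases express interval settings with time-unit suffixes (e.g. `hive.metastore.client.connect.retry.delay=5s`), while the older Hive client bundled with Spark parses such values as plain integers. As a hedged illustration of one workaround (this script is illustrative, not part of Ambari; whether a bare number is interpreted in the intended unit depends on each property's default unit in the old Hive client), a copied hive-site.xml can be rewritten to strip the suffixes:

```python
import re
import xml.etree.ElementTree as ET

# Matches values like "5s", "300ms", "1800s" that older Hive
# clients cannot parse as plain integers.
TIME_SUFFIX = re.compile(r"^(\d+)(ms|s|m|h|d)$")

def strip_time_suffixes(in_path, out_path):
    """Copy a hive-site.xml, rewriting time-suffixed values to bare numbers."""
    tree = ET.parse(in_path)
    for prop in tree.getroot().findall("property"):
        value = prop.find("value")
        if value is not None and value.text:
            m = TIME_SUFFIX.match(value.text.strip())
            if m:
                value.text = m.group(1)  # e.g. "5s" -> "5"
    tree.write(out_path)
```

Pointing Spark at the rewritten copy (rather than editing the Ambari-managed file in place) avoids the change being overwritten on the next configuration push.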
In the previous Ambari release only a handful of properties (fewer than 10) were
added; now there are 150+ (attached). We should revert to the old behavior.
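For reference, the old minimal-style file can be sketched as below. This is an assumption-laden illustration, not the actual property list from the attachment: `hive.metastore.uris` is a standard Hive setting that Spark's HiveContext needs to reach the metastore, and the host and port are placeholders.

```xml
<!-- Illustrative minimal /etc/spark/conf/hive-site.xml;
     metastore-host.example.com:9083 is a placeholder, not a value
     from this cluster. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host.example.com:9083</value>
  </property>
</configuration>
```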