Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Won't Fix
- Affects Version/s: 2.1.0, 2.2.0
- Fix Version/s: None
- Component/s: None
Description
When using YARN cluster mode and the job needs to scan HBase, there is a case that does not work: if the user jar is placed on HDFS, the local classpath at submit time does not contain HBase, so obtaining the HBase delegation token fails. Later, when the job is submitted to YARN, it fails because it has no token to access the HBase table. I mocked three cases:
Case 1: the user jar is on the local classpath and contains HBase.
17/08/10 13:48:03 INFO security.HadoopFSDelegationTokenProvider: Renewal interval is 86400050 for token HDFS_DELEGATION_TOKEN
17/08/10 13:48:03 INFO security.HadoopDelegationTokenManager: Service hive
17/08/10 13:48:03 INFO security.HadoopDelegationTokenManager: Service hbase
17/08/10 13:48:05 INFO security.HBaseDelegationTokenProvider: Attempting to fetch HBase security token.
The logs show the token is obtained normally.
Case 2: the user jar is on HDFS.
17/08/10 13:43:58 WARN security.HBaseDelegationTokenProvider: Class org.apache.hadoop.hbase.HBaseConfiguration not found.
17/08/10 13:43:58 INFO security.HBaseDelegationTokenProvider: Failed to get token from service hbase
java.lang.ClassNotFoundException: org.apache.hadoop.hbase.security.token.TokenUtil
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.obtainDelegationTokens(HBaseDelegationTokenProvider.scala:41)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:112)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:109)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
The logs show that fetching the token fails with a ClassNotFoundException.
Case 3: if we download the user jar from the remote location first, everything works correctly. A small sketch of the underlying classpath issue follows.
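The following is a minimal, self-contained sketch (not the actual Spark source; the object name TokenFetchSketch is made up for illustration) of why the submit-time token fetch fails: HBaseDelegationTokenProvider loads the HBase classes reflectively, so they must be visible on the local classpath of the submitting JVM, not only inside a user jar that still sits on HDFS.

object TokenFetchSketch {
  def main(args: Array[String]): Unit = {
    try {
      // The same classes the provider needs; if the user jar is remote, they are absent locally.
      Class.forName("org.apache.hadoop.hbase.HBaseConfiguration")
      Class.forName("org.apache.hadoop.hbase.security.token.TokenUtil")
      println("HBase classes visible: token fetch can proceed (case 1)")
    } catch {
      case e: ClassNotFoundException =>
        // This is the failure seen in case 2 above.
        println(s"HBase classes missing, token fetch fails: ${e.getMessage}")
    }
  }
}

This matches the behaviour above: case 1 finds the classes and fetches the token, case 2 hits the ClassNotFoundException, and case 3 works because downloading the jar first puts the HBase classes back on the local classpath before the token providers run.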
Attachments
Issue Links
- is related to SPARK-21714: SparkSubmit in Yarn Client mode downloads remote files and then reuploads them again (Resolved)