Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Invalid
- Affects Version/s: 1.6.0
- Fix Version/s: None
Description
My job uses Spark to insert RDD data into HBase like this:
------------------------------
localData.persist()
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
------------------------------
This throws an exception:
com.esotericsoftware.kryo.KryoException: java.lang.IndexOutOfBoundsException: Index: 99, Size: 6
Serialization trace:
familyMap (org.apache.hadoop.hbase.client.Put)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
When I remove the persist() call and write directly:
------------------------------
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
------------------------------
it works fine. What does the persist() method do that causes this?
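For reference, here is a fuller sketch of the write path described above. The report elides how localData and jobConf are built, so the table name, column family, sample rows, and job setup below are my assumptions, not taken from the original job; only the persist() and saveAsNewAPIHadoopDataset calls come from the report.

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

object HBaseWriteSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("hbase-write-sketch"))

    // HBase output configuration; "my_table" is a placeholder table name.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableOutputFormat.OUTPUT_TABLE, "my_table")
    val jobConf = Job.getInstance(hbaseConf)
    jobConf.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])

    // Build (key, Put) pairs. Creating a fresh Put per record keeps each
    // record's familyMap independent; sharing or mutating Put instances
    // across records can confuse Kryo's FieldSerializer.
    val localData = sc.parallelize(Seq("a" -> "1", "b" -> "2")).map { case (k, v) =>
      val put = new Put(Bytes.toBytes(k))
      put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(v))
      (new ImmutableBytesWritable(Bytes.toBytes(k)), put)
    }

    localData.persist() // no-arg persist() caches at the default MEMORY_ONLY level
    localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)

    sc.stop()
  }
}
```

Note that the Kryo error surfaces in Put.familyMap, a mutable field, so whether each Put is freshly constructed per record (as above) or reused and mutated is relevant to reproducing the problem.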