Description
Users of Spark SQL Datasets should be able to use objects that Spark does not understand (i.e., today, anything that is not a primitive or a case class) in Datasets. To allow this, let's add a default encoder that uses Kryo to serialize the object as bytes and presents it as a single field value.
For now, let's require the user to pass the encoder explicitly; as a bonus we can explore using it as an automatic fallback.
This encoder should be usable in:
- as
- map, flatMap, etc.
- Custom Aggregators
- Scala/Java
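A minimal sketch of how the explicit-encoder path above might look from Scala. The helper name `Encoders.kryo` and the session setup reflect Spark's public API rather than anything specified in this issue, and the `Point` class is purely illustrative:

```scala
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

// A class Spark has no built-in encoder for: not a primitive, not a case class.
class Point(val x: Int, val y: Int) extends Serializable

object KryoEncoderSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("kryo-encoder-sketch")
      .getOrCreate()

    // Explicitly supply a Kryo-backed encoder; the object is stored as a
    // single binary field rather than mapped to a columnar schema.
    implicit val pointEncoder: Encoder[Point] = Encoders.kryo[Point]

    val ds = spark.createDataset(Seq(new Point(1, 2), new Point(3, 4)))

    // map picks up the same implicit encoder for its result type.
    val shifted = ds.map(p => new Point(p.x + 1, p.y))
    shifted.collect().foreach(p => println(s"(${p.x}, ${p.y})"))

    spark.stop()
  }
}
```

The same encoder could be passed explicitly at each call site (e.g. `ds.map(...)(pointEncoder)`) instead of via an implicit, which is closer to what the Java API would require.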