When a Spark table has a column of type java.sql.Date, Livy does not handle that column's values correctly. For example:
create table test (name string, birthday date);
insert into test values ('Livy', '2019-07-24');

curl -H "Content-Type:application/json" -X POST -d '{"code":"select * from test", "kind":"sql"}' 192.168.1.6:8998/sessions/48/statements
{"id":1,"code":"select * from test","state":"waiting","output":null,"progress":0.0}

curl 192.168.1.6:8998/sessions/48/statements/1
{"id":1,"code":"select * from test","state":"available","output":{"status":"ok","execution_count":1,"data":{"application/json":{"schema":{"type":"struct","fields":[{"name":"name","type":"string","nullable":true,"metadata":{}},{"name":"birthday","type":"date","nullable":true,"metadata":{}}]},"data":[["Livy",{}]]}}},"progress":1.0}
As the response shows, the row returned by `select * from test` is ["Livy",{}]: the birthday column's value is not serialized correctly.
The reason is that json4s cannot serialize java.sql.Date by default, so we should define a CustomSerializer for java.sql.Date.
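A minimal sketch of such a serializer (the object and value names below are illustrative, not taken from the Livy codebase) could register a json4s CustomSerializer that renders java.sql.Date as its "yyyy-MM-dd" string form instead of an empty object:

```scala
import java.sql.Date
import org.json4s.{CustomSerializer, DefaultFormats, Formats, JNull, JString}
import org.json4s.jackson.Serialization

// Sketch: teach json4s how to (de)serialize java.sql.Date.
object SqlDateSerializer extends CustomSerializer[Date](_ => (
  {
    case JString(s) => Date.valueOf(s) // "2019-07-24" -> java.sql.Date
    case JNull      => null
  },
  {
    case d: Date => JString(d.toString) // java.sql.Date -> "2019-07-24"
  }
))

object DateJsonDemo {
  // Add the custom serializer on top of the default formats.
  implicit val formats: Formats = DefaultFormats + SqlDateSerializer

  // A row like the one in the repro above, with a date-typed column.
  val json: String =
    Serialization.write(Map("name" -> "Livy", "birthday" -> Date.valueOf("2019-07-24")))
}
```

With the serializer registered, the birthday value would come back as the string "2019-07-24" rather than the empty `{}` seen in the statement output.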
Duplicated by: LIVY-683 "Livy SQLInterpreter get empty array when extract date format rows to json" (Resolved)