Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version/s: 1.13.0
- Labels: None
Description
A query returns wrong results when the convert_from function is used and additional columns appear in later batches. Here is a simple test case for this issue:
```java
@Test
public void testConvertFromJson() throws Exception {
  String fileName = "table.tsv";
  try (BufferedWriter writer = new BufferedWriter(
      new FileWriter(new File(dirTestWatcher.getRootDir(), fileName)))) {
    // Fill the first batch with records that have only the "id" field.
    for (int i = 0; i < JSONRecordReader.DEFAULT_ROWS_PER_BATCH; i++) {
      writer.write("{\"id\":\"1\"}\n");
    }
    // The last record introduces a new "v" column, which lands in a new batch.
    writer.write("{\"id\":\"2\",\"v\":[\"abc\"]}");
  }
  String sql = "SELECT t.m.id AS id, t.m.v[0] v FROM \n" +
      "(SELECT convert_from(columns[0], 'json') AS m FROM dfs.`%s`) t\n" +
      "where t.m.id='2'";
  testBuilder()
      .sqlQuery(sql, fileName)
      .unOrdered()
      .baselineColumns("id", "v")
      .baselineValues("2", "abc")
      .go();
}
```
Currently, the "v" column is null because OK_NEW_SCHEMA was not returned from the Project operator when the complex writer was used, so downstream operators never learned about the new column.
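To make the failure mode concrete, the sketch below mimics what the test data looks like from the reader's point of view: the first batch contains only the "id" field, and the final batch introduces a new "v" field, so the inferred schemas of the two batches differ and that change must be propagated (in Drill, via OK_NEW_SCHEMA). This is an illustrative Python model, not Drill code; the batch size of 3 is a stand-in for JSONRecordReader.DEFAULT_ROWS_PER_BATCH.

```python
# Illustrative model of the test data, not Drill code.
ROWS_PER_BATCH = 3  # stand-in for JSONRecordReader.DEFAULT_ROWS_PER_BATCH

# First batch: every record has only "id", so the inferred schema is {id}.
batch1 = [{"id": "1"} for _ in range(ROWS_PER_BATCH)]

# Final batch: the last record adds a new "v" column, so the schema
# becomes {id, v}; an operator that fails to announce this change
# leaves "v" unresolved (null) downstream.
batch2 = [{"id": "2", "v": ["abc"]}]

schema1 = set().union(*(r.keys() for r in batch1))
schema2 = set().union(*(r.keys() for r in batch2))
print(sorted(schema1))          # ['id']
print(sorted(schema2))          # ['id', 'v']
print(schema1 != schema2)       # True: a schema change to propagate
```

The bug is exactly that this schema difference was swallowed by the Project operator when the complex writer was used, so the query planner's consumers kept the old single-column schema.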