SPARK-30201: HiveOutputWriter standardOI should use ObjectInspectorCopyOption.DEFAULT


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 2.0.2, 2.1.3, 2.2.3, 2.3.4, 2.4.7, 3.0.0
    • Fix Version/s: 2.4.8, 3.0.0
    • Component/s: SQL

    Description

      Currently Spark uses `ObjectInspectorCopyOption.JAVA` as the object inspector copy option, which converts every string field to a UTF-8 `java.lang.String`. When writing data that is not valid UTF-8, the invalid bytes are replaced and `EFBFBD` (the UTF-8 encoding of the replacement character U+FFFD) appears in the output.
      We should use `ObjectInspectorCopyOption.DEFAULT` so the original bytes are passed through unchanged.
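
      For illustration, here is a minimal Scala sketch of the intended change in how HiveOutputWriter builds its standard object inspector. The helper name `standardOIFor` and its shape are assumptions for this sketch, not the actual Spark source:

        import org.apache.hadoop.hive.serde2.objectinspector.{ObjectInspector, ObjectInspectorUtils, StructObjectInspector}
        import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils.ObjectInspectorCopyOption

        // Hypothetical helper mirroring how HiveOutputWriter derives its standard OI from the
        // table deserializer's object inspector. With ObjectInspectorCopyOption.JAVA, string
        // fields are copied through java.lang.String, which re-encodes the bytes as UTF-8 and
        // replaces invalid sequences with U+FFFD (bytes EF BF BD). With DEFAULT, the source
        // object's original format (e.g. Hadoop Text) is kept, so the raw bytes pass through.
        def standardOIFor(deserializerOI: ObjectInspector): StructObjectInspector =
          ObjectInspectorUtils
            .getStandardObjectInspector(deserializerOI, ObjectInspectorCopyOption.DEFAULT) // was JAVA
            .asInstanceOf[StructObjectInspector]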

      Here is the way to reproduce (a spark-shell sketch of the same steps follows the list):
      1. Create a file containing the raw hex bytes 'AABBCC', which are not valid UTF-8.
      2. create table test1 (c string) location '$file_path';
      3. select hex(c) from test1; // AABBCC
      4. create table test2 (c string) as select c from test1;
      5. select hex(c) from test2; // EFBFBDEFBFBDEFBFBD
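
      A spark-shell version of these steps (table names and the temp-file path are arbitrary; this assumes a Hive-enabled session where a plain CREATE TABLE produces a Hive text-format table):

        import java.nio.file.{Files, Paths}

        // 1. Write a file containing the raw bytes AA BB CC, which are not valid UTF-8.
        val dir = Files.createTempDirectory("non_utf8_repro")
        Files.write(Paths.get(dir.toString, "data"), Array(0xAA, 0xBB, 0xCC).map(_.toByte))

        // 2-3. Table over the raw file; hex(c) shows the bytes unchanged.
        spark.sql(s"CREATE TABLE test1 (c STRING) LOCATION '$dir'")
        spark.sql("SELECT hex(c) FROM test1").show()   // AABBCC

        // 4-5. CTAS writes through HiveOutputWriter; before the fix each byte becomes EF BF BD.
        spark.sql("CREATE TABLE test2 (c STRING) AS SELECT c FROM test1")
        spark.sql("SELECT hex(c) FROM test2").show()   // EFBFBDEFBFBDEFBFBD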


          People

            Assignee: XiDuo You (ulysses)
            Reporter: XiDuo You (ulysses)
