Details
- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version: 0.8.0
- Fix Version: None
- Component: None
Description
Various foreign-language data (e.g. Japanese, Thai) is loaded into string columns via tab-delimited text files. A simple projection of the columns in the table does not display the correct data. Exporting the data from Hive and inspecting the files suggests the data is loaded properly; it appears to be an encoding issue in the driver, but I am not aware of any URL connection properties for encoding that the Hive JDBC driver requires.
create table if not exists CERT.TLJA_JP_E ( RNUM int , C1 string, ORD int)
row format delimited
fields terminated by '\t'
stored as textfile;
create table if not exists CERT.TLJA_JP ( RNUM int , C1 string, ORD int)
stored as sequencefile;
load data local inpath '/home/hadoopadmin/jdbc-cert/CERT/CERT.TLJA_JP.txt'
overwrite into table CERT.TLJA_JP_E;
insert overwrite table CERT.TLJA_JP select * from CERT.TLJA_JP_E;
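The symptom described above (data correct on disk, garbled through the driver) is consistent with multi-byte UTF-8 strings being decoded with the wrong charset on the client side. A minimal, self-contained sketch of that failure mode (the class name and sample text are illustrative, not from the Hive codebase):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class EncodingCheck {
    // Decode the same byte sequence with a given charset. A JDBC
    // client that picks the wrong charset here produces mojibake
    // even though the bytes stored in HDFS are correct.
    public static String decode(byte[] bytes, Charset cs) {
        return new String(bytes, cs);
    }

    public static void main(String[] args) {
        String original = "\u65e5\u672c\u8a9e"; // Japanese sample text
        byte[] utf8 = original.getBytes(StandardCharsets.UTF_8);

        // Correct decoding round-trips the string.
        String good = decode(utf8, StandardCharsets.UTF_8);
        // Decoding UTF-8 bytes as a single-byte charset (simulating a
        // client whose default charset is not UTF-8) corrupts the text.
        String bad = decode(utf8, StandardCharsets.ISO_8859_1);

        System.out.println(original.equals(good)); // true
        System.out.println(original.equals(bad));  // false
    }
}
```

If this is the cause, verifying that the client JVM's default charset (`file.encoding`) is UTF-8 would be a reasonable first check before looking at the driver itself.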
Attachments
Issue Links
- relates to: HIVE-7511 Hive: output is incorrect if there are UTF-8 characters in where clause of a hive select query. (Reopened)