Details
Type: Bug
Status: Open
Priority: Minor
Resolution: Unresolved
Affects Version/s: 1.6.0
Fix Version/s: None
Component/s: None
Environment: Spark 2.1
Description
Steps:
Index server is running (a configuration sketch is given below).
Create table and load data.
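For context, the index server here is assumed to be enabled through carbon.properties on the driver and index server nodes, roughly as below (property names as documented for the CarbonData index server; host and port values are placeholders, not taken from this cluster):
carbon.enable.index.server=true
carbon.index.server.ip=<index-server-host>
carbon.index.server.port=<index-server-port>
With that in place, the table is created and data is loaded: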
0: jdbc:hive2://10.19.91.221:22550/default> create table brinjal (imei string,AMSize string,channelsId string,ActiveCountry string, Activecity string,gamePointId double,deviceInformationId double,productionDate Timestamp,deliveryDate timestamp,deliverycharge double) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('table_blocksize'='1','SORT_SCOPE'='LOCAL_SORT','carbon.column.compressor'='zstd');
+---------+
| Result  |
+---------+
+---------+
No rows selected (1.757 seconds)
0: jdbc:hive2://10.19.91.221:22550/default> LOAD DATA INPATH 'hdfs://hacluster/chetan/vardhandaterestruct.csv' INTO TABLE brinjal OPTIONS('DELIMITER'=',', 'QUOTECHAR'= '"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'= 'imei,deviceInformationId,AMSize,channelsId,ActiveCountry,Activecity,gamePointId,productionDate,deliveryDate,deliverycharge');
+---------+
| Result  |
+---------+
+---------+
No rows selected (5.349 seconds)
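As an optional check (not part of the original steps), the load can be confirmed to have produced Segment_0 before querying, e.g.:
0: jdbc:hive2://10.19.91.221:22550/default> SHOW SEGMENTS FOR TABLE brinjal;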
Issue: The select filter query fails with a FileNotFoundException.
0: jdbc:hive2://10.19.91.221:22550/default> select * from brinjal where ActiveCountry ='Chinese' or channelsId =4;
Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3061.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3061.0 (TID 134228, linux-220, executor 1): java.io.FileNotFoundException: File does not exist: /user/hive/warehouse/carbon.store/1_6_0/brinjal/Fact/Part0/Segment_0/part-0-0_batchno0-0-0-1560934784938.carbondata
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:74)
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:64)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:648)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1736)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:712)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:402)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:973)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2260)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2256)
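The missing carbondata file can also be cross-checked directly on HDFS as a possible verification step (path copied from the exception above):
hdfs dfs -ls /user/hive/warehouse/carbon.store/1_6_0/brinjal/Fact/Part0/Segment_0/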