CarbonData / CARBONDATA-2079

Error is displayed while executing minor compaction on the cluster


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.3.0
    • Fix Version/s: None
    • Component/s: data-load
    • Labels: None
    • Environment: spark 2.1

    Description

      An error is displayed while executing minor compaction on the cluster:

      Steps to Reproduce:

      1) Create Table:

      CREATE TABLE uniqdata_batchsort_compact (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'carbondata' TBLPROPERTIES('SORT_SCOPE'='BATCH_SORT')
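
      Note: SORT_SCOPE='BATCH_SORT' sorts only the rows buffered in memory for each batch of a load, rather than the whole load; the batch size is controlled per load via the batch_sort_size_inmb option used in step 2. Assuming the documented property names, a global equivalent can presumably also be set in carbon.properties:

      carbon.load.sort.scope=BATCH_SORT
      carbon.load.batch.sort.size.inmb=1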

      2) Load Data:

      LOAD DATA INPATH 'HDFS_URL/BabuStore/Data/uniqdata/7000_UniqData.csv' into table uniqdata_batchsort_compact OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1','batch_sort_size_inmb'='1')

      LOAD DATA INPATH 'HDFS_URL/BabuStore/Data/uniqdata/7000_UniqData.csv' into table uniqdata_batchsort_compact OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1','batch_sort_size_inmb'='1')

      LOAD DATA INPATH 'HDFS_URL/BabuStore/Data/uniqdata/7000_UniqData.csv' into table uniqdata_batchsort_compact OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1','batch_sort_size_inmb'='1')

      LOAD DATA INPATH 'HDFS_URL/BabuStore/Data/uniqdata/7000_UniqData.csv' into table uniqdata_batchsort_compact OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1','batch_sort_size_inmb'='1')
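
      The same CSV is loaded four times to create segments 0-3. With the default minor-compaction threshold (carbon.compaction.level.threshold, which defaults to 4,3, i.e. four segments per first-level merge), four loads are presumably the minimum for the minor compaction in step 3 to pick up all of them, which matches the "loads identified for merge is 0..3" lines in the logs below.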

      3) Execute Query:

      alter table uniqdata_batchsort_compact compact 'minor'

      Output:

      Compaction failed. Please check logs for more info. Exception in compaction / by zero
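
      For reference, on a build without this defect the same command should merge segments 0-3 into a single compacted segment 0.1, which can be checked afterwards with:

      SHOW SEGMENTS FOR TABLE uniqdata_batchsort_compact

      Here the command instead fails before any merge starts, as the logs below show.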

      Logs:

      [exec] 18/01/24 16:47:33 INFO SelectQuery: Executing Query: alter table uniqdata_batchsort_compact compact 'minor'
      [exec] 18/01/24 16:47:33 INFO CarbonSparkSqlParser: Parsing command: alter table uniqdata_batchsort_compact compact 'minor'
      [exec] 18/01/24 16:47:33 INFO CarbonLateDecodeRule: main skip CarbonOptimizer
      [exec] 18/01/24 16:47:33 INFO CarbonLateDecodeRule: main Skip CarbonOptimizer
      [exec] 18/01/24 16:47:33 INFO HiveMetaStore: 0: get_table : db=default tbl=uniqdata_batchsort_compact
      [exec] 18/01/24 16:47:33 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=default tbl=uniqdata_batchsort_compact
      [exec] 18/01/24 16:47:33 INFO CatalystSqlParser: Parsing command: array<string>
      [exec] 18/01/24 16:47:33 INFO HiveMetaStore: 0: get_table : db=default tbl=uniqdata_batchsort_compact
      [exec] 18/01/24 16:47:33 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=default tbl=uniqdata_batchsort_compact
      [exec] 18/01/24 16:47:33 INFO CatalystSqlParser: Parsing command: array<string>
      [exec] 18/01/24 16:47:33 AUDIT CarbonAlterTableCompactionCommand: [hadoop-master][root][Thread-1]Compaction request received for table default.uniqdata_batchsort_compact
      [exec] 18/01/24 16:47:33 INFO HdfsFileLock: main HDFS lock path:hdfs://hadoop-master:54311//opt/CarbonStore/default/uniqdata_batchsort_compact/compaction.lock
      [exec] 18/01/24 16:47:34 INFO CarbonAlterTableCompactionCommand: main Acquired the compaction lock for table default.uniqdata_batchsort_compact
      [exec] 18/01/24 16:47:34 INFO CarbonTableCompactor: main loads identified for merge is 0
      [exec] 18/01/24 16:47:34 INFO CarbonTableCompactor: main loads identified for merge is 1
      [exec] 18/01/24 16:47:34 INFO CarbonTableCompactor: main loads identified for merge is 2
      [exec] 18/01/24 16:47:34 INFO CarbonTableCompactor: main loads identified for merge is 3
      [exec] 18/01/24 16:47:34 INFO CarbonTableCompactor: main spark.executor.instances property is set to = 3
      [exec] 18/01/24 16:47:34 INFO TableInfo: main Table block size not specified for default_uniqdata_batchsort_compact. Therefore considering the default value 1024 MB
      [exec] 18/01/24 16:47:34 INFO BlockletDataMap: main Time taken to load blocklet datamap from file : hdfs://hadoop-master:54311//opt/CarbonStore/default/uniqdata_batchsort_compact/Fact/Part0/Segment_0/0_batchno0-0-1516792651041.carbonindexis 0
      [exec] 18/01/24 16:47:34 INFO BlockletDataMap: main Time taken to load blocklet datamap from file : hdfs://hadoop-master:54311//opt/CarbonStore/default/uniqdata_batchsort_compact/Fact/Part0/Segment_1/0_batchno0-0-1516792651780.carbonindexis 1
      [exec] 18/01/24 16:47:34 INFO BlockletDataMap: main Time taken to load blocklet datamap from file : hdfs://hadoop-master:54311//opt/CarbonStore/default/uniqdata_batchsort_compact/Fact/Part0/Segment_2/0_batchno0-0-1516792652481.carbonindexis 0
      [exec] 18/01/24 16:47:34 INFO BlockletDataMap: main Time taken to load blocklet datamap from file : hdfs://hadoop-master:54311//opt/CarbonStore/default/uniqdata_batchsort_compact/Fact/Part0/Segment_3/0_batchno0-0-1516792653232.carbonindexis 0
      [exec] 18/01/24 16:47:34 ERROR CarbonTableCompactor: main Exception in compaction thread / by zero
      [exec] java.lang.ArithmeticException: / by zero
      [exec] at org.apache.carbondata.processing.util.CarbonLoaderUtil.nodeBlockMapping(CarbonLoaderUtil.java:524)
      [exec] at org.apache.carbondata.processing.util.CarbonLoaderUtil.nodeBlockMapping(CarbonLoaderUtil.java:453)
      [exec] at org.apache.carbondata.spark.rdd.CarbonMergerRDD.getPartitions(CarbonMergerRDD.scala:400)
      [exec] at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
      [exec] at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
      [exec] at scala.Option.getOrElse(Option.scala:121)
      [exec] at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
      [exec] at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
      [exec] at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:935)
      [exec] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [exec] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
      [exec] at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
      [exec] at org.apache.spark.rdd.RDD.collect(RDD.scala:934)
      [exec] at org.apache.carbondata.spark.rdd.CarbonTableCompactor.triggerCompaction(CarbonTableCompactor.scala:211)
      [exec] at org.apache.carbondata.spark.rdd.CarbonTableCompactor.scanSegmentsAndSubmitJob(CarbonTableCompactor.scala:120)
      [exec] at org.apache.carbondata.spark.rdd.CarbonTableCompactor.executeCompaction(CarbonTableCompactor.scala:71)
      [exec] at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$$anon$2.run(CarbonDataRDDFactory.scala:182)
      [exec] at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.startCompactionThreads(CarbonDataRDDFactory.scala:269)
      [exec] at org.apache.spark.sql.execution.command.management.CarbonAlterTableCompactionCommand.alterTableForCompaction(CarbonAlterTableCompactionCommand.scala:258)
      [exec] at org.apache.spark.sql.execution.command.management.CarbonAlterTableCompactionCommand.processData(CarbonAlterTableCompactionCommand.scala:111)
      [exec] at org.apache.spark.sql.execution.command.DataCommand.run(package.scala:71)
      [exec] at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
      [exec] at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
      [exec] at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
      [exec] at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
      [exec] at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
      [exec] at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
      [exec] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [exec] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
      [exec] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
      [exec] at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
      [exec] at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
      [exec] at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
      [exec] at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
      [exec] at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
      [exec] at com.huawei.spark.SessionManager.sql(SessionManager.java:42)
      [exec] at com.huawei.querymanagement.QueryManagement.sql(QueryManagement.java:62)
      [exec] at com.huawei.querymanagement.SelectQuery.testQuery(SelectQuery.java:70)
      [exec] at sun.reflect.GeneratedMethodAccessor66.invoke(Unknown Source)
      [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      [exec] at java.lang.reflect.Method.invoke(Method.java:498)
      [exec] at org.junit.internal.runners.TestMethod.invoke(TestMethod.java:59)
      [exec] at org.junit.internal.runners.MethodRoadie.runTestMethod(MethodRoadie.java:98)
      [exec] at org.junit.internal.runners.MethodRoadie$2.run(MethodRoadie.java:79)
      [exec] at org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:87)
      [exec] at org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77)
      [exec] at org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42)
      [exec] at org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88)
      [exec] at org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51)
      [exec] at org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98)
      [exec] at org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
      [exec] at org.junit.runners.Parameterized.access$000(Parameterized.java:55)
      [exec] at org.junit.runners.Parameterized$1.run(Parameterized.java:131)
      [exec] at org.junit.internal.runners.ClassRoadie.runUnprotected(ClassRoadie.java:27)
      [exec] at org.junit.internal.runners.ClassRoadie.runProtected(ClassRoadie.java:37)
      [exec] at org.junit.runners.Parameterized.run(Parameterized.java:129)
      [exec] at org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
      [exec] at org.junit.internal.runners.CompositeRunner.run(CompositeRunner.java:28)
      [exec] at org.junit.runner.JUnitCore.run(JUnitCore.java:130)
      [exec] at org.junit.runner.JUnitCore.run(JUnitCore.java:109)
      [exec] at org.junit.runner.JUnitCore.run(JUnitCore.java:100)
      [exec] at org.junit.runner.JUnitCore.runClasses(JUnitCore.java:60)
      [exec] at com.huawei.querymanagement.SelectQuerySuite.main(SelectQuerySuite.java:18)
      [exec] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      [exec] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      [exec] at java.lang.reflect.Method.invoke(Method.java:498)
      [exec] at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
      [exec] at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
      [exec] at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
      [exec] at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
      [exec] at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      [exec] 18/01/24 16:47:34 ERROR CarbonDataRDDFactory$: main Exception in compaction thread / by zero
      [exec] 18/01/24 16:47:34 INFO HdfsFileLock: main Deleted the lock file hdfs://hadoop-master:54311//opt/CarbonStore/default/uniqdata_batchsort_compact/compaction.lock
      [exec] 18/01/24 16:47:34 ERROR CarbonAlterTableCompactionCommand: main Exception in start compaction thread. Exception in compaction / by zero
      [exec] 18/01/24 16:47:34 ERROR HdfsFileLock: main Not able to delete the lock file because it is not existed in location hdfs://hadoop-master:54311//opt/CarbonStore/default/uniqdata_batchsort_compact/compaction.lock
      [exec] 18/01/24 16:47:34 ERROR SelectQuery: An exception has occurred:
      [exec] org.apache.spark.sql.AnalysisException: Compaction failed. Please check logs for more info. Exception in compaction / by zero;
      [exec] at org.apache.spark.sql.util.CarbonException$.analysisException(CarbonException.scala:23)
      [exec] at org.apache.spark.sql.execution.command.management.CarbonAlterTableCompactionCommand.processData(CarbonAlterTableCompactionCommand.scala:120)
      [exec] at org.apache.spark.sql.execution.command.DataCommand.run(package.scala:71)
      [exec] at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
      [exec] at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
      [exec] at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
      [exec] at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
      [exec] at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
      [exec] at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
      [exec] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [exec] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
      [exec] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
      [exec] at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
      [exec] at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
      [exec] at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
      [exec] at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
      [exec] at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
      [exec] at com.huawei.spark.SessionManager.sql(SessionManager.java:42)
      [exec] at com.huawei.querymanagement.QueryManagement.sql(QueryManagement.java:62)
      [exec] at com.huawei.querymanagement.SelectQuery.testQuery(SelectQuery.java:70)
      [exec] at sun.reflect.GeneratedMethodAccessor66.invoke(Unknown Source)
      [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      [exec] at java.lang.reflect.Method.invoke(Method.java:498)
      [exec] at org.junit.internal.runners.TestMethod.invoke(TestMethod.java:59)
      [exec] at org.junit.internal.runners.MethodRoadie.runTestMethod(MethodRoadie.java:98)
      [exec] at org.junit.internal.runners.MethodRoadie$2.run(MethodRoadie.java:79)
      [exec] at org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:87)
      [exec] at org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77)
      [exec] at org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42)
      [exec] at org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88)
      [exec] at org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51)
      [exec] at org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98)
      [exec] at org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
      [exec] at org.junit.runners.Parameterized.access$000(Parameterized.java:55)
      [exec] at org.junit.runners.Parameterized$1.run(Parameterized.java:131)
      [exec] at org.junit.internal.runners.ClassRoadie.runUnprotected(ClassRoadie.java:27)
      [exec] at org.junit.internal.runners.ClassRoadie.runProtected(ClassRoadie.java:37)
      [exec] at org.junit.runners.Parameterized.run(Parameterized.java:129)
      [exec] at org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
      [exec] at org.junit.internal.runners.CompositeRunner.run(CompositeRunner.java:28)
      [exec] at org.junit.runner.JUnitCore.run(JUnitCore.java:130)
      [exec] at org.junit.runner.JUnitCore.run(JUnitCore.java:109)
      [exec] at org.junit.runner.JUnitCore.run(JUnitCore.java:100)
      [exec] at org.junit.runner.JUnitCore.runClasses(JUnitCore.java:60)
      [exec] at com.huawei.querymanagement.SelectQuerySuite.main(SelectQuerySuite.java:18)
      [exec] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      [exec] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      [exec] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      [exec] at java.lang.reflect.Method.invoke(Method.java:498)
      [exec] at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
      [exec] at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
      [exec] at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
      [exec] at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
      [exec] at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
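
      Reading the trace: the exception is thrown in CarbonLoaderUtil.nodeBlockMapping (CarbonLoaderUtil.java:524) while CarbonMergerRDD.getPartitions distributes the blocks of segments 0-3 across cluster nodes. Although spark.executor.instances is reported as 3, the node list used as the divisor apparently resolves to empty. A minimal sketch of this failure pattern, with hypothetical names and simplified logic (not the actual CarbonData code):

      import java.util.Arrays;
      import java.util.Collections;
      import java.util.List;

      public class NodeBlockMappingSketch {

          // Assign an equal share of blocks to each node. If node detection
          // yields an empty list, this division throws
          // java.lang.ArithmeticException: / by zero, the error in the trace.
          static int blocksPerNode(List<String> blockPaths, List<String> activeNodes) {
              return blockPaths.size() / activeNodes.size();
          }

          public static void main(String[] args) {
              List<String> blocks = Arrays.asList("blk0", "blk1", "blk2", "blk3");
              // Simulate the suspected condition: no active node resolved.
              System.out.println(blocksPerNode(blocks, Collections.<String>emptyList()));
          }
      }

      A defensive fix along these lines would clamp the divisor (for example Math.max(activeNodes.size(), 1)) or fail fast with a descriptive message when no active node can be resolved, instead of letting the raw ArithmeticException abort the compaction.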

      Attachments

        1. 7000_UniqData.csv (1.44 MB, attached by Vandana Yadav)

          People

            • Assignee: Unassigned
            • Reporter: Vandana Yadav (Vandana7)
