[CARBONDATA-1048] Update Hive Guide

Details

    • Type: Improvement
    • Status: Closed
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 1.0.0-incubating
    • Fix Version/s: None
    • Component/s: hive-integration
    • Labels: None
    • Environment: hive 1.2.1, spark 2.1

    Description

      Selecting data from a table with a decimal column raises an exception in Hive when following the steps given in the Hive guide.

      1) In Spark Shell :
      a) Create Table -
      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.CarbonSession._
      val carbon = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://localhost:54310/opt/data")
      scala> carbon.sql(""" create table testHive1(id int,name string,scale decimal(10,0),country string,salary double) stored by 'carbondata' """).show
      b) Load Data -
      scala> carbon.sql(""" load data inpath 'hdfs://localhost:54310/Files/testHive1.csv' into table testHive1 """ ).show
      2) In Hive :
      a) Add Jars -
      add jar /home/neha/incubator-carbondata/assembly/target/scala-2.11/carbondata_2.11-1.1.0-incubating-SNAPSHOT-shade-hadoop2.7.2.jar;
      add jar /opt/spark-2.1.0-bin-hadoop2.7/jars/spark-catalyst_2.11-2.1.0.jar;
      add jar /home/neha/incubator-carbondata/integration/hive/carbondata-hive-1.1.0-incubating-SNAPSHOT.jar;
      b) Create Table -
      create table testHive1(id int,name string,scale decimal(10,0),country string,salary double);
      c) Alter location -
      hive> alter table testHive1 set LOCATION 'hdfs://localhost:54310/opt/data/default/testhive1' ;
      d) Set Properties -
      set hive.mapred.supports.subdirectories=true;
      set mapreduce.input.fileinputformat.input.dir.recursive=true;
      e) Alter FileFormat -
      alter table testHive1 set FILEFORMAT
      INPUTFORMAT "org.apache.carbondata.hive.MapredCarbonInputFormat"
      OUTPUTFORMAT "org.apache.carbondata.hive.MapredCarbonOutputFormat"
      SERDE "org.apache.carbondata.hive.CarbonHiveSerDe";
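      Optionally, the format change can be verified with a standard Hive command (not part of the original steps); the Storage Information section of the output should show the CarbonData input/output formats and SerDe set above:

      hive> describe formatted testHive1;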

      hive> ADD JAR /home/hduser/spark-2.1.0-bin-hadoop2.7/jars/spark-catalyst_2.11-2.1.0.jar;
      Added [/home/hduser/spark-2.1.0-bin-hadoop2.7/jars/spark-catalyst_2.11-2.1.0.jar] to class path
      Added resources: [/home/hduser/spark-2.1.0-bin-hadoop2.7/jars/spark-catalyst_2.11-2.1.0.jar]

      f) Execute Queries -
      select * from testHive1;
      3) Query :
      hive> select * from testHive1;

      Exception in thread "[main][partitionID:hive25;queryID:4537623368167]" java.lang.NoClassDefFoundError: scala/math/Ordered
      at java.lang.ClassLoader.defineClass1(Native Method)
      at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
      at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)

      When I add the scala-library and scala-reflect jars it works fine; the NoClassDefFoundError above shows that the Scala runtime classes are not on Hive's classpath by default.

      hive> ADD JAR /home/knoldus/Videos/scala-library-2.11.1.jar;
      Added [/home/knoldus/Videos/scala-library-2.11.1.jar] to class path
      Added resources: [/home/knoldus/Videos/scala-library-2.11.1.jar]
      hive> ADD JAR /home/knoldus/Videos/scala-reflect-2.11.1.jar;
      Added [/home/knoldus/Videos/scala-reflect-2.11.1.jar] to class path
      Added resources: [/home/knoldus/Videos/scala-reflect-2.11.1.jar]

      Fired the query again:

      hive> select * from testHive1;
      OK
      2 runlin 2 china 33000.2
      1 yuhai 2 china 33000.1
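      For reference, the data file itself is not attached; a testHive1.csv consistent with the table schema and with the two rows returned above could look like this (an assumed example, not the actual file):

      1,yuhai,2,china,33000.1
      2,runlin,2,china,33000.2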

      So it is better to mention adding these jars in the Hive guide; a possible snippet for the guide is sketched below.
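      A minimal sketch of the ADD JAR list the updated guide could carry, assuming Spark 2.1.0 with Scala 2.11 as in this report (the paths are placeholders, not fixed locations):

      -- jars required in the Hive CLI before querying a CarbonData table
      add jar <carbondata-assembly>/carbondata_2.11-1.1.0-incubating-SNAPSHOT-shade-hadoop2.7.2.jar;
      add jar <carbondata-hive-integration>/carbondata-hive-1.1.0-incubating-SNAPSHOT.jar;
      add jar <spark-home>/jars/spark-catalyst_2.11-2.1.0.jar;
      -- the Scala runtime is not on Hive's classpath by default
      add jar <scala-home>/scala-library-2.11.1.jar;
      add jar <scala-home>/scala-reflect-2.11.1.jar;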

    People

      Assignee: anubhav tarar
      Reporter: anubhav tarar

    Time Tracking

      Original Estimate: Not Specified
      Remaining Estimate: 0h
      Time Spent: 1h 40m