Phoenix / PHOENIX-2426

Spark Data Source API Giving Exception


Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 4.4.0, 4.6.0
    • Fix Version/s: 4.7.0
    • Component/s: None
    • Labels: None
    • Environment: CentOS 7.0
    • Flags: Important

    Description

      Table Definition:

      CREATE TABLE EVENT_FACT (
      TENANT_ID VARCHAR NOT NULL,
      EVENT_ID BIGINT,
      EVENT_KEY BIGINT NOT NULL,
      DATA_SOURCE_ID VARCHAR(64),
      DEVICE_TYPE1_KEY BIGINT,
      AUTHENTITYID BIGINT,
      ALARMFOREVENTS_ID BIGINT,
      SEVERITY_NUMBER SMALLINT,
      SEVERITY VARCHAR(20),
      NOTIFICATIONDATE DATE,
      NOTIFICATIONTIMESTAMP TIMESTAMP,
      DAY_IN_MONTH SMALLINT,
      MONTH_NUMBER SMALLINT,
      QUARTER_NUMBER SMALLINT,
      YEAR SMALLINT,
      WEEK_NUMBER SMALLINT,
      YEAR_FOR_WEEK SMALLINT,
      HOUR SMALLINT,
      MINUTE SMALLINT,
      SECOND SMALLINT,
      TIME_KEY INTEGER,
      CLASSNAME VARCHAR(255),
      CATEGORY VARCHAR(255),
      DISPLAYNAME VARCHAR(255),
      DESCRIPTION VARCHAR(1024),
      SOURCE VARCHAR(255),
      EVENTTYPE VARCHAR(255),
      NTTYADDRSS7_ADDRESS VARCHAR(100),
      GENERATEDBY VARCHAR(255),
      SRCOBJECTBUSINESSKEY VARCHAR(1024),
      SRCOBJECTDISPLAYNAME VARCHAR(255),
      OWNING_ENTITY VARCHAR(255),
      PRDCTSRS_VALUE VARCHAR(255),
      PRODUCTTYPE_VALUE VARCHAR(255),
      PRDCTFMLY_VALUE VARCHAR(100),
      SOFTWAREVERSION VARCHAR(100),
      IPADDRESS VARCHAR(50),
      DEVICENAME VARCHAR(255),
      COUNT SMALLINT,
      ELEMENTNAME VARCHAR(255),
      SRCOBJECTID BIGINT,
      CONSTRAINT PK PRIMARY KEY (TENANT_ID, EVENT_KEY)
      ) SALT_BUCKETS=4, COMPRESSION='GZ', VERSIONS=1 , IMMUTABLE_ROWS=true, MULTI_TENANT=true;

      Code:

      val df = sqlContext.load(
        "org.apache.phoenix.spark",
        Map("table" -> "EVENT_FACT", "zkUrl" -> "zookeeper:2181")
      )
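      For reference, the same load expressed through the non-deprecated `DataFrameReader` API (a sketch; per the stack trace, `SQLContext.load` delegates to `DataFrameReader.load`, so this path resolves the same Phoenix schema and fails identically until the type mapping is fixed):

      ```scala
      // Sketch: DataFrameReader equivalent of the deprecated sqlContext.load call.
      // Uses the same "table" and "zkUrl" options as the repro above; it still
      // triggers phoenixSchemaToCatalystSchema, so the MatchError is not avoided.
      val df = sqlContext.read
        .format("org.apache.phoenix.spark")
        .options(Map("table" -> "EVENT_FACT", "zkUrl" -> "zookeeper:2181"))
        .load()
      ```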

      Exception:

      scala.MatchError: SMALLINT (of class org.apache.phoenix.schema.types.PSmallint)
      at org.apache.phoenix.spark.PhoenixRDD.phoenixTypeToCatalystType(PhoenixRDD.scala:134)
      at org.apache.phoenix.spark.PhoenixRDD$$anonfun$phoenixSchemaToCatalystSchema$1.apply(PhoenixRDD.scala:127)
      at org.apache.phoenix.spark.PhoenixRDD$$anonfun$phoenixSchemaToCatalystSchema$1.apply(PhoenixRDD.scala:126)
      at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
      at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
      at scala.collection.Iterator$class.foreach(Iterator.scala:742)
      at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
      at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
      at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
      at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
      at scala.collection.AbstractTraversable.map(Traversable.scala:104)
      at org.apache.phoenix.spark.PhoenixRDD.phoenixSchemaToCatalystSchema(PhoenixRDD.scala:126)
      at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:110)
      at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:57)
      at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:31)
      at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:120)
      at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
      ... 56 elided
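
      The trace points at the `match` expression in `PhoenixRDD.phoenixTypeToCatalystType` (PhoenixRDD.scala:134), which evidently has no case for `PSmallint`, so Scala raises `MatchError` when the schema contains a `SMALLINT` column. A minimal self-contained analogue of the failure mode, using hypothetical stand-in types rather than the real Phoenix classes:

      ```scala
      // Hypothetical stand-ins for Phoenix's PDataType hierarchy (not the real classes).
      sealed trait PType
      case object PInteger extends PType
      case object PSmallint extends PType

      // Mirrors the shape of phoenixTypeToCatalystType: a pattern match with an
      // unhandled input throws scala.MatchError at runtime, as in the report.
      def toCatalystBroken(t: PType): String = t match {
        case PInteger => "IntegerType" // PSmallint is missing here
      }

      // Sketch of the likely fix: add the missing mapping. SMALLINT is a 16-bit
      // value, so Catalyst's ShortType is the natural target.
      def toCatalystFixed(t: PType): String = t match {
        case PInteger  => "IntegerType"
        case PSmallint => "ShortType"
      }
      ```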

      Attachments

        1. PHOENIX-2426.patch (3 kB, Josh Mahonin)
        2. PHOENIX-2426-v2.patch (3 kB, Josh Mahonin)

        Activity

          People

            Assignee: jmahonin (Josh Mahonin)
            Reporter: gcagrici (Gokhan Cagrici)
            Votes: 0
            Watchers: 4

            Dates

              Created:
              Updated:
              Resolved: