[PHOENIX-2290] Spark Phoenix cannot recognize Phoenix view fields


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 4.5.1
    • Fix Version/s: None
    • Component/s: None

    Description

      I created a base table in the HBase shell:

      create 'test_table',  {NAME => 'cf1', VERSIONS => 1}
      put 'test_table', 'row_key_1', 'cf1:col_1', '200'
      

      This is a very simple table. Then I created a Phoenix view in the Phoenix shell:

      create view "test_table" (pk varchar primary key, "cf1"."col_1" varchar)
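
      Note: in this DDL the unquoted pk is normalized by Phoenix to uppercase PK, while the double-quoted "cf1"."col_1" keeps its lowercase, case-sensitive name. For reference, a minimal sketch (assuming the Phoenix client jar is on the classpath and the same localhost:2181 quorum) of querying the view directly over Phoenix JDBC; the column should resolve there as long as it is double-quoted:

      import java.sql.DriverManager

      // Query the view directly through Phoenix JDBC (outside Spark).
      // The quoted, lowercase column must be double-quoted here too; an
      // unquoted col_1 would be normalized to COL_1 and not found.
      val conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")
      try {
        val rs = conn.createStatement().executeQuery(
          """select * from "test_table" where "col_1" = '200'""")
        while (rs.next()) {
          println(s"PK=${rs.getString(1)}, col_1=${rs.getString(2)}")
        }
      } finally {
        conn.close()
      }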
      

      Then I did the following in the Spark shell:

      val df = sqlContext.load("org.apache.phoenix.spark", Map("table" -> "\"test_table\"",  "zkUrl" -> "localhost:2181"))
      df.registerTempTable("temp")
      
      scala> df.printSchema
      root
       |-- PK: string (nullable = true)
       |-- col_1: string (nullable = true)
      

      sqlContext.sql("select * from temp") ------> This does work

      Then:

      sqlContext.sql("select * from temp where col_1='200' ")
      
      java.lang.RuntimeException: org.apache.phoenix.schema.ColumnNotFoundException: ERROR 504 (42703): Undefined column. columnName=col_1
      	at org.apache.phoenix.mapreduce.PhoenixInputFormat.getQueryPlan(PhoenixInputFormat.java:125)
      	at org.apache.phoenix.mapreduce.PhoenixInputFormat.getSplits(PhoenixInputFormat.java:80)
      	at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:95)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
      	at scala.Option.getOrElse(Option.scala:120)
      	at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
      	at org.apache.phoenix.spark.PhoenixRDD.getPartitions(PhoenixRDD.scala:47)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
      	at scala.Option.getOrElse(Option.scala:120)
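
      The trace shows the failure coming from PhoenixInputFormat.getQueryPlan, i.e. the WHERE clause is pushed down and compiled on the Phoenix side. As a diagnostic, a sketch (assuming the same df as above; Row.getAs by field name needs Spark 1.4+) that keeps the filter on the Spark side, so no predicate reaches Phoenix:

      // Filter the underlying RDD instead of pushing the predicate down to
      // Phoenix; Phoenix only serves the unfiltered scan, so this sidesteps
      // the column-resolution problem at the cost of scanning the whole table.
      val matching = df.rdd.filter(row => row.getAs[String]("col_1") == "200")
      matching.collect().foreach(println)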
      


      I also tried:

      sqlContext.sql("select * from temp where \"col_1\"='200' ")  --> EMPTY result, no exception
      
      sqlContext.sql("select * from temp where \"cf1\".\"col_1\"='200' ")  --> exception, cannot recognize SQL
      

              People

                Assignee: Josh Mahonin (jmahonin)
                Reporter: Fengdong Yu (azuryy)
                Votes: 0
                Watchers: 5
