SPARK-27203: Spark fails to read a view that uses a CTE (WITH clause) and was created via Beeline


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.1.1
    • Fix Version/s: None
    • Component/s: SQL

    Description

      Spark fails when trying to read a view whose definition involves a CTE (WITH clause) and which was created via Beeline.

      For example, consider the following view, created via Beeline:

      create view db.test as 
      with q1 as (select 1 as n)
      select n from q1
      

      When you do

      spark.sql("select * from db.test").show()
      

      The output looks like:

      'Table or view not found: q1; line 2 pos 14'
      Traceback (most recent call last):
        File "/DATA/fs11/hadoop/yarn/local/usercache/ingouagn/appcache/application_1552973526615_3878/container_e380_1552973526615_3878_01_000001/pyspark.zip/pyspark/sql/session.py", line 545, in sql
          return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
        File "/DATA/fs11/hadoop/yarn/local/usercache/ingouagn/appcache/application_1552973526615_3878/container_e380_1552973526615_3878_01_000001/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
          answer, self.gateway_client, self.target_id, self.name)
        File "/DATA/fs11/hadoop/yarn/local/usercache/ingouagn/appcache/application_1552973526615_3878/container_e380_1552973526615_3878_01_000001/pyspark.zip/pyspark/sql/utils.py", line 69, in deco
          raise AnalysisException(s.split(': ', 1)[1], stackTrace)
      pyspark.sql.utils.AnalysisException: 'Table or view not found: q1; line 2 pos 14'
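
      One way to narrow this down may be to look at the view text that the metastore actually stored, since the error suggests Spark re-parses that stored definition and then fails to resolve the CTE name q1. For instance, from Beeline (the exact output depends on the Hive version):

      -- diagnostic sketch: show the stored view definition for the view from the repro above
      describe extended db.test;
      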
      

       

      Spark: 2.1.1

      Beeline: 1.2.1000
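
      A possible workaround (sketched here, not verified on this exact setup) is to recreate the view with the CTE inlined as a plain subquery, so that no WITH clause ends up in the stored view text. The name db.test_noncte below is just an example:

      -- workaround sketch: same result as db.test, but without a WITH clause
      create view db.test_noncte as
      select n from (select 1 as n) q1
      

      Reading it the same way, spark.sql("select * from db.test_noncte").show(), should then return the single row n = 1.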

       


People

    Assignee: Unassigned
    Reporter: Igor Ngouagna (igorng)
    Votes: 0
    Watchers: 4
