Spark / SPARK-4769

CTAS does not work when reading from temporary tables


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.2.0
    • Fix Version/s: 1.2.0
    • Component/s: SQL
    • Labels: None

Description

      test("double nested data") {
          sparkContext.parallelize(Nested1(Nested2(Nested3(1))) ::
      Nil).registerTempTable("nested")
          checkAnswer(
            sql("SELECT f1.f2.f3 FROM nested"),
            1)
          checkAnswer(sql("CREATE TABLE test_ctas_1234 AS SELECT * from nested"),
      Seq.empty[Row])
          checkAnswer(
            sql("SELECT * FROM test_ctas_1234"),
            sql("SELECT * FROM nested").collect().toSeq)
        }
      
      11:57:15.974 ERROR org.apache.hadoop.hive.ql.parse.SemanticAnalyzer:
      org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:45 Table not found 'nested'
              at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1243)
              at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1192)
              at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9209)
              at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
              at org.apache.spark.sql.hive.execution.CreateTableAsSelect.metastoreRelation$lzycompute(CreateTableAsSelect.scala:59)
              at org.apache.spark.sql.hive.execution.CreateTableAsSelect.metastoreRelation(CreateTableAsSelect.scala:55)
              at org.apache.spark.sql.hive.execution.CreateTableAsSelect.sideEffectResult$lzycompute(CreateTableAsSelect.scala:82)
              at org.apache.spark.sql.hive.execution.CreateTableAsSelect.sideEffectResult(CreateTableAsSelect.scala:70)
              at org.apache.spark.sql.hive.execution.CreateTableAsSelect.execute(CreateTableAsSelect.scala:89)
              at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
              at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
              at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
              at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:105)
              at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:103)
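
      For reference, a standalone sketch that should hit the same error path,
      assuming Spark 1.2.0 built with Hive support and a local master (the
      object name, app name, and println output here are illustrative, not
      part of the original report):

      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.sql.hive.HiveContext

      // Nested case classes mirroring the test data above.
      case class Nested3(f3: Int)
      case class Nested2(f2: Nested3)
      case class Nested1(f1: Nested2)

      object CtasTempTableRepro {
        def main(args: Array[String]): Unit = {
          val sc = new SparkContext(
            new SparkConf().setAppName("ctas-temp-table-repro").setMaster("local"))
          val hiveContext = new HiveContext(sc)
          import hiveContext._ // brings sql(...) and the RDD-to-SchemaRDD implicits into scope

          // Register an RDD of case classes as a temporary table in Spark SQL's catalog.
          sc.parallelize(Nested1(Nested2(Nested3(1))) :: Nil).registerTempTable("nested")

          // Works: the query is resolved entirely by Spark SQL's analyzer.
          sql("SELECT f1.f2.f3 FROM nested").collect().foreach(println)

          // Fails: CreateTableAsSelect re-analyzes the statement with Hive's
          // SemanticAnalyzer, which looks "nested" up in the Hive metastore and
          // throws SemanticException: Table not found 'nested'.
          sql("CREATE TABLE test_ctas_1234 AS SELECT * FROM nested")

          sc.stop()
        }
      }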
      

People

    Assignee: Cheng Hao
    Reporter: Michael Armbrust
    Votes: 0
    Watchers: 2
