SPARK-18169

Suppress warnings when dropping views on a dropped table


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.0.0, 2.0.1
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Apache Spark 2.0.0 through 2.0.2-rc1 logs an inconsistent AnalysisException warning message when dropping a view whose underlying table has already been dropped. This does not happen when dropping temporary views, and Spark 1.6.x does not show the warning either. We should suppress it to be more consistent within Spark 2.x and with Spark 1.6.x.

      scala> sql("create table t(a int)")
      
      scala> sql("create view v as select * from t")
      
      scala> sql("create temporary view tv as select * from t")
      
      scala> sql("drop table t")
      
      scala> sql("drop view tv")
      
      scala> sql("drop view v")
      16/10/29 15:50:03 WARN DropTableCommand: org.apache.spark.sql.AnalysisException: Table or view not found: `default`.`t`; line 1 pos 91;
      'SubqueryAlias v, `default`.`v`
      +- 'Project ['gen_attr_0 AS a#19]
         +- 'SubqueryAlias t
            +- 'Project ['gen_attr_0]
               +- 'SubqueryAlias gen_subquery_0
                  +- 'Project ['a AS gen_attr_0#18]
                     +- 'UnresolvedRelation `default`.`t`
      
      org.apache.spark.sql.AnalysisException: Table or view not found: `default`.`t`; line 1 pos 91;
      'SubqueryAlias v, `default`.`v`
      +- 'Project ['gen_attr_0 AS a#19]
         +- 'SubqueryAlias t
            +- 'Project ['gen_attr_0]
               +- 'SubqueryAlias gen_subquery_0
                  +- 'Project ['a AS gen_attr_0#18]
                     +- 'UnresolvedRelation `default`.`t`
      ...
      res5: org.apache.spark.sql.DataFrame = []
      

      Note that this is a different case from dropping a non-existent view. For a non-existent view, Spark raises NoSuchTableException.
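
      For comparison, a minimal sketch of the non-existent-view case in the same shell (the view name nv is hypothetical; the exact exception message is elided and may differ by version):

      scala> sql("drop view nv")
      org.apache.spark.sql.catalyst.analysis.NoSuchTableException: ...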

      Attachments

        Issue Links

        Activity


          People

            Assignee: Unassigned
            Reporter: Dongjoon Hyun
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved:
