Spark / SPARK-6771

Table alias in Spark SQL


Details

    • Type: Question
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 1.2.1
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None
    • Environment: Spark 1.2.1 build with Hive 0.13.1 on YARN 2.5.2

Description

I have cached several tables in memory, but I found that Spark reads the data from Hadoop when I run a SQL query that uses a table alias. Is there a configuration setting that can fix this?
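A minimal sketch of the reported scenario against the Spark 1.2 Hive integration is shown below; the table name "src", the column "key", and the object name AliasCacheCheck are hypothetical placeholders, not taken from the report.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object AliasCacheCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("AliasCacheCheck"))
        val hive = new HiveContext(sc)

        // Cache the Hive table in memory (hypothetical table name "src").
        hive.cacheTable("src")

        // Query without an alias: expected to be served from the in-memory cache.
        hive.sql("SELECT key FROM src WHERE key = 1").collect()

        // Query with a table alias: the reporter observes this reading the
        // underlying Hadoop data instead of the cached copy.
        hive.sql("SELECT t.key FROM src t WHERE t.key = 1").collect()

        sc.stop()
      }
    }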

Attachments

Activity


People

Assignee: Unassigned
Reporter: guoqing Jacky Guo
Votes: 0
Watchers: 1

Dates

Created:
Updated:
Resolved:
