ZEPPELIN-313: Allow different SQLContexts in %spark interpreter


    Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Interpreters

      Description

      I'm currently using the %spark interpreter with a CassandraSQLContext to load data from my Cassandra cluster. The %cassandra interpreter cannot be used because the results have to be post-processed.
      The problem is that the %sql interpreter backed by %spark defaults to the injected sqlContext variable and there is no way to override it, so every table registered through a CassandraSQLContext is invisible to the default sqlContext that %sql uses.
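      A minimal sketch of the mismatch, assuming Spark 1.x with the spark-cassandra-connector on the interpreter classpath (my_keyspace.users is just a placeholder table):

      // %spark paragraph
      import org.apache.spark.sql.cassandra.CassandraSQLContext

      // Build a Cassandra-aware SQLContext from the SparkContext that Zeppelin injects as sc
      val cassandraSqlContext = new CassandraSQLContext(sc)

      // Register a temp table through the Cassandra context
      val users = cassandraSqlContext.sql("SELECT * FROM my_keyspace.users")
      users.registerTempTable("users")

      // Works: the registration lives in cassandraSqlContext's own catalog
      cassandraSqlContext.sql("SELECT count(*) FROM users").show()

      // Fails with "Table not found": the injected sqlContext (the one %sql runs against)
      // is a separate SQLContext instance with its own catalog
      sqlContext.sql("SELECT count(*) FROM users").show()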
      IMHO there are two possible solutions:


    People

    • Assignee: Unassigned
    • Reporter: Stephan Wienczny (wienczny)
    • Votes: 1
    • Watchers: 3
