TOREE-166

sqlContext not shared with PySpark and sparkR


Details

    • Type: Bug
    • Status: Closed
    • Resolution: Fixed
    • Fix Version: 0.1.0

    Description

      The Scala interpreter and SQL interpreter appear to share the same sqlContext: tables registered in the Scala interpreter can be queried from the SQL interpreter. However, the PySpark and SparkR interpreters each appear to create their own sqlContext on construction, so DataFrames registered against those contexts are not visible to the sqlContext in other interpreters. Would it be possible to instantiate the Python and R interpreters with the same sqlContext as the Scala interpreter?
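      The behavior described above can be sketched without a running Spark cluster. The stand-in class below is hypothetical (not Toree or Spark code) and models the relevant property of a Spark 1.x SQLContext: each context owns its own temp-table catalog, so two separately constructed contexts do not see each other's tables, while a single shared context would.

      ```python
      class FakeSQLContext:
          """Hypothetical stand-in for Spark's SQLContext: each instance
          holds its own temp-table catalog, as in Spark 1.x."""

          def __init__(self):
              self._tables = {}

          def register_temp_table(self, name, df):
              # Registers the table only in THIS context's catalog.
              self._tables[name] = df

          def table_names(self):
              return set(self._tables)


      # Current behavior: each interpreter constructs its own context,
      # so tables registered in one are invisible to the other.
      scala_ctx = FakeSQLContext()
      pyspark_ctx = FakeSQLContext()
      scala_ctx.register_temp_table("events", object())
      assert "events" in scala_ctx.table_names()
      assert "events" not in pyspark_ctx.table_names()  # not shared

      # Requested behavior: hand every interpreter the same context,
      # so one shared catalog is visible everywhere.
      shared_ctx = FakeSQLContext()
      shared_ctx.register_temp_table("events", object())
      assert "events" in shared_ctx.table_names()
      ```

      The sketch shows why passing the Scala interpreter's existing sqlContext into the PySpark and SparkR interpreters at construction time, rather than letting each build its own, would make registered tables visible across all of them.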


          People

            Assignee: Gino Bustelo (lbustelo)
            Reporter: nimbusgo
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: