SPARK-776

Support adding jars to Spark shell


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.7.3, 0.8.0
    • Component/s: None
    • Labels: None

    Description

      We should add a mechanism to add additional jars to jobs run in the Spark shell, since addJar() doesn't work there (see https://github.com/mesos/spark/pull/359).

      There's a proposal/patch at https://groups.google.com/forum/?fromgroups#!searchin/spark-users/ADD_JAR/spark-users/IBgbLoFWbxw/9AzTrN_iwz4J, but someone needs to test it and submit it as a pull request.

      Spark should also emit a warning or an error when addJar() is called from within spark-shell.
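
      As a minimal sketch of the problem (not from the original report; the jar path, class name, and the ADD_JARS variable below are illustrative assumptions based on the linked proposal):

        // Hypothetical spark-shell session; the jar path and class name are made up.
        sc.addJar("/path/to/my-lib.jar")   // meant to make the jar available to tasks on executors

        // ...but the shell itself does not pick the jar up, so referencing a class
        // from it interactively still fails, e.g.:
        //   scala> import com.example.MyClass
        //   <console>: error: object example is not a member of package com

        // The linked proposal (untested here) instead puts the jars on the shell's
        // classpath at launch, e.g. via an environment variable along the lines of:
        //   ADD_JARS=/path/to/my-lib.jar ./spark-shell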

      Attachments

        Activity


          People

            Assignee: matei Matei Alexandru Zaharia
            Reporter: joshrosen Josh Rosen
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved:
