TOREE-222

Makefile should provide an install option


Details

    • Type: Improvement
    • Status: Closed
    • Resolution: Resolved
    • Affects Version/s: None
    • Fix Version/s: 0.1.0
    • Component/s: None
    • Labels: None

    Description

      Similar to how sbt pack generates a Makefile with an install target that copies the jars and launcher script to $HOME/local/bin and $HOME/local/lib, we should provide an install option of our own.

      What this really needs to do is generate the kernel.json files that point to the sparkkernel script, and possibly copy that script and its associated assembly jar to a standard location.
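      The copy half of that install step could look something like the sketch below. The dist/ source paths, the PREFIX variable, and the install_kernel helper name are assumptions for illustration, not existing project conventions:

```shell
#!/bin/sh
# Hypothetical sketch of the copy step: place the launcher script in
# $HOME/local/bin and the assembly jar in $HOME/local/lib.
# PREFIX and the argument paths are assumptions, not project conventions.
PREFIX="${PREFIX:-$HOME/local}"

install_kernel() {
    script="$1"   # path to the built sparkkernel launcher script
    jar="$2"      # path to the built assembly jar
    mkdir -p "$PREFIX/bin" "$PREFIX/lib"
    cp "$script" "$PREFIX/bin/"
    chmod +x "$PREFIX/bin/$(basename "$script")"
    cp "$jar" "$PREFIX/lib/"
}
```

      A Makefile install target would just wrap these commands so they run on make install.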

      In my old setup, I had four kernel.json files, one per language supported by the kernel.

      {
          "display_name": "Spark 1.5.0 (Scala 2.10.4)",
          "language": "scala",
          "argv": [
              "/Users/senkwich/local/bin/sparkkernel",
              "--profile",
              "{connection_file}",
              "--default-interpreter",
              "scala"
          ],
          "codemirror_mode": "scala",
          "env": {
              "JVM_OPT": "-Xms1024M -Xmx4096M -Dlog4j.logLevel=trace",
              "MAX_INTERPRETER_THREADS": "16",
              "SPARK_CONFIGURATION": "spark.cores.max=4",
              "CAPTURE_STANDARD_OUT": "true",
              "CAPTURE_STANDARD_ERR": "true",
              "SEND_EMPTY_OUTPUT": "false",
              "SPARK_HOME": "/opt/spark-1.5.0-bin-hadoop2.3"
          }
      }
      
      {
          "display_name": "Spark 1.5.0 (Python)",
          "language": "python",
          "argv": [
              "/Users/senkwich/local/bin/sparkkernel",
              "--profile",
              "{connection_file}",
              "--default-interpreter",
              "pyspark"
          ],
          "codemirror_mode": "python",
          "env": {
              "JVM_OPT": "-Xms1024M -Xmx4096M -Dlog4j.logLevel=trace -cp ~/Downloads/spark-hive_2.10-1.5.1.jar",
              "MAX_INTERPRETER_THREADS": "16",
              "SPARK_CONFIGURATION": "spark.cores.max=4",
              "CAPTURE_STANDARD_OUT": "true",
              "CAPTURE_STANDARD_ERR": "true",
              "SEND_EMPTY_OUTPUT": "false",
              "SPARK_HOME": "/opt/spark-1.5.0-bin-hadoop2.3"
          }
      }
      
      {
          "display_name": "Spark 1.5.0 (R)",
          "language": "r",
          "argv": [
              "/Users/senkwich/local/bin/sparkkernel",
              "--profile",
              "{connection_file}",
              "--default-interpreter",
              "sparkr"
          ],
          "codemirror_mode": "r",
          "env": {
              "JVM_OPT": "-Xms1024M -Xmx4096M -Dlog4j.logLevel=trace -cp ~/Downloads/spark-hive_2.10-1.5.1.jar",
              "MAX_INTERPRETER_THREADS": "16",
              "SPARK_CONFIGURATION": "spark.cores.max=4",
              "CAPTURE_STANDARD_OUT": "true",
              "CAPTURE_STANDARD_ERR": "true",
              "SEND_EMPTY_OUTPUT": "false",
              "SPARK_HOME": "/opt/spark-1.5.0-bin-hadoop2.3"
          }
      }
      
      {
          "display_name": "Spark 1.5.0 (SQL)",
          "language": "sql",
          "argv": [
              "/Users/senkwich/local/bin/sparkkernel",
              "--profile",
              "{connection_file}",
              "--default-interpreter",
              "sql"
          ],
          "codemirror_mode": "sql",
          "env": {
              "JVM_OPT": "-Xms1024M -Xmx4096M -Dlog4j.logLevel=trace -cp ~/Downloads/spark-hive_2.10-1.5.1.jar",
              "MAX_INTERPRETER_THREADS": "16",
              "SPARK_CONFIGURATION": "spark.cores.max=4",
              "CAPTURE_STANDARD_OUT": "true",
              "CAPTURE_STANDARD_ERR": "true",
              "SEND_EMPTY_OUTPUT": "false",
              "SPARK_HOME": "/opt/spark-1.5.0-bin-hadoop2.3"
          }
      }
      
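      Rather than maintaining four nearly identical files by hand, the install step could generate them from a list of interpreters. A minimal sketch follows; the kernels/ output directory, the spark_&lt;interpreter&gt; naming, and the omission of the env block are all assumptions for illustration:

```shell
#!/bin/sh
# Hypothetical generator for per-interpreter kernel.json files like the
# examples above. Output layout and naming are assumptions, not a spec.
PREFIX="${PREFIX:-$HOME/local}"
KERNELS_DIR="${KERNELS_DIR:-kernels}"

# interpreter:display name:language triples matching the examples above
for spec in "scala:Scala 2.10.4:scala" "pyspark:Python:python" \
            "sparkr:R:r" "sql:SQL:sql"; do
    interp=${spec%%:*}      # e.g. "pyspark"
    rest=${spec#*:}
    name=${rest%%:*}        # e.g. "Python"
    lang=${rest#*:}         # e.g. "python"
    dir="$KERNELS_DIR/spark_$interp"
    mkdir -p "$dir"
    cat > "$dir/kernel.json" <<EOF
{
    "display_name": "Spark 1.5.0 ($name)",
    "language": "$lang",
    "argv": [
        "$PREFIX/bin/sparkkernel",
        "--profile",
        "{connection_file}",
        "--default-interpreter",
        "$interp"
    ],
    "codemirror_mode": "$lang"
}
EOF
done
```

      The env block from the examples could be merged into the heredoc the same way; it is left out here only to keep the sketch short.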

            People

              Assignee: Lull3rSkat3r (Corey A Stubbs)
              Reporter: chipsenkbeil (Chip Senkbeil)
              Votes: 0
              Watchers: 2
