Spark / SPARK-7006

Inconsistent behavior for ctrl-c in Spark shells


Details

    • Type: Wish
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 1.3.1
    • Fix Version/s: None
    • Component/s: Spark Shell, YARN
    • Environment: YARN

    Description

      When ctrl-c is pressed in a shell, the behavior is not consistent across spark-sql, spark-shell, and pyspark, which confuses users. Here is the summary:

      Shell         Behavior after ctrl-c
      spark-sql     cancels the running job
      spark-shell   exits the shell
      pyspark       throws an error [1] and does not cancel the job

      pyspark is the worst of the three because it gives the wrong impression that the job has been cancelled when in fact it has not.

      Ideally, every shell should behave like spark-sql, which lets users cancel the running job while staying in the shell. (Pressing ctrl-c twice exits the shell.)
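
      The spark-sql-like behavior described above can be sketched in plain Python: install a SIGINT handler that cancels the running jobs instead of letting KeyboardInterrupt kill or confuse the shell. The FakeSparkContext class below is a stand-in so the sketch runs without Spark; in a real pyspark session you would call cancelAllJobs() on the existing SparkContext, which is a genuine PySpark API. Handling the second ctrl-c (exit) would need extra state and is left out.

```python
import os
import signal

class FakeSparkContext:
    """Stand-in for pyspark.SparkContext so this sketch runs without Spark.

    In a real pyspark shell you would use the existing `sc`, whose
    cancelAllJobs() method is the actual PySpark API for this.
    """
    def __init__(self):
        self.cancelled = False

    def cancelAllJobs(self):
        self.cancelled = True

sc = FakeSparkContext()

def cancel_on_sigint(signum, frame):
    # First ctrl-c: cancel the running Spark jobs but stay in the shell,
    # mirroring spark-sql's behavior instead of exiting or hanging.
    sc.cancelAllJobs()

signal.signal(signal.SIGINT, cancel_on_sigint)

# Simulate the user pressing ctrl-c while a job is running.
os.kill(os.getpid(), signal.SIGINT)
print(sc.cancelled)  # prints True
```

      The key point is that the interpreter stays alive after the interrupt: the handler swallows the signal and only cancels the jobs, so the user keeps their shell session.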

      [1] pyspark error for ctrl-c

      Traceback (most recent call last):
        File "<stdin>", line 1, in <module>
        File "/home/cheolsoop/spark/jars/spark-1.3.1/python/pyspark/sql/dataframe.py", line 284, in count
          return self._jdf.count()
        File "/home/cheolsoop/spark/jars/spark-1.3.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 536, in __call__
        File "/home/cheolsoop/spark/jars/spark-1.3.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 364, in send_command
        File "/home/cheolsoop/spark/jars/spark-1.3.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 473, in send_command
        File "/usr/lib/python2.7/socket.py", line 430, in readline
          data = recv(1)
      KeyboardInterrupt
      


      People

        Assignee: Unassigned
        Reporter: Cheolsoo Park
        Votes: 2
        Watchers: 6
