When Ctrl-C is pressed in a shell, the behavior is not consistent across spark-sql, spark-shell, and pyspark, which confuses users. Here is the summary:
||shell||Ctrl-C behavior||
|spark-sql|cancels the running job|
|spark-shell|exits the shell|
|pyspark|throws an error and does not cancel the job|
pyspark is the worst of the three: the error gives the false impression that the job was cancelled, when in fact it keeps running on the cluster.
Ideally, every shell should behave like spark-sql, which lets users cancel the running job while staying in the shell. (Pressing Ctrl-C twice exits the shell.) A sketch of this behavior for pyspark follows.
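One hypothetical way to get the spark-sql-style behavior in pyspark would be to install a SIGINT handler that cancels active jobs and then returns control to the REPL. The sketch below is an illustration, not the actual fix; it assumes the shell's predefined SparkContext `sc` and uses the existing SparkContext.cancelAllJobs() API. The helper name install_ctrl_c_handler is made up for this example.
{code:python}
import signal

def install_ctrl_c_handler(sc):
    """Hypothetical sketch: make Ctrl-C cancel running Spark jobs
    instead of leaving them running on the cluster."""
    def handler(signum, frame):
        # cancelAllJobs() is an existing SparkContext API; it asks the
        # scheduler to cancel every active job on this context.
        sc.cancelAllJobs()
        # Re-raise KeyboardInterrupt so the REPL aborts the current
        # statement and drops back to its prompt instead of hanging.
        raise KeyboardInterrupt
    signal.signal(signal.SIGINT, handler)

# In the pyspark shell, `sc` is the predefined SparkContext:
# install_ctrl_c_handler(sc)
{code}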
Attachment: pyspark error for Ctrl-C
||linked issue||status||
|Ctrl-C in pyspark shell doesn't kill running job|Resolved|