We have found that system commands can be injected into Spark interpreter settings (other interpreters' settings may be affected as well). The injected commands are executed whenever a Spark job runs.
This injection can be a security issue in environments where users have permission to change interpreter settings, because it leads to local privilege escalation: normally a user's notes are executed as the special user configured for the interpreter, but the injected commands are executed as the zeppelin user.
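A minimal sketch of the underlying problem, under the assumption that the launcher builds the spark-submit command line by plain string concatenation (the class, method, and property names below are hypothetical, for illustration only):

```java
// Hypothetical sketch: an unvalidated interpreter setting concatenated into a
// shell command line. Shell metacharacters in the setting value become
// additional commands executed as the zeppelin user.
public class InjectionSketch {

    // "value" stands for a setting the user edited in the interpreter UI.
    static String buildCommand(String value) {
        // Naive concatenation: the setting lands verbatim on the command line.
        return "spark-submit --conf spark.driver.extraJavaOptions=" + value;
    }

    public static void main(String[] args) {
        // A benign value:
        System.out.println(buildCommand("-Xmx1g"));
        // A malicious value: if this string reaches a shell, it would also
        // run `touch /tmp/pwned` as the zeppelin user.
        System.out.println(buildCommand("-Xmx1g; touch /tmp/pwned"));
    }
}
```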
(this example also crashes Spark jobs because the command line is truncated)
Here is another example; in this case Spark jobs are executed normally, without interruption.
Parameters should be validated and filtered to prevent injection into the command line.
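One possible shape for such a filter, sketched here as a whitelist check (the class name and the exact set of allowed characters are assumptions, not part of Zeppelin's API; a real fix would tune the whitelist per setting, or avoid the shell entirely by passing arguments as a list):

```java
import java.util.regex.Pattern;

// Hypothetical validator: accept only characters that are harmless on a
// command line, rejecting shell metacharacters such as ';', '|', '`', '$'.
public class SettingValidator {

    // Assumed whitelist: letters, digits, and a few punctuation characters
    // commonly needed in JVM/Spark options.
    private static final Pattern SAFE = Pattern.compile("[A-Za-z0-9._:/=,+-]*");

    static boolean isSafe(String value) {
        return SAFE.matcher(value).matches();
    }

    public static void main(String[] args) {
        System.out.println(isSafe("-Xmx1g"));               // accepted
        System.out.println(isSafe("-Xmx1g; touch /tmp/x")); // rejected: ';' and space
    }
}
```

Rejecting invalid values at save time (when the user edits the interpreter settings) is preferable to filtering at launch time, since it surfaces the error to the user instead of silently changing the command.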