Details
Type: Bug
Status: Resolved
Priority: Minor
Resolution: Fixed
Affects Version/s: None
Fix Version/s: None
Description
On Windows, launching bin\spark-shell.cmd prints an ERROR message and a stack trace:
C:\Users\tsudukim\Documents\workspace\spark-dev3>bin\spark-shell
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
	at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
	at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
	at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
	at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
	at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
	at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:229)
	at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:221)
	at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:209)
	at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:61)
	at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:33)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:865)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:862)
	at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:871)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
	at scala.util.Try$.apply(Try.scala:192)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
	at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
	at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
	at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
	at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
	at scala.collection.immutable.Stream.collect(Stream.scala:435)
	at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:877)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:916)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:916)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
	at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
	at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
	at org.apache.spark.repl.Main$.doMain(Main.scala:64)
	at org.apache.spark.repl.Main$.main(Main.scala:47)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:737)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/03/07 13:05:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context available as sc (master = local[*], app id = local-1457323533704).
SQL context available as sqlContext.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
      /_/

Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sc.textFile("README.md")
res0: org.apache.spark.rdd.RDD[String] = README.md MapPartitionsRDD[1] at textFile at <console>:25

scala> sc.textFile("README.md").count()
res1: Long = 97
spark-shell itself seems to work fine in my simple operation check.
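Since the stack trace bottoms out in Scala's embedded jline rather than in Spark code, the failure can probably be reproduced outside Spark. Below is a minimal repro sketch, assuming the Scala 2.11.7 scala-compiler jar (which contains the scala.tools.jline_embedded classes) is on the classpath; the class and method names come from the stack trace above, while the object name and structure are hypothetical:

// Hypothetical repro sketch: drive the same terminal-probing call path
// the REPL uses, directly.
object TerminalInitRepro {
  def main(args: Array[String]): Unit = {
    try {
      // Same entry point as in the stack trace: TerminalFactory.create()
      // probes the Windows console via the jansi Kernel32 binding.
      val terminal = scala.tools.jline_embedded.TerminalFactory.create()
      println(s"Terminal initialized: ${terminal.getClass.getName}")
    } catch {
      case e: NoClassDefFoundError =>
        // On Windows this is the failure reported above; jline then falls
        // back to an "unsupported" terminal, so the REPL still starts.
        println(s"Terminal initialization failed: $e")
    }
  }
}

If the unsupported-terminal fallback is the only consequence, the ERROR is cosmetic, which would match the successful sc.textFile calls in the transcript above.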