Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version: 2.0.0
Description
When I ran pyspark on Windows, it failed with the following error:
> bin\pyspark
"C:\Users\tsudukim\Documents\workspace\spark-dev3\bin\"
Python 2.7.8 (default, Jun 30 2014, 16:03:49) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
'spark-submit2.cmd' is not recognized as an internal or external command,
operable program or batch file.
Traceback (most recent call last):
File "C:\Users\tsudukim\Documents\workspace\spark-dev3\bin\..\python\pyspark\shell.py", line 38, in <module>
sc = SparkContext()
File "C:\Users\tsudukim\Documents\workspace\spark-dev3\python\pyspark\context.py", line 112, in __init__
SparkContext._ensure_initialized(self, gateway=gateway)
File "C:\Users\tsudukim\Documents\workspace\spark-dev3\python\pyspark\context.py", line 245, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway()
File "C:\Users\tsudukim\Documents\workspace\spark-dev3\python\pyspark\java_gateway.py", line 94, in launch_gateway
raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
>>>
Attachments
Issue Links
- relates to SPARK-11518: "The script spark-submit.cmd can not handle spark directory with space." (Resolved)