Spark / SPARK-22168

py4j.protocol.Py4JNetworkError: Error while receiving Socket.timeout: timed out


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: PySpark
    • Labels:
    • Environment:

      Linux - Ubuntu 14.04, Python 3.4

    • Flags: Important

      Description

      Hi all,

      I am looking for a resolution or workaround for the problem below. It would be helpful if somebody could suggest a quick solution.

      Traceback (most recent call last):
        File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1028, in send_command
          answer = smart_decode(self.stream.readline()[:-1])
        File "/usr/lib/python3.4/socket.py", line 374, in readinto
          return self._sock.recv_into(b)
      socket.timeout: timed out

      During handling of the above exception, another exception occurred:

      Traceback (most recent call last):
        File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 883, in send_command
          response = connection.send_command(command)
        File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1040, in send_command
          "Error while receiving", e, proto.ERROR_ON_RECEIVE)
      py4j.protocol.Py4JNetworkError: Error while receiving

      Process Process-1:
      Traceback (most recent call last):
        File "/usr/lib/python3.4/multiprocessing/process.py", line 254, in _bootstrap
          self.run()
        File "/usr/lib/python3.4/multiprocessing/process.py", line 93, in run
          self._target(*self._args, **self._kwargs)
        File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
          answer, self.gateway_client, self.target_id, self.name)
        File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/pyspark/sql/utils.py", line 63, in deco
          return f(*a, **kw)
        File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 327, in get_return_value
          format(target_id, ".", name))
      py4j.protocol.Py4JError: An error occurred while calling o180.fit
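      A Py4J "Error while receiving / socket.timeout" during a long-running JVM call such as `fit()` usually means the Python gateway gave up waiting on the driver JVM (often because the JVM stalled or ran out of memory). A commonly suggested mitigation, not an official resolution of this ticket, is to raise the relevant Spark properties. The sketch below only assembles illustrative configuration values into `spark-submit --conf` flags; the property names are standard Spark settings, but the values are assumptions to tune for your workload:

      ```python
      # Sketch of a common workaround: raise network timeout and driver memory.
      # These are standard Spark configuration keys; the values are illustrative.
      workaround_conf = {
          "spark.network.timeout": "600s",            # default is 120s
          "spark.executor.heartbeatInterval": "60s",  # keep well below the network timeout
          "spark.driver.memory": "4g",                # extra headroom for the fit() call
      }

      # Render the settings as spark-submit flags.
      flags = " ".join(f"--conf {k}={v}" for k, v in workaround_conf.items())
      print(flags)
      ```

      The same keys can be passed programmatically via `SparkSession.builder.config(key, value)` before the session is created. Also note that the traceback shows the call running inside a `multiprocessing` child process (`Process-1`); sharing a SparkContext across forked processes is itself a frequent source of gateway failures.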

            People

            • Assignee: Unassigned
            • Reporter: Krishnaprasad (kpnarayanan)
            • Votes: 0
            • Watchers: 1
