Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.4.0
    • Fix Version/s: 3.4.0
    • Component/s: Connect, Tests
    • Labels: None

    Description

      Spark Connect does not report test coverage yet, and our scheduled coverage job fails as below:

      ======================================================================
      ERROR [0.000s]: setUpClass (pyspark.sql.tests.connect.test_connect_function.SparkConnectFunctionTests)
      ----------------------------------------------------------------------
      Traceback (most recent call last):
        File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_connect_function.py", line 37, in setUpClass
          ReusedPySparkTestCase.setUpClass()
        File "/__w/spark/spark/python/pyspark/testing/utils.py", line 135, in setUpClass
          cls.sc = SparkContext("local[4]", cls.__name__, conf=cls.conf())
        File "/__w/spark/spark/python/pyspark/context.py", line 196, in __init__
          self._do_init(
        File "/__w/spark/spark/python/pyspark/context.py", line 283, in _do_init
          self._jsc = jsc or self._initialize_context(self._conf._jconf)
        File "/__w/spark/spark/python/pyspark/context.py", line 413, in _initialize_context
          return self._jvm.JavaSparkContext(jconf)
        File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1587, in __call__
          return_value = get_return_value(
        File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 326, in get_return_value
          raise Py4JJavaError(
      py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
      : java.io.IOException: Failed to bind to address 0.0.0.0/0.0.0.0:15002
      	at org.sparkproject.connect.grpc.io.grpc.netty.NettyServer.start(NettyServer.java:328)
      	at org.sparkproject.connect.grpc.io.grpc.internal.ServerImpl.start(ServerImpl.java:183)
      	at org.sparkproject.connect.grpc.io.grpc.internal.ServerImpl.start(ServerImpl.java:92)
      	at org.apache.spark.sql.connect.service.SparkConnectService$.startGRPCService(SparkConnectService.scala:217)
      	at org.apache.spark.sql.connect.service.SparkConnectService$.start(SparkConnectService.scala:222)
      	at org.apache.spark.sql.connect.SparkConnectPlugin$$anon$1.init(SparkConnectPlugin.scala:48)
      	at org.apache.spark.internal.plugin.DriverPluginContainer.$anonfun$driverPlugins$1(PluginContainer.scala:53)
      	at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
      	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
      	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
      	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
      	at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
      	at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
      	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
      	at org.apache.spark.internal.plugin.DriverPluginContainer.<init>(PluginContainer.scala:46)
      	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:210)
      	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:193)
      	at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)
      	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
      	at py4j.Gateway.invoke(Gateway.java:238)
      	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
      	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
      	at java.lang.Thread.run(Thread.java:750)
      Caused by: io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Address already in use

       

      https://github.com/apache/spark/actions/runs/3663716955/jobs/6193686439

       

      We should enable coverage reporting for the Spark Connect tests in the scheduled job.
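      The bind failure above happens because the Spark Connect plugin always tries its default gRPC port 15002, which is still held by another test session in the coverage run. One way to sidestep this in tests is to ask the OS for a free ephemeral port and pass it through the `spark.connect.grpc.binding.port` config. A minimal sketch (the helper name is hypothetical; only the config key comes from Spark Connect):

      ```python
      import socket

      def pick_free_connect_port():
          # Hypothetical helper: ask the kernel for a free ephemeral port so
          # parallel test sessions don't all fight over the default 15002.
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
              s.bind(("127.0.0.1", 0))  # port 0 = let the OS choose
              return s.getsockname()[1]

      port = pick_free_connect_port()

      # The chosen port would then be handed to the Spark Connect plugin via
      # its binding config before the SparkContext is created:
      connect_conf = {"spark.connect.grpc.binding.port": str(port)}
      print(connect_conf)
      ```

      Note there is still a small race between releasing the probe socket and the gRPC server binding it, so a retry around SparkContext startup may be needed in practice.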

    People

      Assignee: gurwls223 Hyukjin Kwon
      Reporter: gurwls223 Hyukjin Kwon