Spark / SPARK-39457 Support IPv6-only environment / SPARK-39459

local*HostName* methods should support IPv6


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version: 3.4.0
    • Fix Version: 3.4.0
    • Component: Spark Core
    • Labels: None

    Description

      ➜  ./bin/spark-shell
      22/06/09 14:52:35 WARN Utils: Your hostname, DBs-Mac-mini-2.local resolves to a loopback address: 127.0.0.1; using 2600:1700:1151:11ef:0:0:0:2000 instead (on interface en1)
      22/06/09 14:52:35 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
      22/06/09 14:52:43 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      22/06/09 14:52:44 ERROR SparkContext: Error initializing SparkContext.
      java.lang.AssertionError: assertion failed: Expected hostname or IPv6 IP enclosed in [] but got 2600:1700:1151:11ef:0:0:0:2000
      	at scala.Predef$.assert(Predef.scala:223) ~[scala-library-2.12.15.jar:?]
      	at org.apache.spark.util.Utils$.checkHost(Utils.scala:1110) ~[spark-core_2.12-3.2.0.jar:3.2.0.37]
      	at org.apache.spark.executor.Executor.<init>(Executor.scala:89) ~[spark-core_2.12-3.2.0.jar:3.2.0.37]
      	at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64) ~[spark-core_2.12-3.2.0.jar:3.2.0]
      	at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132) ~[spark-core_2.12-3.2.0.jar:3.2.0]
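
The assertion fails because `Utils.checkHost` accepts a hostname or a bracketed IPv6 literal (`[addr]`), while the resolved local address is a raw IPv6 string. A minimal sketch of the kind of normalization the local*HostName* methods need, using a hypothetical helper (`normalizeHost` is illustrative, not the actual Spark API):

```scala
object HostNameUtil {
  // Hypothetical helper: wrap a raw IPv6 literal in brackets so it passes
  // checks like Utils.checkHost, which expects "hostname" or "[ipv6]".
  // A raw IPv6 literal contains ':' and is not already bracketed;
  // hostnames and IPv4 addresses are returned unchanged.
  def normalizeHost(host: String): String =
    if (host.contains(":") && !host.startsWith("[")) s"[$host]" else host
}
```

With this, the address from the log above would be reported as `[2600:1700:1151:11ef:0:0:0:2000]`, which satisfies the assertion, while `127.0.0.1` and ordinary hostnames pass through untouched.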
      

    People

    Assignee: Dongjoon Hyun (dongjoon)
    Reporter: DB Tsai (dbtsai)
