Apache Sedona / SEDONA-181

Build fails with java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.3.0

    Description

      I have a somewhat unusual environment setup: my default java is Java 19,
      while JAVA_HOME points to Java 8. The scalatest-maven-plugin invokes the
      default java instead of the java from JAVA_HOME, and the build fails with:

      2022-10-22 18:37:55,684 INFO  [ScalaTest-main-running-scalaTest] storage.BlockManagerMasterEndpoint (Logging.scala:logInfo(61)) - BlockManagerMasterEndpoint up
      *** RUN ABORTED ***
        java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x2c5ff176) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x2c5ff176
        at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
        at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
        at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
        at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:353)
        at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:290)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:339)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:279)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:464)
        at org.apache.sedona.core.SparkUtil$.sc$lzycompute(SparkUtil.scala:39)
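
      As a local workaround, putting the JAVA_HOME JDK first on PATH makes the
      plugin fork the intended JVM. A minimal sketch, assuming the plugin
      resolves `java` from PATH (the JDK install path below is only an example):

      # Make the `java` on PATH match JAVA_HOME, so the JVM forked by
      # scalatest-maven-plugin is the same Java 8 used by the rest of the build.
      export JAVA_HOME=/usr/lib/jvm/java-8-openjdk   # example install path
      export PATH="$JAVA_HOME/bin:$PATH"
      mvn clean test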
      
      

      I suspect the cause is a bug in scalatest-maven-plugin
      (https://github.com/scalatest/scalatest-maven-plugin/issues/26),
      so I would suggest upgrading the plugin to a version that fixes it.
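
      Alternatively, if the whole build is meant to run on a newer JDK (17+),
      this particular IllegalAccessError can be avoided by exporting sun.nio.ch
      to the unnamed module. A sketch using JAVA_TOOL_OPTIONS; note that Spark
      may need further --add-opens flags on JDK 17+, so this only addresses the
      error shown above:

      # Assumes the whole build (Maven and the forked test JVM) runs on the
      # newer JDK; every JVM that starts reads JAVA_TOOL_OPTIONS from the
      # environment, so the flag also reaches the forked scalatest JVM.
      export JAVA_TOOL_OPTIONS="--add-exports=java.base/sun.nio.ch=ALL-UNNAMED"
      mvn clean test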


          People

            Assignee: Unassigned
            Reporter: Sergey Nuyanzin
            Votes: 0
            Watchers: 1


              Time Tracking

              Estimated: Not Specified
              Remaining: 0h
              Logged: 20m
