Ambari / AMBARI-18112

Fix spark.executor.extraLibraryPath to include native gpl library


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: trunk, 2.4.0, 2.5.0
    • Fix Version/s: trunk, 2.4.1
    • Component/s: ambari-server
    • Labels: None

    Description

      When Spark applications run in YARN cluster mode, executors fail with the following error:
      ---------------------------------------------------------------------------------
      java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
      at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1889)
      at java.lang.Runtime.loadLibrary0(Runtime.java:849)
      at java.lang.System.loadLibrary(System.java:1088)
      at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:32)
      at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:71)
      at java.lang.Class.forName0(Native Method)
      .......
      org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      at java.lang.Thread.run(Thread.java:745)
      16/08/08 12:59:03 ERROR LzoCodec: Cannot load native-lzo without native-hadoop.

      The patch sets "spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64" in spark-defaults.conf, so executors can locate the native GPL (LZO) compression library.
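      For reference, the resulting spark-defaults.conf entry would look like the sketch below (the paths follow the HDP layout quoted in the description; exact paths depend on the cluster's stack version):

      ```
      # spark-defaults.conf (HDP layout, per the patch description)
      # Adds the Hadoop native library directories so executors can load libgplcompression
      spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
      ```

      The same setting can also be supplied per job on the command line with `spark-submit --conf spark.executor.extraLibraryPath=...`, which overrides the value in spark-defaults.conf for that submission.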

      Attachments

        1. AMBARI-18112_v1.patch
          4 kB
          Weiqing Yang
        2. AMBARI-18112_v0.patch
          4 kB
          Weiqing Yang


            People

              Assignee: Weiqing Yang
              Reporter: Weiqing Yang
              Votes: 0
              Watchers: 3
