Spark / SPARK-5010

native openblas library doesn't work: undefined symbol: cblas_dscal


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Not A Problem
    • Affects Version/s: 1.3.0
    • Fix Version/s: None
    • Component/s: MLlib
    • Environment: standalone

    Description

      1. Compiled and installed the OpenBLAS library.
      2. ln -s libopenblas_sandybridgep-r0.2.13.so /usr/lib/libblas.so.3
      3. Compiled and built Spark:
      mvn -Pnetlib-lgpl -DskipTests clean compile package
      4. Ran: bin/run-example mllib.LinearRegression data/mllib/sample_libsvm_data.txt

      14/12/30 18:39:57 INFO BlockManagerMaster: Trying to register BlockManager
      14/12/30 18:39:57 INFO BlockManagerMasterActor: Registering block manager localhost:34297 with 265.1 MB RAM, BlockManagerId(<driver>, localhost, 34297)
      14/12/30 18:39:57 INFO BlockManagerMaster: Registered BlockManager
      14/12/30 18:39:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      14/12/30 18:39:58 WARN LoadSnappy: Snappy native library not loaded
      Training: 80, test: 20.
      /usr/local/lib/jdk1.8.0//bin/java: symbol lookup error: /tmp/jniloader1826801168744171087netlib-native_system-linux-x86_64.so: undefined symbol: cblas_dscal
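      A quick way to check the setup in steps 1–2 is to verify that the symlinked library actually exports the failing symbol. This is only a diagnostic sketch: the paths match step 2 above, the liblapack.so.3 check reflects the common requirement that netlib-java's "system" backend also needs a LAPACK at runtime, and the NO_CBLAS hint refers to OpenBLAS's build option of that name.

      ```shell
      BLAS_LIB=/usr/lib/libblas.so.3     # symlink target from step 2 above
      LAPACK_LIB=/usr/lib/liblapack.so.3 # commonly also required by the native stub

      # Check that both libraries are present at the expected paths.
      for LIB in "$BLAS_LIB" "$LAPACK_LIB"; do
        [ -e "$LIB" ] && echo "$LIB present" || echo "$LIB missing"
      done

      # The symbol from the error message should be exported by the BLAS library itself.
      if [ -e "$BLAS_LIB" ] && nm -D "$BLAS_LIB" 2>/dev/null | grep -q 'cblas_dscal'; then
        echo "cblas_dscal exported by $BLAS_LIB"
      else
        echo "cblas_dscal not found in $BLAS_LIB (rebuild OpenBLAS with its CBLAS interface, i.e. without NO_CBLAS=1)"
      fi
      ```

      If cblas_dscal is missing here, the undefined-symbol error above is expected regardless of how Spark was built.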

      I followed the guide at https://spark.apache.org/docs/latest/mllib-guide.html (dependencies section).

      Am I missing something?
      How do I force Spark to use the OpenBLAS library?
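      One way to see which libraries the loader actually resolves for the native stub is to run ldd against the file named in the error message. A sketch only: the /tmp file name pattern comes from the log above, and the extracted stub only exists while (or after) a netlib-backed job has run on that machine.

      ```shell
      # Find an extracted netlib-native stub under /tmp (name pattern taken from the log above).
      STUB=$(ls /tmp/jniloader*netlib-native_system-linux-x86_64.so 2>/dev/null | head -n 1)

      if [ -n "$STUB" ]; then
        # Show which libblas/liblapack the dynamic linker resolves for the stub.
        MSG=$(ldd "$STUB" 2>/dev/null || echo "ldd failed for $STUB")
      else
        MSG="no extracted netlib-native stub found under /tmp"
      fi
      echo "$MSG"
      ```

      If the stub resolves to a BLAS without the CBLAS interface, the cblas_dscal lookup fails exactly as in the log.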

      Thanks, Tomas


          People

            Assignee: Unassigned
            Reporter: Tomas Hudik (xhudik)
            Votes: 0
            Watchers: 2
