Bigtop / BIGTOP-2588

Spark 2.0.1 installation fails on DEB

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.2.0
    • Fix Version/s: 1.2.0
    • Component/s: spark
    • Labels:
      None
    • Environment:

      only tried on Ubuntu 16.04

    Description

      home/guest/bigtop_cluster/pkg_spark-2.0.1
      + sudo RUNLEVEL=1 dpkg -i spark-core_2.0.1-1_all.deb spark-datanucleus_2.0.1-1_all.deb spark-external_2.0.1-1_all.deb spark-history-server_2.0.1-1_all.deb spark-master_2.0.1-1_all.deb spark-python_2.0.1-1_all.deb spark-thriftserver_2.0.1-1_all.deb spark-worker_2.0.1-1_all.deb spark-yarn-shuffle_2.0.1-1_all.deb
      Selecting previously unselected package spark-core.
      (Reading database ... 61634 files and directories currently installed.)
      Preparing to unpack spark-core_2.0.1-1_all.deb ...
      Unpacking spark-core (2.0.1-1) ...
      Preparing to unpack spark-datanucleus_2.0.1-1_all.deb ...
      Unpacking spark-datanucleus (2.0.1-1) ...
      dpkg: error processing archive spark-datanucleus_2.0.1-1_all.deb (--install):
      trying to overwrite '/usr/lib/spark/jars/datanucleus-rdbms-3.2.9.jar', which is also in package spark-core 2.0.1-1
      dpkg-deb: error: subprocess paste was killed by signal (Broken pipe)

        Activity

        jonathak Jonathan Kelly added a comment -

        Sorry about this. It's definitely due to my changes in BIGTOP-2569. Note how I specifically exclude the datanucleus jars from spark-core in the RPM spec file, but I forgot to do something similar in the Debian packaging, and I'm not set up to test on Debian/Ubuntu yet. How would I accomplish this there? Please forgive my lack of knowledge of Debian packaging.

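To answer the question above: in Debian packaging, the usual equivalent of the RPM spec's %exclude is to delete the overlapping files from the package's staging tree (for example in debian/rules or the install scripts) before the .deb is assembled. A minimal sketch, using a temp directory as a stand-in for the spark-core staging tree — the paths and jar names below are illustrative, not the actual Bigtop build layout:

```shell
#!/bin/sh
set -e

# Illustrative stand-in for the spark-core staging tree that the build
# populates (the real one would be something like debian/spark-core).
STAGING=$(mktemp -d)
mkdir -p "$STAGING/usr/lib/spark/jars"
touch "$STAGING/usr/lib/spark/jars/datanucleus-rdbms-3.2.9.jar" \
      "$STAGING/usr/lib/spark/jars/datanucleus-core-3.2.10.jar" \
      "$STAGING/usr/lib/spark/jars/spark-core-2.0.1.jar"

# Drop the datanucleus jars from spark-core, mirroring the RPM spec's
# %exclude; spark-datanucleus ships them instead, so dpkg no longer
# sees the same path owned by two packages.
rm -f "$STAGING"/usr/lib/spark/jars/datanucleus-*.jar

ls "$STAGING/usr/lib/spark/jars"   # only the spark jar remains
```

With the datanucleus jars removed from spark-core's payload, installing spark-core and spark-datanucleus together no longer triggers the "trying to overwrite" error from the description.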
oflebbe Olaf Flebbe added a comment - edited

        I am trying out this fix right now.
        Yup. Works.

        jonathak Jonathan Kelly added a comment -

        Thanks, Olaf Flebbe!

        oflebbe Olaf Flebbe added a comment -

        Committing this in a minute

        oflebbe Olaf Flebbe added a comment -

        Amir Sanjar Thanks for reporting.

        cos Konstantin Boudnik added a comment -

        Jonathan Kelly Excluding datanucleus was known to produce non-working Spark installations. Will it be the case again here?

        jonathak Jonathan Kelly added a comment -

        Konstantin Boudnik, Olaf Flebbe's change is for excluding the datanucleus jars from the spark-core package, since they're already in the separate spark-datanucleus package (something we already do in the RPM packaging and which we used to do in the DEB packaging until BIGTOP-2569 broke it).

        This isn't quite the same thing as what you're referring to, which is (AFAIK) that apparently Spark has some problem if built with Hive support (which we are) but without having the spark-datanucleus package installed at runtime.
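
        The distinction matters operationally: with this fix spark-core no longer ships the jars, so a Hive-enabled Spark build only works if spark-datanucleus is actually installed. A hedged sketch of a runtime sanity check, simulated here with a temp directory standing in for /usr/lib/spark/jars:

```shell
#!/bin/sh
set -e

# Stand-in for /usr/lib/spark/jars; on a real install this directory is
# populated by the spark-core and spark-datanucleus packages.
SPARK_JARS=$(mktemp -d)
touch "$SPARK_JARS/datanucleus-rdbms-3.2.9.jar"

# Spark built with Hive support needs the datanucleus jars at runtime;
# if they are absent, installing spark-datanucleus supplies them.
if ls "$SPARK_JARS"/datanucleus-*.jar >/dev/null 2>&1; then
  echo "datanucleus jars present"
else
  echo "datanucleus jars missing: install spark-datanucleus"
fi
```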


          People

          • Assignee:
            oflebbe Olaf Flebbe
            Reporter:
            asanjar Amir Sanjar
          • Votes: 0
            Watchers: 4
