Spark / SPARK-24830

Problem with logging on Glassfish


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.3.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Environment: Glassfish 4.1.1 or 4.1.2, Java 1.8.0_151-b12, Windows 10 or Linux (Fedora 20)

      To reproduce the problem, create a simple web project that calls the following code on the server side:
      import java.util.logging.Level;
      import java.util.logging.Logger;

      import org.apache.spark.SparkConf;

      public void sparkTest() {
          try {
              // Logged before Spark touches logging; this still shows up in the Glassfish log.
              Logger.getLogger("net.tp.test.data").log(Level.INFO, "Info 1");

              SparkConf conf = new SparkConf().setAppName("Test");
              // Initializes Spark's log4j configuration if logging is not yet set up.
              conf.initializeLogIfNecessary(false);

              // Logged after Spark initialized logging; this no longer reaches the Glassfish log (the reported problem).
              Logger.getLogger("net.tp.test.data").log(Level.INFO, "Info 2");
          } catch (Throwable ex) {
              Logger.getLogger("net.tp.test.data").log(Level.SEVERE, null, ex);
          }
      }
      The project has the spark-yarn_2.11 and spark-sql_2.11 Maven artifacts included.
      This simple deployment into Glassfish will not work as-is, though. Guava, which is referenced by spark-network-common_2.11, also has to be replaced with version 16.0.1 (I did this by creating a new standalone project with Guava 16 and placing it into Glassfish's domain lib directory so that it gets loaded first). The Jackson libraries of the Glassfish server also need to be replaced with version 2.6.7/2.6.7.1 (this step can be omitted for this particular test).
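      A rough sketch of the corresponding pom.xml dependencies (the 2.3.0 version matches the affected release; the coordinates are the usual Maven Central ones, and anything beyond that is only illustrative):

      <!-- Hypothetical pom.xml fragment for the reproducer web project -->
      <dependencies>
          <!-- Spark driver libraries used from the J2EE application (yarn-client mode) -->
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-yarn_2.11</artifactId>
              <version>2.3.0</version>
          </dependency>
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-sql_2.11</artifactId>
              <version>2.3.0</version>
          </dependency>
      </dependencies>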

    Description

      When the driver program runs inside a J2EE application (in yarn-client mode), the Glassfish server's log messages stop going into the Glassfish log once the Spark application has started. To see these messages, log4j has to be configured in the application that embeds Spark so that messages go to a file (log4j.rootLogger=INFO, file). That makes it possible to see what is happening, but it is only a workaround. The logs of all other applications deployed on Glassfish are affected as well, even though those applications are supposed to run in isolated environments.
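      A rough illustration of that workaround, as a log4j 1.x properties file bundled with the application (only the rootLogger line comes from this report; the appender name, file path, and layout pattern are made up):

      # Hypothetical log4j.properties for the Spark-embedding web application
      log4j.rootLogger=INFO, file

      # Plain file appender so driver-side messages remain visible after Spark
      # initializes its logging and they stop reaching the Glassfish server log.
      log4j.appender.file=org.apache.log4j.FileAppender
      log4j.appender.file.File=/path/to/spark-webapp.log
      log4j.appender.file.layout=org.apache.log4j.PatternLayout
      log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n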

      Note: this concerns the "usual" log messages emitted before and after the driver program receives its results, not the messages that might originate from the code of tasks executed on the workers.

       


          People

            Assignee: Unassigned
            Reporter: vadim (vadz)
            Votes: 1
            Watchers: 2
