  Zeppelin / ZEPPELIN-5423

Running a %spark.conf cell seems to break the #{user} placeholder from the config


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.9.0
    • Fix Version/s: 0.9.1, 0.10.0
    • Component/s: zeppelin-interpreter
    • Labels: None

    Description

      Hello,

      First of all, let me preface this by saying I'm by no means particularly well-versed in Zeppelin/Spark, so please let me know if this behavior is intended or if the issue doesn't make sense.

      (Please note that all of the below is in a Zeppelin + Spark on Kubernetes setup.)

      So the issue is as follows:

      I have a configuration for the Spark interpreter in Zeppelin in which I use the #{user} "placeholder" (not sure about the correct term here) several times to set config values to the LDAP user name of the current user. Most importantly, I have a property "user.name" set to "#{user}", which I then use for authentication against other systems in a custom library that I have on my Spark classpath. This works perfectly fine.
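
      For reference, the relevant part of my interpreter settings looks roughly like this (the second property name is just an illustrative example; the real settings use #{user} in a few more places):

      user.name                              #{user}
      spark.kubernetes.executor.label.owner  #{user}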

      Now, for some notebooks I am trying to tweak this with an additional %spark.conf cell at the top of the note. I just do something along these lines:

       

      %spark.conf
      spark.executor.instances 7
      spark.executor.memory 3G
      spark.executor.cores 2

      This executes fine, but the problem comes up when I actually call my library: when it tries to get the "user.name" value, I get the following output:

      java.util.NoSuchElementException: user.name. 

       

      To simplify the example, I just ran the %spark.conf cell above and then:

      %spark
      sc.getConf.get("user.name")
      

      Which yields the same output:

      java.util.NoSuchElementException: user.name. 
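
      For comparison, in a note without any %spark.conf paragraph the same lookup returns the LDAP user name as expected (the value shown below is just a placeholder):

      %spark
      sc.getConf.get("user.name")
      // res0: String = <ldap-user-name>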

       

      Initially I thought that %spark.conf might reset the whole config and that you would just have to explicitly write out all the config values, but that is obviously not the case, as the rest of my config is indeed still there.
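
      (For example, values from the %spark.conf paragraph and from the interpreter settings are both still visible; the second property name below is only a placeholder for one of my real interpreter-level settings:)

      %spark
      sc.getConf.get("spark.executor.memory")   // "3G", from the %spark.conf paragraph
      sc.getConf.get("spark.some.other.conf")   // placeholder: set only in the interpreter settings, still present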

       

      That's what leads me to believe that there is somehow a broken/unexpected interaction between %spark.conf cells and the #{user} config values.

       

      Thanks for any help,

      Regards,

       

      Theo.

       

          People

            Assignee: Jeff Zhang (zjffdu)
            Reporter: Theo Sardin (theosardin)

              Time Tracking

                Original Estimate: Not Specified
                Remaining Estimate: 0h
                Time Spent: 40m