Zeppelin / ZEPPELIN-1565

Unable to query MongoDB from Zeppelin using Spark


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Not A Problem
    • Affects Version/s: 0.6.0
    • Fix Version/s: 0.7.0
    • Component/s: zeppelin-server
    • Labels: None

    Description

  Hi Team,

  We are trying to query data from a MongoDB database from Zeppelin using Spark, and we are getting the exception below.

  Can you please look into this and advise what the problem could be? We are querying from a Zeppelin notebook.

      Query:

      %spark

      val options1 = Map("spark.mongodb.input.uri" -> "mongodb://user/password@serverip:37017",
      "spark.mongodb.input.database" -> "dataing",
      "spark.mongodb.input.collection" -> "MEM",
      "spark.mongodb.input.readPreference.name" -> "primaryPreferred")
      val df1 = sqlContext.read.format("com.mongodb.spark.sql").options(options1).load()
      //MongoSpark.load(sqlContext, options1)

      println("df1 Schema:")
      df1.printSchema()

      df1.registerTempTable("MEM")

      val sql1 = "SELECT DB_NAME FROM MEM"
      val results1 = sqlContext.sql(sql1)
      results1.show()

      Error details:
      -------------

      otions1: scala.collection.immutable.Map[String,String] = Map(spark.mongodb.input.uri -> mongodb://rsinha/rsinha123@10.11.5.78:37017, spark.mongodb.input.database -> dataing, spark.mongodb.input.collection -> MEM, spark.mongodb.input.readPreference.name -> primaryPreferred)
      java.lang.IllegalArgumentException: Missing database name. Set via the 'spark.mongodb.input.uri' or 'spark.mongodb.input.database' property
      at com.mongodb.spark.config.MongoCompanionConfig$class.databaseName(MongoCompanionConfig.scala:169)
      at com.mongodb.spark.config.ReadConfig$.databaseName(ReadConfig.scala:35)
      at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:46)
      at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:35)
      at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:83)
      at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:35)
      at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:73)
      at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:35)
      at com.mongodb.spark.sql.DefaultSource.connectorAndReadConfig(DefaultSource.scala:127)
      at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:66)
      at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:52)
      at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:37)
      at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
      at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
      at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
      at $iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
      at $iwC$$iwC$$iwC.<init>(<console>:43)
      at $iwC$$iwC.<init>(<console>:45)
      at $iwC.<init>(<console>:47)
      at <init>(<console>:49)
      at .<init>(<console>:53)
      at .<clinit>(<console>)
      at .<init>(<console>:7)
      at .<clinit>(<console>)
      at $print(<console>)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
      at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
      at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
      at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:810)
      at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:753)
      at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:746)
      at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
      at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
      at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
      at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
      at java.util.concurrent.FutureTask.run(FutureTask.java:262)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      at java.lang.Thread.run(Thread.java:745)


        Activity
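
      For readers hitting the same "Missing database name" error: the issue was resolved as Not A Problem, which points to a configuration mistake rather than a Zeppelin bug. One likely culprit is the connection string in the report — the standard MongoDB URI format separates user and password with a colon, not a slash, and the database name can be given in the URI path. The sketch below is a hedged example, not a confirmed fix from this issue: the host, database, and collection names are taken from the report, while the corrected URI form is an assumption based on the documented MongoDB connection-string format.

      ```scala
      // Hedged sketch of a corrected configuration, assuming the standard
      // MongoDB connection-string format: mongodb://user:password@host:port/database
      // (colon between user and password; database name in the URI path).
      val options1 = Map(
        "spark.mongodb.input.uri"                 -> "mongodb://user:password@serverip:37017/dataing",
        "spark.mongodb.input.database"            -> "dataing",
        "spark.mongodb.input.collection"          -> "MEM",
        "spark.mongodb.input.readPreference.name" -> "primaryPreferred"
      )

      // Same read path as in the report; requires the mongo-spark-connector
      // package on the Spark interpreter classpath.
      val df1 = sqlContext.read
        .format("com.mongodb.spark.sql")
        .options(options1)
        .load()
      ```

      If the error persists with the prefixed keys, passing the short, unprefixed keys ("uri", "database", "collection") to `.options()` may also be worth trying, since how the connector handles the "spark.mongodb.input." prefix in DataFrameReader options has varied across connector versions.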

People

            Assignee: Luciano Resende (lresende)
            Reporter: Ajay Chaudhary (Ajayyodlee)
            Votes: 0
            Watchers: 4
