SOLR-4: java.io.FileNotFoundException: (Too many open files)

Details

• Type: Bug
• Status: Closed
• Priority: Blocker
• Resolution: Invalid
• Affects Version/s: None
• Fix Version/s: None
• Component/s: update
• Labels: None
• Environment: Linux - Ubuntu Dapper, latest JRE 1.

Description

After adding a few thousand files, I get the following error when adding or deleting:

      <result status="1">java.io.FileNotFoundException: /home/me/Desktop/www/ftpsearch/admin/lucene/database/file_index/data/index/_5o.tii (Too many open files)
      at java.io.RandomAccessFile.open(Native Method)
      at java.io.RandomAccessFile.<init>(Unknown Source)
      at org.apache.lucene.store.FSIndexInput$Descriptor.<init>(FSDirectory.java:425)
      at org.apache.lucene.store.FSIndexInput.<init>(FSDirectory.java:434)
      at org.apache.lucene.store.FSDirectory.openInput(FSDirectory.java:324)
      at org.apache.lucene.index.TermInfosReader.<init>(TermInfosReader.java:52)
      at org.apache.lucene.index.SegmentReader.initialize(SegmentReader.java:147)
      at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:129)
      at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:110)
      at org.apache.lucene.index.IndexReader$1.doBody(IndexReader.java:154)
      at org.apache.lucene.store.Lock$With.run(Lock.java:109)
      at org.apache.lucene.index.IndexReader.open(IndexReader.java:143)
      at org.apache.lucene.index.IndexReader.open(IndexReader.java:127)
      at org.apache.solr.search.SolrIndexSearcher.<init>(Unknown Source)
      at org.apache.solr.core.SolrCore.newSearcher(Unknown Source)
      at org.apache.solr.update.DirectUpdateHandler2.openSearcher(Unknown Source)
      at org.apache.solr.update.DirectUpdateHandler2.doDeletions(Unknown Source)
      at org.apache.solr.update.DirectUpdateHandler2.deleteByQuery(Unknown Source)
      at org.apache.solr.core.SolrCore.update(Unknown Source)
      at org.apache.solr.servlet.SolrServlet.doPost(Unknown Source)
      at javax.servlet.http.HttpServlet.service(HttpServlet.java:767)
      at javax.servlet.http.HttpServlet.service(HttpServlet.java:860)
      at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:408)
      at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:350)
      at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:195)
      at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:164)
      at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:536)
      at org.mortbay.jetty.Server.handle(Server.java:309)
      at org.mortbay.jetty.Server.handle(Server.java:285)
      at org.mortbay.jetty.HttpConnection.doHandler(HttpConnection.java:363)
      at org.mortbay.jetty.HttpConnection.access$1600(HttpConnection.java:45)
      at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:625)
      at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:613)
      at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:195)
      at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:297)
      at org.mortbay.jetty.nio.SelectChannelConnector$HttpEndPoint.run(SelectChannelConnector.java:680)
      at org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:412)

I have rebuilt everything from scratch several times, but I always get the error.

Activity

Yonik Seeley added a comment -

Closing this, since it really isn't a bug.
Yonik Seeley added a comment -

> That should fix it. Is there a way to prevent Solr from having to open so many files at once? I think that would be a better solution.

Yes, there are many ways to control this.
Each index contains multiple segments, and each segment contains multiple parts, each stored as a separate file.
Details are here: http://lucene.apache.org/java/docs/fileformats.html
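
For illustration, the _5o segment from the trace above is stored as several such files; the exact set varies by Lucene version, so take this as a sketch rather than an exhaustive list:

    _5o.fnm              field names
    _5o.fdx, _5o.fdt     stored field index and data
    _5o.tii, _5o.tis     term dictionary index and data
    _5o.frq              term frequencies
    _5o.prx              term positions

A searcher holds all of these open at once, so a few dozen segments can exhaust a low per-process file descriptor limit.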

You can set useCompoundFile to true in solrconfig.xml to use Lucene's compound file format
(fragment from the example solrconfig.xml below):

    <config>
      <mainIndex>
        <!-- options specific to the main on-disk lucene index -->
        <useCompoundFile>false</useCompoundFile>
        <mergeFactor>10</mergeFactor>
        ...

The compound file format stuffs all of the segment parts into a single file. It's not as efficient as the non-compound format, though.

Another setting you might want to look at is mergeFactor (also directly from Lucene):
http://www.onjava.com/pub/a/onjava/2003/03/05/lucene.html?page=1

A lower mergeFactor (such as 2) will slow down indexing, but speed up searching. It will also result in fewer segments, and hence fewer open files.
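
Putting the two together, a minimal sketch of the adjusted fragment (the values here are illustrative, not recommendations):

    <config>
      <mainIndex>
        <!-- pack each segment's component files into a single .cfs file -->
        <useCompoundFile>true</useCompoundFile>
        <!-- merge sooner, keeping fewer segments (and files) on disk -->
        <mergeFactor>4</mergeFactor>
        ...

With both changes, a searcher needs roughly one open file per segment instead of one per segment part.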

Ben Reeves added a comment -

OK, I found the solution; it is not really a Solr bug.

If you are using Linux, you need to run:

    su root
    ulimit -n 1000000

That should fix it. Is there a way to prevent Solr from having to open so many files at once? I think that would be a better solution.
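
A note on the workaround above: ulimit -n only raises the limit for the current shell and its children, so the setting is lost when that shell exits, and the servlet container has to be started from the same shell to inherit it. One common way to make the limit persistent on Linux (assuming a PAM-based setup; the "solr" account name below is a placeholder for whatever user runs Jetty) is an entry in /etc/security/limits.conf:

    # /etc/security/limits.conf
    # Raise the open-file limit for the account running Jetty/Solr.
    # "solr" is a placeholder user name.
    solr  soft  nofile  8192
    solr  hard  nofile  16384

After logging in again as that user, ulimit -n should report the new soft limit.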


People

• Assignee: Unassigned
• Reporter: Ben Reeves
• Votes: 0
• Watchers: 0

Dates

• Created:
• Updated:
• Resolved:
