Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Invalid
- Affects Version/s: None
- Fix Version/s: None
- Component/s: None
- Labels: None
- Environment: Windows Server 2003
Description
We are using Lucene.NET 1.9 Beta 1.
We had been indexing about 300,000 documents into a compound index of about 630 MB, frequently calling the optimize function to compact it, without any problems.
As we kept indexing more documents (300,000+) into this index, it became corrupted: optimization stalls (the file stops growing at about 480 MB) and never completes correctly.
Has anybody seen this problem?
Is there a way to avoid index corruption?
What could be the source of the corruption?
Can a corrupted index be repaired?
Should we create multiple indexes?