SOLR-1000

DIH FileListEntityProcessor fileName filters directory names and stops recursion

    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.3
    • Fix Version/s: 1.4
    • Labels: None

      Description

      I have been trying to find out why DIH in FileListEntityProcessor mode did not appear to be recursing into subdirectories. Going through FileListEntityProcessor.java I eventually tumbled to the fact that my filename filter setting from data-config.xml also applied to directory names.

      Now, I feel that the fileName filter should be applied to the files fed into the parser; it should not be applied to the directory names we are recursing through. I bodged the code to adjust the behavior so that the "fileName" and "excludes" attributes of "entity" only apply to file names and not directory names. It now recurses through my directory tree, indexing only the appropriate files! I think the new behavior is more standard.

      I will submit a patch once I have constructed one!
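The intended behavior can be sketched roughly as follows. This is a hypothetical illustration, not the actual FileListEntityProcessor code; the class and method names are made up for the example:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical sketch, not the real FileListEntityProcessor: the
// fileName/excludes patterns are tested against plain files only,
// while directories are always recursed into regardless of the filters.
public class RecursiveFileList {
    public static List<String> list(File dir, Pattern fileName, Pattern excludes) {
        List<String> out = new ArrayList<>();
        File[] entries = dir.listFiles();
        if (entries == null) return out;
        for (File f : entries) {
            if (f.isDirectory()) {
                // Recurse unconditionally: the filters never block directories.
                out.addAll(list(f, fileName, excludes));
            } else if (fileName.matcher(f.getName()).find()
                    && (excludes == null || !excludes.matcher(f.getName()).find())) {
                // Filters apply to file names only.
                out.add(f.getAbsolutePath());
            }
        }
        return out;
    }
}
```

With this shape, a subdirectory whose name does not match the fileName regex is still descended into, and only the files inside it are filtered.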

      1. SOLR-1000.patch
        6 kB
        Fergus McMenemie
      2. SOLR-1000.patch
        5 kB
        Fergus McMenemie

        Activity

        Fergus McMenemie added a comment -

        Here is my first attempt at a patch. It seems to work OK; however, the testcase I added, TestFileListEntityProcessor.java, fails. I need somebody who knows what they are doing to point out what I am doing wrong!

        Shalin Shekhar Mangar added a comment -

        First, the ClassCastException occurred because AbstractDataImportHandlerTest tries to read a String from the attributes map, but testRecursive put a boolean true rather than a String into the 'recursive' attribute. That was fixed by using the string "true" instead of a boolean. I'll fix AbstractDataImportHandlerTest to use String.valueOf so it handles these cases in the future.

        After this fix, the assert at the end of testRecursive failed. It expects to find 3 files, but "a.xml", "b.xml" and "c.props" are in the same directory, and due to the 'fileName' regex, c.props won't be picked up. I guess you meant to add c.props to another child directory inside the one you are creating?
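The ClassCastException scenario can be reproduced in miniature. This is a hypothetical snippet, not the actual test code; the class name is made up for the example:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration of the ClassCastException described above.
public class AttrDemo {
    public static String readRecursive(Map<String, Object> attrs) {
        // (String) attrs.get("recursive") would throw ClassCastException
        // when the stored value is a Boolean; String.valueOf works for both.
        return String.valueOf(attrs.get("recursive"));
    }
}
```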

        Fergus McMenemie added a comment -

        Sorted the bugs in the JUnit test and added a few other improvements to the test.

        Shalin Shekhar Mangar added a comment -

        Thanks Fergus.

        One minor thing:

        while (true) {
          Map<String, Object> r = getNext();
          if (r != null) r = applyTransformer(r);
          return r;
        }

        In the new code the loop body always returns, so the loop is never actually used. The difference is important because Transformers have the ability to skip documents by calling map.put("$skipDoc", true) on this map. If a document is skipped, applyTransformer will return null, and we'd like to request a new row from the data source (the entity processor in this case). With this change, null will be returned instead, which signals that the DataSource/EntityProcessor has run out of data even though it has not.
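The looping form being described can be sketched like this. It is a simplified stand-in, not the actual Solr source; the RowPump class and its stubbed getNext/applyTransformer are invented for the illustration:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;
import java.util.Map;

// Simplified stand-in for the entity-processor loop described above:
// a null from applyTransformer means the row was skipped via $skipDoc,
// so the loop fetches the next row instead of returning null, which
// would wrongly signal that the data source has run out.
public class RowPump {
    private final Deque<Map<String, Object>> rows = new ArrayDeque<>();

    public RowPump(List<Map<String, Object>> input) {
        rows.addAll(input);
    }

    private Map<String, Object> getNext() {
        return rows.poll(); // null once the source is exhausted
    }

    private Map<String, Object> applyTransformer(Map<String, Object> r) {
        // A transformer can mark a row as skipped with $skipDoc.
        return Boolean.TRUE.equals(r.get("$skipDoc")) ? null : r;
    }

    public Map<String, Object> nextRow() {
        while (true) {
            Map<String, Object> r = getNext();
            if (r == null) return null;   // truly out of data
            r = applyTransformer(r);
            if (r != null) return r;      // a usable row
            // otherwise the row was skipped; loop and fetch the next one
        }
    }
}
```

Here the unconditional return is reached only for a non-null transformed row; a skipped row sends control back around the loop.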

        Except for this, the patch looks great! I'll commit this shortly.

        Shalin Shekhar Mangar added a comment -

        Committed revision 740423.

        I reverted the change I mentioned above, and I moved the issue to the DIH CHANGES.txt.

        Thanks Fergus!

        Grant Ingersoll added a comment -

        Bulk close Solr 1.4 issues


          People

          • Assignee: Shalin Shekhar Mangar
          • Reporter: Fergus McMenemie
          • Votes: 0
          • Watchers: 0


          Time Tracking

          • Original Estimate: 24h
          • Remaining Estimate: 24h
          • Time Spent: Not Specified
