This tokenizer segments sentences into words, but it doesn't have a test for multiple sentences.
Since it's not yet released, it would be good to fix this for 4.8 so no user sees the bug.
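The missing test would feed the tokenizer input containing more than one sentence and check that words from every sentence come out, not just the first. As a minimal sketch, the JDK's `java.text.BreakIterator` stands in for the actual tokenizer here (which isn't named in this issue), so only the shape of the multi-sentence check is illustrated:

```java
import java.text.BreakIterator;
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

public class MultiSentenceTokenizeTest {
    // Stand-in word segmenter using the JDK's BreakIterator; the real test
    // would call the tokenizer under discussion instead.
    static List<String> words(String text) {
        List<String> out = new ArrayList<>();
        BreakIterator it = BreakIterator.getWordInstance(Locale.ROOT);
        it.setText(text);
        int start = it.first();
        for (int end = it.next(); end != BreakIterator.DONE; start = end, end = it.next()) {
            String w = text.substring(start, end);
            // Keep only segments that contain a letter or digit (drop spaces/punctuation).
            if (w.codePoints().anyMatch(Character::isLetterOrDigit)) {
                out.add(w);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // The missing case: input with more than one sentence. A buggy
        // tokenizer might stop after the first sentence boundary.
        List<String> got = words("This is one. This is two.");
        List<String> want = List.of("This", "is", "one", "This", "is", "two");
        if (!got.equals(want)) {
            throw new AssertionError("expected " + want + " but got " + got);
        }
        System.out.println("multi-sentence segmentation ok");
    }
}
```

In a Lucene test this assertion would typically go through the analysis test harness rather than a hand-rolled comparison, but the key point is the same: the expected token list must span both sentences.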
Robert Muir made changes -
|Field|Original Value|New Value|
|Status|Open [ 1 ]|Resolved [ 5 ]|
|Resolution| |Fixed [ 1 ]|
Uwe Schindler made changes -
|Field|Original Value|New Value|
|Status|Resolved [ 5 ]|Closed [ 6 ]|
|Transition|Time In Source Status|Execution Times|Last Executer|Last Execution Date|
|Open → Resolved|51m 25s|1|Robert Muir|11/Apr/14 11:20|
|Resolved → Closed|16d 12h 4m|1|Uwe Schindler|27/Apr/14 23:25|