Details
- Type: Wish
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Lucene Fields: New
Description
This came up while I was thinking about how to improve the experience for new contributors.
Today, new contributors are usually unaware of where the luceneutil benchmarks live and when or how to run them. Committers typically end up pointing contributors to the benchmarks package when they make performance-impacting changes, and the contributors then run the benchmarks.
Adding benchmark details to the Lucene repo would also make them more accessible to researchers who want to experiment with, or benchmark, their own custom task implementations against Java Lucene.
What does the community think?