OpenNLP / OPENNLP-1375

Enable optional GPU inference in ONNX Runtime config


Details

    Type: Task
    Status: Closed
    Priority: Major
    Resolution: Fixed
    Affects Version/s: 2.0.0
    Fix Version/s: 2.1.0
    Component/s: None
    Labels: None

Description

    Enable optional GPU inference in the ONNX Runtime config. Expose a property (probably through a constructor) that enables GPU inference when running models through ONNX Runtime.
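    A minimal sketch of what such a constructor property could look like, using the ONNX Runtime Java API; the class name and gpuDeviceId parameter below are illustrative, not the actual OpenNLP API:

        import ai.onnxruntime.OrtEnvironment;
        import ai.onnxruntime.OrtException;
        import ai.onnxruntime.OrtSession;

        // Illustrative sketch only, not the OpenNLP implementation.
        public class GpuConfigurableInference {

            private final OrtSession session;

            // gpuDeviceId: CUDA device to run on, or a negative value for CPU-only inference.
            public GpuConfigurableInference(String modelPath, int gpuDeviceId) throws OrtException {
                OrtEnvironment env = OrtEnvironment.getEnvironment();
                OrtSession.SessionOptions options = new OrtSession.SessionOptions();
                if (gpuDeviceId >= 0) {
                    // Registers the CUDA execution provider on the given device;
                    // throws OrtException if the GPU runtime is not available.
                    options.addCUDA(gpuDeviceId);
                }
                this.session = env.createSession(modelPath, options);
            }
        }

    With this shape, callers opt in to GPU inference by passing a device id, while the default CPU execution path stays unchanged.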


People

    Assignee: Jeff Zemerick
    Reporter: Jeff Zemerick
    Votes: 1
    Watchers: 2

Dates

    Created:
    Updated:
    Resolved: