Details

    • Type: New Feature
    • Status: Closed
    • Priority: Minor
    • Resolution: Won't Fix
    • Affects Version/s: None
    • Fix Version/s: None
    • Labels: None

      Description

      Q: Is there an EnsembleRecommender or CompoundRecommender that takes the
      outputs of other recommender algorithms and combines them to generate better
      results?

      Ted Dunning:
      There isn't really any such thing although the SGD models are easy to glue
      together in this way.
      There is a guy named Praneet at UCI who is doing some feature sharding work
      that might relate to what you are doing. His email is
      praneetmhatre@gmail.com

      Sean Owen:
      There isn't. For the recommenders that work by computing an estimated
      preference value for items, I suppose you could average their
      estimates and rank by that.
      More crudely, you could stitch together the recommendations of
      recommenders 1 and 2 by taking the overall top 10 from among each of their
      top recommendations, averaging estimates where an item appears in both
      lists. That's much less work for you; it's not quite as "accurate".
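
      A minimal sketch of the averaging approach described above, assuming the
      standard Taste Recommender interface (recommend and estimatePreference)
      and GenericRecommendedItem; the AveragingEnsemble class itself is
      hypothetical, not an existing Mahout component:

      import java.util.ArrayList;
      import java.util.Collections;
      import java.util.Comparator;
      import java.util.HashSet;
      import java.util.List;
      import java.util.Set;

      import org.apache.mahout.cf.taste.common.TasteException;
      import org.apache.mahout.cf.taste.impl.recommender.GenericRecommendedItem;
      import org.apache.mahout.cf.taste.recommender.RecommendedItem;
      import org.apache.mahout.cf.taste.recommender.Recommender;

      // Hypothetical sketch: averages estimated preferences across several
      // delegate recommenders and re-ranks the combined candidates.
      public final class AveragingEnsemble {

        private final List<Recommender> delegates;

        public AveragingEnsemble(List<Recommender> delegates) {
          this.delegates = delegates;
        }

        // Union of each delegate's top-N candidates, re-ranked by average estimate.
        public List<RecommendedItem> recommend(long userID, int howMany) throws TasteException {
          Set<Long> candidates = new HashSet<Long>();
          for (Recommender r : delegates) {
            for (RecommendedItem item : r.recommend(userID, howMany)) {
              candidates.add(item.getItemID());
            }
          }
          List<RecommendedItem> rescored = new ArrayList<RecommendedItem>();
          for (long itemID : candidates) {
            float sum = 0.0f;
            int count = 0;
            for (Recommender r : delegates) {
              float estimate = r.estimatePreference(userID, itemID);
              if (!Float.isNaN(estimate)) { // a delegate may be unable to estimate this item
                sum += estimate;
                count++;
              }
            }
            if (count > 0) {
              rescored.add(new GenericRecommendedItem(itemID, sum / count));
            }
          }
          // Highest average estimate first, then cut to the requested list size.
          Collections.sort(rescored, new Comparator<RecommendedItem>() {
            public int compare(RecommendedItem a, RecommendedItem b) {
              return Float.compare(b.getValue(), a.getValue());
            }
          });
          return rescored.size() > howMany ? rescored.subList(0, howMany) : rescored;
        }
      }

      The cruder "stitch the top-10 lists" variant simply skips the
      re-estimation loop and merges the delegates' lists directly.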

      Danny Bickson:
      In terms of papers about ensemble methods/blending I suggest looking at the
      BigChaos Netflix paper:
      http://www.netflixprize.com/assets/GrandPrize2009_BPC_BigChaos.pdf
      See section 7.

          People

          • Assignee: Unassigned
          • Reporter: Daniel Xiaodan Zhou
          • Votes: 0
          • Watchers: 2
