  SOLR-8208

DocTransformer executes sub-queries

    Details

      Description

      The initial idea was to return the "from" side of a query-time join via a DocTransformer. It isn't query-time-join specific, though, so let's allow specifying any query and its parameters; call it a sub-query. But it might be problematic to escape subquery parameters, including local ones, e.g. what if the subquery needs to specify its own doc transformer in &fl=[..]?
      I suppose we can specify a subquery parameter prefix:

      ..&q=name_s:john&fl=*,depts:[subquery fromIndex=departments]&
      depts.q={!term f=dept_id_s v=$row.dept_ss_dv}&depts.fl=text_t,dept_id_s_dv&depts.rows=12&depts.sort=id desc
      

      The response looks like:

             
      <response>
      ...
          <result name="response" numFound="1" start="0">
              <doc>
                  <str name="id">1</str>
                  <str name="name_s_dv">john</str>
      ..
                  <result name="depts" numFound="2" start="0">
                      <doc>
                          <str name="dept_id_s_dv">Engineering</str>
                          <str name="text_t">These guys develop stuff</str>
                      </doc>
                      <doc>
                          <str name="dept_id_s_dv">Support</str>
                          <str name="text_t">These guys help users</str>
                      </doc>
                  </result>
              </doc>
          </result>
      </response>
      
      • fl=depts:[subquery] executes a separate request for every result row of the main query and adds its results to the document as a separate result list. The given field name (here 'depts') is used as a prefix that maps subquery parameters onto main-query parameters, e.g. depts.q becomes q for the subquery, depts.rows becomes rows.
      • Document fields are available as implicit parameters with the prefix row., e.g. if a result document has a field dept_id, it can be referenced as v=$row.dept_id. This combines well with the {!terms} query parser.
      • separator=',' defines the delimiter used when multiple field values are combined into a single parameter. E.g. if a document has a multivalued field dept_ids={2,3}, then referring to it via ..&dept.q={!terms f=id v=$row.dept_ids}&.. executes the subquery {!terms f=id}2,3. When omitted, the separator defaults to a comma.

      • The optional fromIndex=othercore parameter allows running the subquery on another core, as with query-time join. However, it doesn't work in a cloud setup (and will tell you so); instead, it's proposed to use the regular params (collection, shards, whatever), with the subquery prefix as below, to issue the subquery to a collection:
        q=name_s:dave&indent=true&fl=*,depts:[subquery]&rows=20&
        depts.q={!terms f=dept_id_s v=$row.dept_ss_dv}&depts.fl=text_t&
        depts.indent=true&
        depts.collection=departments&
        depts.rows=10&depts.logParamsList=q,fl,rows,row.dept_ss_dv
        

      Caveat: it is expected to be rather slow; note that it only processes the current search result page, not the entire result set.
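
      For reference, here is a minimal SolrJ sketch of the request above. The base URL, the core name, the client construction, and the cast of the nested result to SolrDocumentList are assumptions for illustration; they are not dictated by the transformer itself.

      import org.apache.solr.client.solrj.SolrClient;
      import org.apache.solr.client.solrj.SolrQuery;
      import org.apache.solr.client.solrj.impl.HttpSolrClient;
      import org.apache.solr.client.solrj.response.QueryResponse;
      import org.apache.solr.common.SolrDocument;
      import org.apache.solr.common.SolrDocumentList;

      public class SubqueryTransformerExample {
        public static void main(String[] args) throws Exception {
          // Assumed URL pointing at the "people" core used in the example above.
          try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/people").build()) {
            SolrQuery query = new SolrQuery("name_s:john");
            // The pseudo-field name 'depts' doubles as the prefix for the subquery params.
            query.setFields("*", "depts:[subquery fromIndex=departments]");
            query.set("depts.q", "{!term f=dept_id_s v=$row.dept_ss_dv}");
            query.set("depts.fl", "text_t,dept_id_s_dv");
            query.set("depts.rows", 12);
            query.set("depts.sort", "id desc");

            QueryResponse rsp = client.query(query);
            for (SolrDocument doc : rsp.getResults()) {
              // Each parent document carries its subquery hits as a nested document list.
              SolrDocumentList depts = (SolrDocumentList) doc.getFieldValue("depts");
              System.out.println(doc.getFieldValue("name_s_dv") + " -> " + depts.getNumFound() + " depts");
            }
          }
        }
      }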

      1. SOLR-8208.diff
        24 kB
        Cao Manh Dat
      2. SOLR-8208.patch
        84 kB
        Mikhail Khludnev
      3. SOLR-8208.patch
        79 kB
        Mikhail Khludnev
      4. SOLR-8208.patch
        79 kB
        Mikhail Khludnev
      5. SOLR-8208.patch
        56 kB
        Mikhail Khludnev
      6. SOLR-8208.patch
        50 kB
        Mikhail Khludnev
      7. SOLR-8208.patch
        48 kB
        Mikhail Khludnev
      8. SOLR-8208.patch
        47 kB
        Cao Manh Dat
      9. SOLR-8208.patch
        45 kB
        Mikhail Khludnev
      10. SOLR-8208.patch
        43 kB
        Mikhail Khludnev
      11. SOLR-8208.patch
        31 kB
        Mikhail Khludnev
      12. SOLR-8208.patch
        28 kB
        Mikhail Khludnev
      13. SOLR-8208.patch
        25 kB
        Mikhail Khludnev
      14. SOLR-8208.patch
        18 kB
        Mikhail Khludnev
      15. SOLR-8208.patch
        17 kB
        Mikhail Khludnev
      16. SOLR-8208.patch
        24 kB
        Cao Manh Dat
      17. SOLR-8208.patch
        11 kB
        Mikhail Khludnev
      18. SOLR-8208.patch
        11 kB
        Cao Manh Dat
      19. SOLR-8208.patch
        10 kB
        Cao Manh Dat
      20. SOLR-8208-distrib-test-fix.patch
        1 kB
        Mikhail Khludnev

        Issue Links

          Activity

          upayavira Upayavira added a comment -

          Something I've wanted for a long time - you've got my vote.

          Why should it be way slow? Obviously for a very large number of docs in the result set it would get resource intensive, but for medium numbers it should be quite acceptable, no?

          Another feature that would make this really neat would be to specify the tag of a filter query that you want to use as your sub-query - assuming it includes a join query. Thus, you would only need to specify your join query once - including fromIndex, join fields, etc - all you would need to specify would be the fields you want to include:

          fq={!tag=join}{!join fromIndex=other from=id to=id}some_query&
          fl=*,[subquery fq.tag=join fl=field1,field2]
          
          mkhludnev Mikhail Khludnev added a comment -

          Something I've wanted for a long time - you've got my vote.

          Thanks, appreciated! But it's only a "watch", not a vote!

          but for medium numbers it should be quite acceptable,

          that's what I meant.

          fq={!tag=join}

          • this approach is limited to a join subquery, which might not be bad, actually, but a join query might be really expensive, e.g. it doesn't leapfrog well;
          • also, in this case the join subquery has to be inverted ("from"<->"to"), which might be tricky;
          • it seems limited to the scoreless join, though now there is a score-emitting twin!
          • I'm really worried about the parsing/splitting of nested transformers:
            fl=*,[subquery fq.tag=join fl=field1,field2,[transf param=etc]]
            
          caomanhdat Cao Manh Dat added a comment -

          Mikhail Khludnev I really want to work on this issue, can you assign this issue to me?

          erickerickson Erick Erickson added a comment -

          Hmm, for some reason I can't seem to assign it to you. While we figure that out, you could simply start working on it anyway without it being assigned to you...

          If you click on the "more" button at the top, do you have an option to "upload files"? If so, you can freely upload patches.

          BTW, I don't know if you know this already, but usually we work against trunk. Then whoever commits the patch will merge it back to 5x.

          caomanhdat Cao Manh Dat added a comment -

          Erick Erickson Thanks Erick, I can upload files, so I will share work in progress through patch files.

          caomanhdat Cao Manh Dat added a comment - - edited

          Initial patch.
          I changed the API a little bit to make it easier to parse.

          [subquery f=fromField t=toField v=value start=0 rows=10]
          

          The result so far.
          Input

          doc("id", "4","name_s", "dave", "title_s", "MTS", "dept_ss_dv","Support", "dept_ss_dv","Engineering"))
          
          doc("id","10", "dept_id_s", "Engineering", "text_t","These guys develop stuff", "salary_i_dv", "1000")
          doc("id","13", "dept_id_s", "Support", "text_t","These guys help customers","salary_i_dv", "800")
          

          Query

          q=name_s:dave&fl=*,[subquery f=dept_ss_dv t=dept_id_s v=depts]
          

          Output

          {
            "id": "4",
            "name_s_dv": "dave",
            "title_s_dv": "MTS",
            "dept_ss_dv": [
              "Support",
              "Engineering"
            ],
            "depts": [
              {
                "id": "10",
                "dept_id_s_dv": "Engineering",
                "text_t": "These guys develop stuff",
                "salary_i_dv": 1000
              },
              {
                "id": "13",
                "dept_id_s_dv": "Support",
                "text_t": "These guys help customers",
                "salary_i_dv": 800
              }
            ]
          }
          

          Now working on the sort and fl params. Am I on the right track?

          upayavira Upayavira added a comment -

          This is useful stuff. Much needed. I assume this would work as well on block joins and alongside pseudo joins, which I think I'm seeing above?

          Traditionally, in a local params query parser, the parameter v refers to the actual query string, so:

          q={!lucene v=$qq}&qq=field:(my search) 

          would be a valid syntax. I would suggest using n= (for name) or tag= for the field name of the newly created field to avoid association with this v= syntax.

          Is a lookup based upon the ID of a field in the current document sufficient? I suspect it is.

          Do you also support fromIndex - that is, executing the query against another core or collection? That would be the killer feature.

          As to the fq=

          {!tag=join} {!join blah....}

          syntax, if you had [subquery fq=join], you wouldn't actually execute the join query; you would just locate the query object and extract its key parameters, to save the user from having to enter them multiple times. Having both options would be super cool.

          caomanhdat Cao Manh Dat added a comment - - edited

          I changed the API back to Mikhail's suggestion.

          Result so far.
          Query

          q=name_s:dave
          &fl=*,[subquery prefix=subq1 name=depts]
          &subq1.q={!term f=dept_id_s v=$subq1.row.dept_ss_dv}
          

          Input

          doc("id", "4","name_s", "dave", "title_s", "MTS", "dept_ss_dv","Support", "dept_ss_dv","Engineering"))
          
          doc("id","10", "dept_id_s", "Engineering", "text_t","These guys develop stuff", "salary_i_dv", "1000")
          doc("id","13", "dept_id_s", "Support", "text_t","These guys help customers","salary_i_dv", "800")
          

          Result

          {
            "id": "4",
            "name_s_dv": "dave",
            "title_s_dv": "MTS",
            "dept_ss_dv": [
              "Support",
              "Engineering"
            ],
            "depts": [
              {
                "id": "13",
                "dept_id_s_dv": "Support",
                "text_t": "These guys help customers",
                "salary_i_dv": 800
              }
            ]
          }
          

          I just have one question: what should we do when the "from" field has multiple values? Should we change the

          subq1.q

          to

          {!term f=dept_id_s v="Support Engineer"}
          

          I will submit patches frequently to keep myself on track. I hope that doesn't bother people.

          mkhludnev Mikhail Khludnev added a comment -

          Do you also support fromIndex - that is, executing the query against another core or collection? That would be the killer feature.

          great idea. Let me spawn a sub-task.

          mkhludnev Mikhail Khludnev added a comment -

          What should we do when from field have multiple values?

          I prefer to set this aside until someone brings a real-life need. For now, let's support single-value fields.

          caomanhdat Cao Manh Dat added a comment - - edited

          Thanks Mikhail, that will make things easier. I'm also thinking about distributing the sub-queries, so I'm trying this (executing the subquery through SolrCore):

          SolrCore solrCore = subQueryRequest.getCore();
          SolrQueryResponse response = new SolrQueryResponse();
          solrCore.execute(solrCore.getRequestHandler(null), subQueryRequest, response);
          DocsStreamer docsStreamer = new DocsStreamer((ResultContext) response.getValues().get("response"));
          

          But I'm afraid it will mess up the logic inside SolrCore.execute.

          mkhludnev Mikhail Khludnev added a comment -

          I added a couple of assertions.

          I suppose the last snippet makes a lot of sense, since the scoring join query parser does something like this: https://github.com/apache/lucene-solr/blob/trunk/solr/core/src/java/org/apache/solr/search/join/ScoreJoinQParserPlugin.java#L95

          caomanhdat Cao Manh Dat added a comment -

          Mikhail Khludnev Please review my patch. I think it's quite OK now.

          mkhludnev Mikhail Khludnev added a comment -

          Don't worry, both of your patches are in my commit queue. It just takes some time.

          mkhludnev Mikhail Khludnev added a comment -

          Hello Cao Manh Dat,
          I'm terribly sorry for such a late review.

          I applied the recent patch to master. There were some compile problems with StoredField or so, and I can't get the unit tests to pass:
          TestSubQueryTransformer.testSubQuerytransformer()

           assertQ("subq1.fl is limited to single field",
                req("q","name_s:john",
                  "fl","*,depts:[subquery prefix=subq1]", "subq1.q","{!term f=dept_id_s v=$subq1.row.dept_ss_dv}", "subq1.fl","text_t"),
                "//result/doc/str[@name='name_s_dv'][.='john']/../arr[@name='depts']/doc/str[@name='text_t'][.='These guys develop stuff']",
                "count(//result/doc/str[@name='name_s_dv'][.='john']/../arr[@name='depts']/doc/*)=1");// only text_t
          
           REQUEST FAILED: xpath=count(//result/doc/str[@name='name_s_dv'][.='john']/../arr[@name='depts']/doc/*)=1
          <response>
          <lst name="responseHeader"><int name="status">0</int><int name="QTime">15474</int></lst><result name="response" numFound="1" start="0">
          
          <doc><str name="id">1</str>
                  <str name="name_s_dv">john</str>
                  <str name="title_s_dv">Director</str>
                  <arr name="dept_ss_dv"><str>Engineering</str></arr>
                  <arr name="depts">
                           <doc><str name="text_t">These guys develop stuff</str></doc>
                           <doc><str name="text_t">These guys develop other stuff</str></doc>
                           <doc><str name="text_t">These guys develop manage other engineers</str></doc></arr>
                 </doc>
          </result>
          </response>
          

          This xpath matches three times, not once as asserted:

          <str name="text_t">These guys develop stuff</str>
          <str name="text_t">These guys develop other stuff</str>
          <str name="text_t">These guys develop manage other engineers</str>
          

          Could it be that there should be subq1.rows=1? Can you make sure the tests pass?

          Also, I haven't dug in deep yet, but I noticed some request/closeable infrastructure. It's wise to avoid obtaining the core on every transformation, but couldn't that be achieved via SolrRequestInfo.addCloseHook(Closeable)?

          The other thing I want to think about is avoiding the code with regexp replace/field types and toExternal. I have an idea to expose the already resolved document fields as a parameters view, with keys prefixed with "subq1.row."; this can be chained like DefaultSolrParams.

          caomanhdat Cao Manh Dat added a comment -

          Hi Mikhail,

          I updated the code against the latest trunk. All tests pass now.

          mkhludnev Mikhail Khludnev added a comment -

          I moved the patch to the existing closeables. Now I'm looking into the change in DocsStreamer and trying to avoid it.

          mkhludnev Mikhail Khludnev added a comment -

          Attaching a patch which passes the existing tests. It now avoids changes in DocsStreamer and SolrQueryRequest. As a detail, subquery results are represented with <result>, not <arr>:

          <response>
          
          <lst name="responseHeader">
            <int name="status">0</int>
            <int name="QTime">650</int>
          </lst>
          <result name="response" numFound="1" start="0">
            <doc>
              <str name="id">1</str>
              <str name="name_s_dv">john</str>
              <str name="title_s_dv">Director</str>
              <arr name="dept_ss_dv">
                <str>Engineering</str>
              </arr>
              <result name="depts" numFound="1" start="0">
                <doc>
                  <str name="text_t">These guys develop stuff</str></doc>
              </result></doc>
          </result>
          </response>
          

          How do you feel about it?

          mkhludnev Mikhail Khludnev added a comment -

          SOLR-8208.patch shows how I'd like to handle parameter substitution.
          Note, the patch is at a really early stage (it might not even work on many docs, etc.).
          The subquery still accepts only single-value doc fields as a parameter; how would you prefer to handle multivalue fields, if at all?

          mkhludnev Mikhail Khludnev added a comment -

          I just got an idea how to handle multivalue fields: there could be [subquery prefix=products mv-delim=,], so that multiple values are concatenated and can be used as input for

          ...&products.q={!terms separator=, v=$products.row.id} 

          WDYT?

          mkhludnev Mikhail Khludnev added a comment -

          Made some significant improvements.

          mkhludnev Mikhail Khludnev added a comment -

          Attaching the shuffled tests.
          They revealed a design gap: the error below occurs even if the doc field isn't referenced in the subquery, because Solr eagerly copies params. That's a pity.

          org.apache.solr.common.SolrException: SubQuery depts cant substitute  multiple values [stored,indexed,toke via parameter "subq1.row.dept_ss_dv" for document SolrDocument{id=stor
          
          SubQueryAugmenter$DocParams.get(String) line: 172	
          SubQueryAugmenter$DocParams.getParams(String) line: 139	
          DefaultSolrParams.getParams(String) line: 44	
          MultiMapSolrParams.asMultiMap(SolrParams, boolean) line: 103	
          RequestUtil.processParams(SolrRequestHandler, SolrQueryRequest, SolrParams, SolrParams, SolrParams) line: 104	
          SolrPluginUtils.setDefaults(SolrRequestHandler, SolrQueryRequest, SolrParams, SolrParams, SolrParams) line: 176	
          SearchHandler(RequestHandlerBase).handleRequest(SolrQueryRequest, SolrQueryResponse) line: 152	
          SolrCore.execute(SolrRequestHandler, SolrQueryRequest, SolrQueryResponse) line: 2033	
          SubQueryAugmenter.transform(SolrDocument, int, float) line: 226	
          DocsStreamer.next() line: 146	
          ...
          

          Multivalue field substitution should be implemented before this moves further.

          mkhludnev Mikhail Khludnev added a comment -

          Attaching an implementation with the multivalue fix and tests.

          It reveals a potential usability issue: implicit subquery params (doc field values) are not logged:

           params={q=name_s:dave&subq1.rows=6&indent=true&fl=*,depts:[subquery+prefix%3Dsubq1+separator%3D"+"]&rows=3&subq1.indent=true&wt=xml&subq1.fl=text_t&subq1.q={!lucene+df%3Ddept_id_s+v%3D$subq1.row.dept_ss_dv}} hits=3 status=0 QTime=2
           params={rows=6&indent=true&fl=text_t&q={!lucene+df%3Ddept_id_s+v%3D$subq1.row.dept_ss_dv}} hits=6 status=0 QTime=4
           params={rows=6&indent=true&fl=text_t&q={!lucene+df%3Ddept_id_s+v%3D$subq1.row.dept_ss_dv}} hits=6 status=0 QTime=7
           params={rows=6&indent=true&fl=text_t&q={!lucene+df%3Ddept_id_s+v%3D$subq1.row.dept_ss_dv}} hits=6 status=0 QTime=10
          

          The first line is the main query, and the rest are subqueries. Here you can see that the actual value of subq1.row.dept_ss_dv is not logged, though it might be badly needed.

          What would you prefer: tweak the params somehow for logging, or allow debugQuery=true for subqueries and inject their output into the main query's debug section?

          mkhludnev Mikhail Khludnev added a comment - - edited

          It seems like param tracing can be done with logParamsList:

          ...Request [collection1]  webapp=null path=null params={q=name_s:dave&subq1.rows=6&indent=true&fl=*,depts:[subquery+prefix%3Dsubq1+]&rows=2&subq1.indent=true&subq1.logParamsList=q,fl,rows,subq1.row.dept_ss_dv&wt=xml&subq1.fl=text_t&subq1.q={!terms+f%3Ddept_id_s+v%3D$subq1.row.dept_ss_dv+separator%3D,}} hits=2 status=0 QTime=9979
          ...Request [collection1]  webapp=null path=null params={q={!terms+f%3Ddept_id_s+v%3D$subq1.row.dept_ss_dv+separator%3D,}&subq1.row.dept_ss_dv=Support,Engineering&fl=text_t&rows=6} hits=6 status=0 QTime=60740
          
          mkhludnev Mikhail Khludnev added a comment -

          Attaching a test with logParamsList and a JSON assert.

          mkhludnev Mikhail Khludnev added a comment -

          Added a cloud test. As expected, it doesn't pass. Keeping at it.

          mkhludnev Mikhail Khludnev added a comment -

          Making the subquery call via EmbeddedSolrServer; I believe it will help handle the SolrCloud case more easily.

          caomanhdat Cao Manh Dat added a comment -

          Made the distrib test pass.

          mkhludnev Mikhail Khludnev added a comment -

          Cao Manh Dat, your last patch is awesome!!!
          For the reference, my attempt to request a cloud collection via EmbeddedSolrServer was too naive:

          ERROR (qtp1310704163-67) [n:127.0.0.1:65356_ c:people s:shard1 r:core_node2 x:people_shard1_replica1] o.a.s.s.HttpSolrCall null:org.apache.solr.common.SolrException: No such core: departments
          	at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:149)
          	at org.apache.solr.response.transform.LocalSubQueryAugmenter.transform(SubQueryAugmenterFactory.java:239)
          	at org.apache.solr.response.DocsStreamer.next(DocsStreamer.java:146)
          	at org.apache.solr.response.DocsStreamer.next(DocsStreamer.java:1)
          
          mkhludnev Mikhail Khludnev added a comment -
          • removed the call via the cloud client; it turns out we can request it via EmbeddedSolrServer by explicitly specifying collection=. I'm not sure it's better, but it's at least neater (which, most of the time, means better)
          • I introduced a thread executor. There are two points behind it: it allows us not to restore the SolrRequestInfo threadlocal on every request (although it still has to be cleared before the request, because it's inherited), and in the future it can be used to enrich docs in parallel, under another ticket;

          opinions?

          • I reworked ResultContext to pass numFound and start for the subquery result; it might be redundant but seems reasonable
          • I added empty test methods for the remaining cases
          • I want to challenge $subq.doc.field: can't the prefix be removed?
          • it also makes sense to show how [subquery] compares with [child], e.g. in result ordering or so.
          mkhludnev Mikhail Khludnev added a comment -

          I think it's almost ready. I'll post the final syntax in the description above. Note for reviewers: it introduces a thread pool executor, but uses it for sequential invocations for now. The change in MLT is just a moved line, no impact at all.
          The last test I would like to add just demonstrates how [subquery] can be used instead of [child].
          My plan is to commit it next week. Concerns?

          dsmiley David Smiley added a comment -

          I looked at it extremely briefly and just want to say you did a nice thorough job of testing. Hopefully someone will have more time to look more carefully, but I'm too busy.

          mkhludnev Mikhail Khludnev added a comment -

          Appreciate. Take care!

          caomanhdat Cao Manh Dat added a comment -

          +1 The last patch sounds great! It turns out that SearchHandler distributes the search for us.

          mkhludnev Mikhail Khludnev added a comment -

          Hmm... you might be laughing, but the game is over.
          The problem is calling EmbeddedSolrServer from a DocTransformer. This call isn't possible from a thread where SolrRequestInfo is present, and so far it's not possible to suspend and then resume SolrRequestInfo. Juggling with threads complicates things a lot; the threads can be managed after all, but not SolrRequestInfo.

          Yonik Seeley can you suggest an approach?

          ryantxu Ryan McKinley added a comment -

          Did you try something like:

              SolrRequestInfo orig = SolrRequestInfo.getRequestInfo();
              try {
                SolrRequestInfo.clearRequestInfo();
                
                // TODO, make whatever call you need
              }
              finally {
                SolrRequestInfo.setRequestInfo(orig);
              }
          
          caomanhdat Cao Manh Dat added a comment - - edited

          Ryan McKinley I think the above code is quite dangerous, because the original SolrRequestInfo can have close hooks, and the call to

           SolrRequestInfo.clearRequestInfo()

          will close all the hooks.

          Mikhail Khludnev I'm not laughing at all; I think it's a clever idea to swap out/swap in SolrRequestInfo (without modifying the SolrRequestInfo class). I propose another approach, which changes SolrRequestInfo a little bit:

          public static void doActionInEmptyRequestInfo(Action action) throws IOException {
              SolrRequestInfo prev = threadLocal.get();
              threadLocal.remove();
              try {
                action.doAction();
                SolrRequestInfo current = threadLocal.get();
                if (current != null) {
                  log.error("New SolrRequestInfo was not closed!  req=" + current.req.getOriginalParams().toString());
                }
                assert current == null;
              } finally {
                threadLocal.set(prev);
              }
            }
          
            public interface Action {
              void doAction() throws IOException;
            }
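
          For illustration only, here is a hypothetical call site sketching how a transformer could run the subquery inside such a helper. The method, its parameters (server, subParams, doc, fieldName) and the helper itself are assumptions for the sake of the example, not code from any attached patch.

          // Illustrative sketch: run the subquery outside the caller's SolrRequestInfo.
          // Relies on SolrClient#query(SolrParams) and SolrDocument#setField; the rest is assumed.
          void runSubQuery(EmbeddedSolrServer server, SolrParams subParams,
                           SolrDocument doc, String fieldName) throws IOException {
            SolrRequestInfo.doActionInEmptyRequestInfo(() -> {
              try {
                QueryResponse rsp = server.query(subParams);   // subquery via the embedded server
                doc.setField(fieldName, rsp.getResults());     // attach the hits to the parent doc
              } catch (SolrServerException e) {
                throw new IOException(e);
              }
            });
          }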
          
          mkhludnev Mikhail Khludnev added a comment - - edited
          • Good news! Cao Manh Dat, your approach fit in quite well! Just so you know, we now have a backdoor to suspend SolrRequestInfo. I wonder if it's legal enough?
          • threads were removed (I'll comment separately on the pain that those who want them can get with them).
          • added a few tests proving that [subquery] is on par with [child]
          • moved tests in a subpackage
          • one question about code style: the core class (300 LOC) compiles into more than five classes; doesn't it deserve a separate o.a.s.response.transform.subquery package?
          caomanhdat Cao Manh Dat added a comment - - edited

          Great patch! I think doInSuspension is good (better than a swap-in/swap-out try/catch), and we should add doInSuspension to SolrRequestInfo (to discourage anyone from doing the swap in/out themselves in the future).

          mkhludnev Mikhail Khludnev added a comment -

          Added changes; committing in two days.

          jira-bot ASF subversion and git services added a comment -

          Commit 7571e747c3506ee93d63c9bd3534254944b5caa7 in lucene-solr's branch refs/heads/master from Mikhail Khludnev
          [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=7571e74 ]

          SOLR-8208: [subquery] document transformer executes separate requests per result document.

          jira-bot ASF subversion and git services added a comment -

          Commit 75a84b7d2d0d8f0ed36efad4306e5a938cae3a2a in lucene-solr's branch refs/heads/branch_6x from Mikhail Khludnev
          [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=75a84b7 ]

          SOLR-8208: [subquery] document transformer executes separate requests per result document.

          mkhludnev Mikhail Khludnev added a comment -

          This commit adds TestSubQueryTransformerDistrib, which took 1m 34s. Let me know if that's not affordable and I need to mark it somehow. It's a copy-cat twin of DistribJoinFromCollectionTest.

          jira-bot ASF subversion and git services added a comment -

          Commit 184983280e9b47a24a448b1894b05bd97e221011 in lucene-solr's branch refs/heads/branch_6x from Mikhail Khludnev
          [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=1849832 ]

          SOLR-8208: miserable javadoc fixes

          jira-bot ASF subversion and git services added a comment -

          Commit 470ba0794ecddd6375db3da521272dde46ed6761 in lucene-solr's branch refs/heads/master from Mikhail Khludnev
          [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=470ba07 ]

          SOLR-8208: miserable javadoc fixes

          mkhludnev Mikhail Khludnev added a comment -

          TLDR

          OK, some notes about an attempt to use separate threads to invoke EmbeddedSolrServer.
          The first obstacle: it doesn't fully solve cleaning up SolrRequestInfo (the problem being that EmbeddedSolrServer can't be invoked from Solr threads where SolrRequestInfo is set). So even if we introduce a thread pool to call EmbeddedSolrServer, SolrRequestInfo.clear() still has to be called beforehand, because the thread pool inherits SolrRequestInfo. Sic.

          The real problem comes later, when randomized tests run. EmbeddedSolrServer internals are larded with randomized moving parts, so when it requests a randomized context, the context is obtained from a mapping. But the key of that map is a thread group, and DefaultSolrFactory assigns the thread group from the thread which loaded the thread pool, and that was the thread pool used to launch the first test in the suite, which has usually finished already. That makes the randomized context unavailable for later tests that request EmbeddedSolrServer. That's a bummer. No way. Don't ever do this.

          dsmiley David Smiley added a comment -

          ok some notes about an attempt to use separate threads to invoke EmbeddedSolrServer.

          Interesting. Nonetheless, it seems we shouldn't let our test infrastructure prevent us from doing what we want to do; it can be changed. This is an idealistic statement, and maybe it isn't sufficiently worth it, but it depends.

          mkhludnev Mikhail Khludnev added a comment -

          I think the test hurdles can be worked around after all. Probably the more pressing problem is the prohibition on calling EmbeddedSolrServer from Solr threads; it's just not intended for such recursive requests. Let's address it in SOLR-9101.

          steve_rowe Steve Rowe added a comment - - edited

          Here's a reproducible failure of TestSubQueryTransformerDistrib on my Jenkins:

          Checking out Revision 9d5b834b09d4ff23e89755e5d1af407a2bd96c16 (refs/remotes/origin/master)
          [...]
             [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestSubQueryTransformerDistrib -Dtests.method=test -Dtests.seed=A6B6D43AC01C202D -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/home/jenkins/lucene-data/enwiki.random.lines.txt -Dtests.locale=de-LU -Dtests.timezone=Pacific/Tongatapu -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
             [junit4] ERROR   55.9s J7 | TestSubQueryTransformerDistrib.test <<<
             [junit4]    > Throwable #1: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:54181: Cannot create collection departments. Value of maxShardsPerNode is 12, and the number of nodes currently live or live and part of your createNodeSet is 5. This allows a maximum of 60 to be created. Value of numShards is 6 and value of replicationFactor is 12. This requires 72 shards to be created (higher than the allowed number)
             [junit4]    > 	at __randomizedtesting.SeedInfo.seed([A6B6D43AC01C202D:2EE2EBE06EE04DD5]:0)
             [junit4]    > 	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:606)
             [junit4]    > 	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:259)
             [junit4]    > 	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
             [junit4]    > 	at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1592)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1549)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1604)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1545)
             [junit4]    > 	at org.apache.solr.response.transform.TestSubQueryTransformerDistrib.test(TestSubQueryTransformerDistrib.java:64)
             [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
             [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
             [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
             [junit4]   2> 743644 INFO  (SUITE-TestSubQueryTransformerDistrib-seed#[A6B6D43AC01C202D]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
             [junit4]   2> NOTE: leaving temporary files on disk at: /var/lib/jenkins/jobs/Lucene-Solr-Nightly-master/workspace/solr/build/solr-core/test/J7/temp/solr.response.transform.TestSubQueryTransformerDistrib_A6B6D43AC01C202D-001
             [junit4]   2> May 13, 2016 7:06:27 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
             [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
             [junit4]   2> NOTE: test params are: codec=Asserting(Lucene62): {}, docValues:{}, maxPointsInLeafNode=772, maxMBSortInHeap=6.297414713628615, sim=RandomSimilarity(queryNorm=true,coord=no): {}, locale=de-LU, timezone=Pacific/Tongatapu
             [junit4]   2> NOTE: Linux 4.1.0-custom2-amd64 amd64/Oracle Corporation 1.8.0_77 (64-bit)/cpus=16,threads=1,free=267803120,total=527433728
             [junit4]   2> NOTE: All tests run in this JVM: [CoreMergeIndexesAdminHandlerTest, TestIBSimilarityFactory, AnalyticsMergeStrategyTest, SolrIndexSplitterTest, SolrPluginUtilsTest, DocumentBuilderTest, TestQueryTypes, BlockJoinFacetDistribTest, TestReplicationHandlerBackup, BadIndexSchemaTest, DistanceUnitsTest, CleanupOldIndexTest, OverseerRolesTest, DocValuesTest, DistributedFacetPivotSmallAdvancedTest, TestReloadAndDeleteDocs, TestSolrJ, TestPHPSerializedResponseWriter, TlogReplayBufferedWhileIndexingTest, TestSha256AuthenticationProvider, TestFaceting, DeleteStatusTest, TestSubQueryTransformerDistrib]
             [junit4] Completed [218/597 (3!)] on J7 in 56.17s, 1 test, 1 error <<< FAILURES!
          

          and another one from ASF Jenkins a few days back: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1011/

          Checking out Revision 470ba0794ecddd6375db3da521272dde46ed6761 (refs/remotes/origin/master)
          [...]
             [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestSubQueryTransformerDistrib -Dtests.method=test -Dtests.seed=594D23296C3F97B8 -Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true -Dtests.linedocsfile=/x1/jenkins/lucene-data/enwiki.random.lines.txt -Dtests.locale=es-US -Dtests.timezone=Etc/GMT-1 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
             [junit4] ERROR   59.8s J0 | TestSubQueryTransformerDistrib.test <<<
             [junit4]    > Throwable #1: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:43474/ldvl/gb: Cannot create collection departments. Value of maxShardsPerNode is 10, and the number of nodes currently live or live and part of your createNodeSet is 5. This allows a maximum of 50 to be created. Value of numShards is 6 and value of replicationFactor is 10. This requires 60 shards to be created (higher than the allowed number)
             [junit4]    > 	at __randomizedtesting.SeedInfo.seed([594D23296C3F97B8:D1191CF3C2C3FA40]:0)
             [junit4]    > 	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:606)
             [junit4]    > 	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:259)
             [junit4]    > 	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
             [junit4]    > 	at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1592)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1549)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1604)
             [junit4]    > 	at org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:1545)
             [junit4]    > 	at org.apache.solr.response.transform.TestSubQueryTransformerDistrib.test(TestSubQueryTransformerDistrib.java:64)
             [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
             [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
             [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
             [junit4]   2> 4199324 INFO  (SUITE-TestSubQueryTransformerDistrib-seed#[594D23296C3F97B8]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
             [junit4]   2> NOTE: leaving temporary files on disk at: /x1/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/solr/build/solr-core/test/J0/temp/solr.response.transform.TestSubQueryTransformerDistrib_594D23296C3F97B8-001
             [junit4]   2> May 11, 2016 5:31:49 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
             [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
             [junit4]   2> NOTE: test params are: codec=Asserting(Lucene60): {}, docValues:{}, maxPointsInLeafNode=2044, maxMBSortInHeap=7.8998054031288465, sim=ClassicSimilarity, locale=es-US, timezone=Etc/GMT-1
             [junit4]   2> NOTE: Linux 3.13.0-85-generic amd64/Oracle Corporation 1.8.0_74 (64-bit)/cpus=4,threads=1,free=275515608,total=532676608
             [junit4]   2> NOTE: All tests run in this JVM: [TestExceedMaxTermLength, DistributedSpellCheckComponentTest, TestComponentsName, TestSha256AuthenticationProvider, TestGraphTermsQParserPlugin, AddSchemaFieldsUpdateProcessorFactoryTest, TestFunctionQuery, TestConfigSetImmutable, CursorPagingTest, XsltUpdateRequestHandlerTest, JsonLoaderTest, TestRawResponseWriter, TestPseudoReturnFields, PrimUtilsTest, LeaderElectionIntegrationTest, ParsingFieldUpdateProcessorsTest, TestInitParams, TestIBSimilarityFactory, TestCustomDocTransformer, SegmentsInfoRequestHandlerTest, TestTrackingShardHandlerFactory, PreAnalyzedFieldTest, CloneFieldUpdateProcessorFactoryTest, TestLRUStatsCache, CurrencyFieldOpenExchangeTest, TestLRUCache, TestConfigReload, DistributedQueryComponentOptimizationTest, TestConfigSetsAPIExclusivity, BlockDirectoryTest, JSONWriterTest, TestDistributedSearch, TolerantUpdateProcessorTest, TestLuceneMatchVersion, TestRandomFaceting, HdfsChaosMonkeyNothingIsSafeTest, RequiredFieldsTest, TestLMDirichletSimilarityFactory, TestUseDocValuesAsStored, RulesTest, TestRandomDVFaceting, PeerSyncTest, TestSSLRandomization, TestRangeQuery, TestOrdValues, SmileWriterTest, PathHierarchyTokenizerFactoryTest, LeaderInitiatedRecoveryOnCommitTest, TestSchemaVersionResource, TestCloudBackupRestore, TestCustomSort, SharedFSAutoReplicaFailoverTest, SolrRequestParserTest, SolrCoreTest, AlternateDirectoryTest, StandardRequestHandlerTest, CachingDirectoryFactoryTest, TestShardHandlerFactory, TestPhraseSuggestions, TestReplicationHandler, TestHighlightDedupGrouping, TestSolrQueryParserResource, DistributedQueryElevationComponentTest, CdcrVersionReplicationTest, TestConfig, DistribJoinFromCollectionTest, TestCSVLoader, ConcurrentDeleteAndCreateCollectionTest, SOLR749Test, OverseerTaskQueueTest, OutputWriterTest, TestJoin, AnalyticsQueryTest, ResponseLogComponentTest, TestRTGBase, ForceLeaderTest, TestJsonRequest, TestOverriddenPrefixQueryForCustomFieldType, IndexSchemaTest, DateRangeFieldTest, ActionThrottleTest, TestBinaryField, DeleteStatusTest, BasicZkTest, CdcrReplicationDistributedZkTest, IndexSchemaRuntimeFieldTest, BlockJoinFacetDistribTest, TestBadConfig, PolyFieldTest, TestDynamicFieldCollectionResource, SpatialRPTFieldTypeTest, CollectionTooManyReplicasTest, TestStressLucene, TestDistribDocBasedVersion, TestMergePolicyConfig, CoreMergeIndexesAdminHandlerTest, DistributedFacetPivotLargeTest, SSLMigrationTest, SaslZkACLProviderTest, ShardSplitTest, TestRequestForwarding, TestRequestStatusCollectionAPI, TestTolerantUpdateProcessorCloud, TlogReplayBufferedWhileIndexingTest, HdfsTlogReplayBufferedWhileIndexingTest, TestRestoreCore, DistributedDebugComponentTest, DistributedFacetPivotSmallTest, DistributedSuggestComponentTest, FacetPivotSmallTest, SpatialHeatmapFacetsTest, TestExpandComponent, WrapperMergePolicyFactoryTest, TestIntervalFaceting, TestGraphMLResponseWriter, TestSortingResponseWriter, TestChildDocTransformer, TestSubQueryTransformerDistrib]
             [junit4] Completed [549/597 (8!)] on J0 in 60.44s, 1 test, 1 error <<< FAILURES!
          
          mkhludnev Mikhail Khludnev added a comment -

          I'll fix it in a few hours. Is there a quick hint how to? I remember something about setting numShards, but I'd appreciate a suggestion.

          mkhludnev Mikhail Khludnev added a comment -

          Setting reasonable numbers for createCollection(people, 2, 1, 10); see the attached
          SOLR-8208-distrib-test-fix.patch.
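          For reference, a minimal sketch of the capacity rule behind the failures above: numShards * replicationFactor must not exceed maxShardsPerNode * liveNodes. The values below mirror the fixed call quoted above and the error messages; the class and variable names are illustrative only.

          // Sketch of the constraint the "Cannot create collection" error enforces.
          class ShardCapacitySketch {
            public static void main(String[] args) {
              int liveNodes = 5;          // nodes in the test cluster, per the error messages
              int numShards = 2;          // shards requested for the "people" collection after the fix
              int replicationFactor = 1;  // replicas per shard after the fix
              int maxShardsPerNode = 10;  // per-node cap after the fix

              int requested = numShards * replicationFactor;  // 2 cores needed
              int allowed = maxShardsPerNode * liveNodes;     // 50 cores allowed
              if (requested > allowed) {
                throw new IllegalStateException("Cannot create collection: needs " + requested
                    + " cores but only " + allowed + " are allowed");
              }
              // The failing runs violated this rule: 6 * 12 = 72 > 12 * 5 = 60, and 6 * 10 = 60 > 10 * 5 = 50.
            }
          }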

          jira-bot ASF subversion and git services added a comment -

          Commit 3b0a79a13ee77c867576edcfb82477ee0ea65db6 in lucene-solr's branch refs/heads/master from Mikhail Khludnev
          [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=3b0a79a ]

          SOLR-8208: fixing TestSubQueryTransformerDistrib by passing reasonable numbers in creatCollection()

          jira-bot ASF subversion and git services added a comment -

          Commit e94ffde44e654b9b68a7d3a9d37db3fb0f66ba20 in lucene-solr's branch refs/heads/branch_6x from Mikhail Khludnev
          [ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=e94ffde ]

          SOLR-8208: fixing TestSubQueryTransformerDistrib by passing reasonable numbers in creatCollection()


            People

            • Assignee: mkhludnev Mikhail Khludnev
            • Reporter: mkhludnev Mikhail Khludnev
            • Votes: 0
            • Watchers: 12
