Solr / SOLR-11872

Refactor test infra to work with a managed SolrClient; ditch TestHarness

Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Tests

    Description

      This is a proposal to substantially refactor SolrTestCaseJ4 and some of its intermediate subclasses in the hierarchy.  In essence, I envision that tests should work with a SolrClient-typed "solrClient" field managed by the test infrastructure. With only a few lines of code, a test should be able to pick between an instance based on EmbeddedSolrServer (lighter tests), HttpSolrClient (tests HTTP/Jetty behavior directly or indirectly), SolrCloud, and perhaps a special one for our distributed search tests. STCJ4 would refactor its methods to use the solrClient field instead of TestHarness. TestHarness would disappear as such; bits of its existing code would migrate elsewhere, such as to manage an EmbeddedSolrServer for testing.
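      A Java-flavored pseudocode sketch of the envisioned test shape (every class, method, and field name below is invented for illustration; nothing like this exists yet):

```java
// Hypothetical future shape -- "SolrTestCase", "useEmbeddedSolr", "sdoc",
// and the managed "solrClient" field are all assumptions of this proposal.
public class MyFeatureTest extends SolrTestCase {

  @BeforeClass
  public static void chooseSolr() {
    // one line picks the backing Solr; alternatives might be
    // useHttpSolr(...) or useSolrCloud(...)
    useEmbeddedSolr("collection1");
  }

  @Test
  public void testBasicQuery() throws Exception {
    // "solrClient" is managed by the test infrastructure:
    // no TestHarness, no hand-built XML
    solrClient.add(sdoc("id", "1", "name", "hello"));
    solrClient.commit();
    assertEquals(1, solrClient.query(params("q", "name:hello"))
                              .getResults().getNumFound());
  }
}
```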

      I think we can do a transition like this in stages, and furthermore with minimal impact on most tests, by adding some deprecated shims. Perhaps STCJ4 should become the deprecated shim, so that users can still use it during 7.x and to help us with the transition internally too. More specifically, we'd add a new superclass to STCJ4 that is the future – "SolrTestCase".

      Additionally, there are a bunch of methods on SolrTestCaseJ4 that I question the design of, especially ones that return XML strings like delI (generates a delete-by-id XML string) and adoc. Perhaps that used to be a fine idea before there was a convenient SolrClient API but we've got one now and a test shouldn't be building XML unless it's trying to test exactly that.
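      To make the contrast concrete, here is a small self-contained sketch. The `delI` helper below only mimics the kind of XML-string method referred to above (the real SolrTestCaseJ4 method may differ in detail), and the SolrClient calls in the comment are the standard SolrJ equivalents:

```java
import java.util.Locale;

// Illustrative contrast only: old-style helpers make the test itself
// assemble protocol XML; the SolrClient API makes the wire format an
// implementation detail.
public class DeleteStyles {

    /** Old style (mimicking SolrTestCaseJ4's delI): build delete-by-id XML by hand. */
    public static String delI(String id) {
        return String.format(Locale.ROOT, "<delete><id>%s</id></delete>", id);
    }

    // New style (sketch): the same intent through the SolrClient API, e.g.
    //   solrClient.deleteById("42");
    //   solrClient.commit();
    // No XML appears in the test at all, so the serialization format
    // (XML, javabin, JSON) stops mattering to the test.

    public static void main(String[] args) {
        System.out.println(delI("42"));  // prints the hand-built XML string
    }
}
```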

      For consulting work I once developed a JUnit4 TestRule managing a SolrClient that is declared in a test with an annotation of @ClassRule. I had a variation for SolrCloud and EmbeddedSolrServer that was easy for a test to choose. Since TestRule is an interface, I was able to make a special delegating SolrClient subclass that implements TestRule. This isn't essential but makes use of it easier since otherwise you'd be forced to call something like getSolrClient(). We could go the TestRule route here, which I prefer (with or without having it subclass SolrClient), or we could alternatively do TestCase subclassing to manage the lifecycle.
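      The shape of that trick can be shown with a self-contained analogue. In the real version, "Lifecycle" would be JUnit 4's TestRule (an interface, hence mixable into a class hierarchy) and "Client" would be SolrJ's abstract SolrClient; the names below are stand-ins for illustration:

```java
// Self-contained analogue of the delegating-client-as-rule pattern.
public class RuleClientSketch {

    interface Lifecycle {                 // stand-in for org.junit.rules.TestRule
        void before() throws Exception;
        void after();
    }

    static abstract class Client {        // stand-in for SolrClient
        abstract String query(String q);
    }

    /** Because Lifecycle is an interface, a Client subclass can implement it,
     *  letting a test declare one @ClassRule field and use it directly as a
     *  client -- no getSolrClient() indirection. */
    static class ManagedClient extends Client implements Lifecycle {
        private Client delegate;

        @Override public void before() {
            // real version: start an EmbeddedSolrServer / MiniSolrCloudCluster here
            delegate = new Client() {
                @Override String query(String q) { return "results-for:" + q; }
            };
        }

        @Override public void after() {
            delegate = null;              // real version: close the client/server
        }

        @Override String query(String q) {
            return delegate.query(q);     // delegate all client calls
        }
    }

    public static void main(String[] args) throws Exception {
        ManagedClient client = new ManagedClient();
        client.before();                  // JUnit would drive this via @ClassRule
        System.out.println(client.query("*:*"));
        client.after();
    }
}
```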

      Initially I'm just looking for agreement and refinement of the approach. After that, sub-tasks ought to be added. I won't have time to work on this for some time.

      Attachments

        Issue Links

          There are no Sub-Tasks for this issue.

          Activity

            gus Gus Heck added a comment -

            Refactor and document... How to use the new refactored version should be covered in the Ref Guide. Presently I wind up reading other tests to find out how things work, and very often copy/paste/modify other tests to get going. It would be nice if one could refer to documentation that pointed out the options available and gave some advice on when to use what.

             a test shouldn't be building XML unless it's trying to test exactly that.

            Yes, Read/Write of any writer type should be thoroughly tested in a single place and then presumed to work elsewhere, so it shouldn't matter which is used by the test (could be randomized?).

            Also I think it should ideally be easy to flip between:

            1. Using solrj API and letting it do whatever it happens to prefer (the most common case)
            2. Explicitly sending V1-style requests
            3. Explicitly sending V2-style requests

            However we do things, it has to be easy for the test writer to see/understand that they HAVE exercised both APIs, and to get printouts of exactly what was sent as a post body, headers, request line, etc. For testing API command implementations it's pretty key to make sure all the bases are covered, and that the v1/v2 conversions and property substitutions are happening properly behind the scenes. Other usages possibly don't care as much... which is something to think about too.

            Also, it seems that there are about 30 tests that turn off SSL to avoid random keystore errors... that also needs to be fixed, though probably a separate issue.

            dsmiley David Smiley added a comment -

            Not to take away from your points about v1/v2, but I think that's a separate issue from the scope in the description, which will be a lot to deal with as it is.

            Good point about documentation. I also search for existing tests to know what to do; there really isn't any other way at the moment. It can be hard to keep documentation up to date. One way that I think worked well in Lucene-spatial-extras is to have a particular real test case that is especially well documented. Then link to this test-case in various places. Perhaps this would be a sub-task of this issue.


            hossman Chris M. Hostetter added a comment -

            With only a few lines of code, a test should be able to pick between an instance based on EmbeddedSolrServer (lighter tests), HttpSolrClient (tests HTTP/Jetty behavior directly or indirectly), SolrCloud,

            I'd go even farther and say ideally it should be possible for a test to say "using configset=XXX, give me a SolrClient, I don't care what type" for tests that want to verify that the updates/queries/asserts they run behave consistently regardless of whether they are used against a single node/core, or a cloud cluster with that collection using 1 shard, or 10 shards.

            Having that, and having explicit asserts of the expected behavior (regardless of shard count), would be vastly superior to a lot of our existing "distributed" tests that just assert that the response bodies of 2 identical queries done in parallel to single/multi-node Solr instances "match" – w/o ever verifying that those responses contain anything of substance (let alone the correct substance)

            Additionally, there are a bunch of methods on SolrTestCaseJ4 that I question the design of, especially ones that return XML strings like delI (generates a delete-by-id XML string) and adoc. Perhaps that used to be a fine idea before there was a convenient SolrClient API but we've got one now and a test shouldn't be building XML unless it's trying to test exactly that.

            I can confirm I wrote all of those methods way, way, way back in the Solr 1.0 days when XML was the only way to talk to Solr.  And while I generally agree with you on this point (and would love to see methods like delI, delQ, & adoc be phased out in favor of "shorthand macros" that do similar things but return actual UpdateRequest objects for passing to a SolrClient), I would like to point out that one advantage of things like assertQ and the use of xpaths is that I find it helps me write assertions much less verbosely than when I write Cloud-based tests of similar functionality – the trade-off being that then you have to know XPath, and for people who don't intimately understand the syntax, that brevity can be a hindrance to understanding/maintaining those tests.

            I guess my point is that as we look towards ways to re-write tests using "real" SolrJ-supported SolrClient-based APIs, and start looking at what some "real" converted tests would actually look like, it would be worthwhile to constantly keep an eye out for what kinds of helper functionality we might want to include in our test base classes to reduce the verbosity needed when drilling down into response documents to assert expected results. (But alas, I don't have any concrete suggestions on what that might look like at this point.)

             


            Refactor and document... How to use the new refactored version should be covered in the Ref Guide. ...

            Whoa... I strongly disagree with this idea: the ref guide is for end users of Solr – not Java developers maintaining Solr tests or attempting to subclass test framework base classes to write their own unit tests. This is exactly what javadocs are for. If something doesn't seem like a good fit for a specific class's javadoc, it should go in the package javadocs. (Or, worst-case scenario, in ./doc-files w/ a link from the relevant package/class/method javadocs.)

            gus Gus Heck added a comment -

            Whoa... I strongly disagree with this idea: the ref guide is for end users of solr

            People who write SolrJ code that interacts with Solr sometimes use these test classes to power their own tests (running against their schema/etc)... I think our end users are almost always developers of one level or another. How to quickly and easily write a unit test against an embedded Solr (which class to use, what to do and not do, a copy/paste starter example, etc.) is something I think a segment of end users who use SolrJ might be happy to find in the ref guide, but sure, better javadocs would be great too. Personally, for projects I do, I often pursue a more strictly unit-test / mock-object style, with fewer of these sorts of integrated tests (using embedded servers), but I know there are lots of folks who just love embedded-server tests (for a variety of servers, not just Solr).

             

            erickerickson Erick Erickson added a comment -

            bq: I think our end users are almost always developers of one level or another.

            This is not true in my experience. I interact with a lot of organizations where they're using Solr without writing a single line of Java code. And even in the organizations where there are some devs, after they get things working the project is often thrown over the wall to the operations people who want to do things like create collections, add replicas, change the schema and re-index, troubleshoot performance, troubleshoot why queries aren't returning expected results etc.. You know, keep Solr actually running day in and day out. All without necessarily knowing how to read Java, much less write unit tests.

            Now that said, for a dev to try to start developing Solr code, writing plugins, whatever is more difficult than it needs to be. "Dive in to the code and figure it out" is the advice I often have to give. "Start with a unit test" is another way to go. It would make a lot of sense to have a "Developer's Guide" aimed at, well, developers in addition to the end-user Reference Guide. That's where this kind of documentation should be done, not intermixed with the rest of the Reference Guide.

            Organizationally I don't particularly care if we have two separate guides or they're two clearly separate halves of a single large one. I do care that we don't conflate the two; these are two very different audiences. I'm pretty sure there's content in the Reference Guide that would be a better fit in a Developer's Guide as well.

            gus Gus Heck added a comment -

            Ok, that's fair; my perspective is probably skewed by things I have been involved in. I did have one client who was using PHP for indexing, but mostly my work has been with teams of Java folks, or with me as "the guy". I don't care much if it's separate or segregated, I just think it ought to exist at a level beyond javadocs, somewhere where examples are easy to read/maintain and people don't have to dig to find it. Package-level javadocs, while a good idea in theory, are so rarely done well that I hardly ever look at them. 90% of the time it's just blank anyway, so often I forget they even exist.

            noble.paul Noble Paul added a comment -

            Is there a way to split this into smaller tasks?

            Ideally, we should write everything against an API. I mean no direct xml, json etc. That means most of the tests will need to be rewritten. This is a huge undertaking. We will have to collaborate and divide the work. I'll be happy to pitch in.

            noble.paul Noble Paul added a comment -

            what some "real" converted tests would actually look like

            an example here

            This test uses standard SolrJ APIs and a "JSON path syntax" to validate the response. It currently works for both JSON/javabin. I'm not saying it is perfect, but it is one example that came to mind.

            dsmiley David Smiley added a comment -

            You refer to assertResponseValues specifically, added by you very recently?  Yeah that looks good – something like assertJQ but which takes a SolrResponseBase instead of XML, and it appears you have done that.

            I'm very glad you are interested in helping noble.paul and yes I think we can split this up some.  I've got an important trip to prepare for that will consume my time for the next 7 days or so but after I can take a stab at outlining some tasks and their scopes for discussion here.

            dsmiley David Smiley added a comment -

            I think there are basically two tasks:

            (A) A SolrProvider test utility.  It provides SolrClient instances, and will take care to close them when tests end.  The biggest part of this is that it provides mechanisms to choose what sort of backing Solr: standalone embedded, standalone HTTP embedded, SolrCloud embedded, and perhaps eventually (not now) "external", which could be used for e.g. Docker or bin/solr.  The test case would choose a specific type if it really cares, but otherwise the choice is by system property, or finally randomly but biased toward fast standalone embedded.  TBD whether this SolrProvider is a JUnit Rule, but I'd prefer to start this way.  Alternatively, SolrTestCase in particular would provide methods for it, similar to how SolrCloudTestCase offers a Builder and some utility methods.  The main thing is that it's opt-in; some simple unit tests don't even need Solr.  This functionality may borrow ideas & code from SolrCloudTestCase, SolrJettyTestBase, and EmbeddedSolrServerTestBase.  Eventually those classes should change to use this SolrProvider.  I personally have conceptually done this task in some shape or form twice, including just recently, thus I elect to do this task now.

            (B) Move the "good parts" of SolrTestCaseJ4 (STCJ4) to SolrTestCase (STC), its superclass, and consider STCJ4 deprecated. The "bad parts" of STCJ4 are the TestHarness instance (and thus all the methods that indirectly touch it) and all the XML-specific methods. I don't see the point of what we have today wherein we have both STC and STCJ4; thus this plan involves migrating to STC as the new common base class – the one we've had all along but barely knew it. Changes should be small enough such that 3rd-party solr-test-framework users won't have to change in 8.x. Intermediate base classes (e.g. SolrCloudTestCase) should migrate over to subclass STC. Ideally, as many subclasses of STCJ4 should switch to STC as is easy to do.

            It'd be nice if (A) is done first but it doesn't strictly matter. Once (A) is done, the reach of (B) can extend to more tests. For example, EmbeddedSolrServerTestBase must stick with STCJ4 until it has a SolrProvider to switch to.

            Any opinions here or shall I commence? I'm itching to start on a SolrProvider. Obviously I'll show a PR to show how it's used, perhaps by a few converted tests to show the gist.
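            The selection policy sketched in (A) — explicit choice wins, then a system property, else a random pick biased toward the fast embedded mode — could look roughly like this (all names here are assumptions, not an existing API):

```java
import java.util.Locale;
import java.util.Random;

// Hypothetical sketch of SolrProvider's mode-selection policy.
public class SolrProviderSketch {

    enum Mode { EMBEDDED, HTTP, CLOUD }

    static Mode chooseMode(Mode explicit, String sysProp, Random random) {
        if (explicit != null) {
            return explicit;                          // the test really cares
        }
        if (sysProp != null) {
            return Mode.valueOf(sysProp.toUpperCase(Locale.ROOT));
        }
        // random fallback, biased: embedded twice as likely as the others
        Mode[] weighted = {Mode.EMBEDDED, Mode.EMBEDDED, Mode.HTTP, Mode.CLOUD};
        return weighted[random.nextInt(weighted.length)];
    }

    public static void main(String[] args) {
        // explicit choice overrides the system property
        System.out.println(chooseMode(Mode.CLOUD, "http", new Random()));  // CLOUD
        // system property used when the test doesn't care
        System.out.println(chooseMode(null, "http", new Random()));        // HTTP
    }
}
```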


            gerlowskija Jason Gerlowski added a comment -

            Sounds like a reasonable plan of attack to me. I'm excited to see what this will look like.

            dsmiley David Smiley added a comment -

            I'm hoping to get the interest of a GSoC student...

            GSoC 2021 Readme

            Apache Solr is a popular search engine, and is an old and successful project. The foundation of the test infrastructure was written in a way that we'd rather not keep, with the benefit of years of hindsight. In particular, it was written before there was a SolrClient abstraction. The project has a plan to revamp the tests to use the SolrClient abstraction and thus make the test code look more like real Solr client-side code, and also enable tests to run in a variety of modes (standalone Solr, SolrCloud, Docker-ized Solr). Some design and judgement will be needed to explore some of the particulars that are TBD, and then a lot of work to update many of the tests. There are some follow-on / stretch goals: one is ensuring that there's a Docker-based test mode that can run from the Gradle build and that would run in CI. Another is exploring how tests might easily define configuration aspects in the code instead of polluting more and more test configuration files that are difficult to maintain. This revolution in test infrastructure is highly appreciated work by not just Solr developers but anyone who writes Solr plugins that use the framework!

            Difficulty: Medium
            – The most difficult aspect is in digesting a large codebase, even if mostly limited to the test aspect.  The actual work/code is probably not difficult.  There will be many tests to potentially update.
            Language: Java
            Mentor: David Smiley

            afrinjaman Afrin Jaman added a comment - - edited

            Hello, everyone. I am Afrin Jaman and I am from Bangladesh. I have been selected for the first phase of the Outreachy internship. To be selected as an Outreachy intern I need to pick a project and record my first contribution. I have worked with Apache previously and this project looks interesting to me. I am willing to work with Apache Solr during my internship period.
            I need dsmiley's help to record my first contribution.

            Happy Coding

            joshgogz Joshua Ouma added a comment -

            Hi dsmiley, I'm interested in working on this project as an Outreachy intern. As part of the application process, applicants are required to submit a timeline of tasks for the 13-week internship period. To come up with the timeline, we are advised to discuss with our mentors. The application closing period is approaching, thus I would like to come up with the timeline soon. Are there specific tasks or factors that I should consider while coming up with the timeline?

            dsmiley David Smiley added a comment -

            Hello applicants.  Let's keep this issue free of GSOC/Outreachy program logistics. I can be reached directly for that, as the mentor. I'll send you an email Joshua.

            dsmiley David Smiley added a comment -

            Note to self/others:  a test that best illustrates where a generic/common initialization would be valuable across Cloud & Jetty is TestRawTransformer.  It manually sets up Cloud & Jetty; ideally it'd be just one configuration, and less of it.


            People

              Assignee: Unassigned
              Reporter: dsmiley David Smiley
              Votes: 0
              Watchers: 13

              Time Tracking

                Original Estimate: Not Specified
                Remaining Estimate: 0h
                Time Spent: 15.5h