
Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate

    Description

      Adding a delete operation to the linked list would make it possible to detect deleted data coming back. We could create something similar to the walker, a client that does the following (a rough sketch is given after the list).

      1. selects a random node X
      2. follows the linked list for a random number of steps and stops at node Y
      3. makes X point to Y
      4. deletes all nodes that were between X and Y in the list
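
      A rough sketch of what such a delete client might look like is below. This is only a sketch: the LinkedListTable interface and its methods (pickRandomNode, readPointer, writeNode, deleteNode, flush) are hypothetical stand-ins for however the real test reads and mutates the table, and edge cases such as walking off the end of the list are ignored.

        import java.util.ArrayList;
        import java.util.List;
        import java.util.Random;

        /**
         * Sketch of a delete client for the continuous ingest linked list.
         * LinkedListTable is a hypothetical stand-in for the table access
         * the real test would use.
         */
        public class DeleteWalker {

          interface LinkedListTable {
            long pickRandomNode();                     // select an existing node at random
            long readPointer(long node);               // node that 'node' points to
            void writeNode(long node, long pointsTo);  // rewrite a node's pointer
            void deleteNode(long node);                // delete a node
            void flush();                              // flush pending mutations
          }

          private final LinkedListTable table;
          private final Random rand = new Random();

          DeleteWalker(LinkedListTable table) {
            this.table = table;
          }

          void deleteRandomRange(int maxSteps) {
            // 1. select a random node X
            long x = table.pickRandomNode();

            // 2. follow the linked list a random number of steps,
            //    remembering the nodes visited, and stop at node Y
            List<Long> between = new ArrayList<>();
            long y = x;
            int steps = 1 + rand.nextInt(maxSteps);
            for (int i = 0; i < steps; i++) {
              long next = table.readPointer(y);
              if (i > 0) {
                between.add(y); // nodes strictly between X and Y
              }
              y = next;
            }

            // 3. make X point to Y, and flush before any delete so that a
            //    kill at any point never leaves a dangling reference
            table.writeNode(x, y);
            table.flush();

            // 4. delete each node that was between X and Y, flushing after
            //    every delete so the client can be killed at any time
            for (long node : between) {
              table.deleteNode(node);
              table.flush();
            }
          }
        }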

      For example, given the following linked list

         7->5->29->13->19->23->17
      

      If 5 were picked as the first node and 23 as the last node, then the following operations would be done.

      1. write 5->23
      2. flush
      3. delete 29
      4. flush
      5. delete 13
      6. flush
      7. delete 19
      8. flush
      9. do a batch read and/or scan to verify the deletes

      If 29 or 13 should come back, then the nodes they point to would not exist, and verification would catch this as a reference to a missing node (see the sketch below). I think the operations above are ordered in such a way that the delete client could be killed at any time without leaving the list in a state that verification would flag.
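
      A small sketch of that check, assuming the batch read / scan results have already been collected into plain Java collections; the class, method, and parameter names are made up for illustration.

        import java.util.Map;
        import java.util.Set;

        /**
         * Sketch of the check that catches resurrected nodes: every pointer
         * in the table must reference a node that still exists. If a deleted
         * node like 29 or 13 comes back, it points at another deleted node
         * and shows up here as an undefined reference.
         */
        public class DeleteVerifier {

          /** Returns the number of undefined references found. */
          static long countUndefinedReferences(Map<Long,Long> nodeToPointer, Set<Long> existingNodes) {
            long undefined = 0;
            for (Map.Entry<Long,Long> e : nodeToPointer.entrySet()) {
              Long target = e.getValue();
              if (target != null && !existingNodes.contains(target)) {
                System.err.printf("undefined reference: node %d points to missing node %d%n",
                    e.getKey(), target);
                undefined++;
              }
            }
            return undefined;
          }
        }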

      Since continuous ingest works with random numbers, there is a small chance that the delete client could delete a node just written by another client. With 63-bit random numbers this chance is exceedingly small. Should it occur, the person debugging should be able to sort it out by looking at the write-ahead logs. Therefore I do not think it's worthwhile taking any action in the test.
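
      To put a rough number on "exceedingly small": assuming node IDs are drawn uniformly from the 63-bit space, and that n nodes are written while the delete client removes k nodes, the expected number of such collisions is roughly

        \mathbb{E}[\text{collisions}] \approx \frac{n \cdot k}{2^{63}},
        \qquad \text{e.g. } n = 10^{9},\ k = 10 \;\Rightarrow\; \approx 1.1 \times 10^{-9}

      The n and k values here are made-up illustrative figures, not measurements from the test.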


          People

            Assignee: Unassigned
            Reporter: Keith Turner (kturner)
