Apache Cassandra / CASSANDRA-12418

sstabledump JSON fails after row tombstone


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Normal
    • Resolution: Fixed
    • Fix Version/s: 3.0.9, 3.11.1
    • Component/s: Legacy/Tools
    • Labels: None
    • Severity: Normal

    Description

      sstabledump fails in JSON generation on an sstable containing a row deletion, using Cassandra 3.10-SNAPSHOT accf7a4724e244d6f1ba921cb11d2554dbb54a76 from 2016-07-26.

      There are two exceptions displayed:

      • Fatal error parsing partition: aye org.codehaus.jackson.JsonGenerationException: Can not start an object, expecting field name
      • org.codehaus.jackson.JsonGenerationException: Current context not an ARRAY but OBJECT

      Steps to reproduce:

      cqlsh> create KEYSPACE foo WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};
      cqlsh> create TABLE foo.bar (id text, str text, primary key (id));
      cqlsh> insert into foo.bar (id, str) values ('aye', 'alpha');
      cqlsh> insert into foo.bar (id, str) values ('bee', 'beta');
      cqlsh> delete from foo.bar where id = 'bee';
      cqlsh> insert into foo.bar (id, str) values ('bee', 'beth');
      cqlsh> select * from foo.bar;
      
       id  | str
      -----+-------
       bee |  beth
       aye | alpha
      
      (2 rows)
      cqlsh> 
      

      Now find the sstable:

      $ cassandra/bin/nodetool flush
      $ cassandra/bin/sstableutil foo bar
      [..]
      Listing files...
      [..]
      /home/kw217/cassandra/data/data/foo/bar-407c56f05e1a11e6835def64bf5c656e/mb-1-big-Data.db
      [..]
      

      Now check with sstabledump -d (the debug, non-JSON output). This works just fine:

      $ cassandra/tools/bin/sstabledump -d /home/kw217/cassandra/data/data/foo/bar-407c56f05e1a11e6835def64bf5c656e/mb-1-big-Data.db
      [bee]@0 deletedAt=1470737827008101, localDeletion=1470737827
      [bee]@0 Row[info=[ts=1470737832405510] ]:  | [str=beth ts=1470737832405510]
      [aye]@31 Row[info=[ts=1470737784401778] ]:  | [str=alpha ts=1470737784401778]
      

      Now run sstabledump. This should work as well, but it fails as follows:

      $ cassandra/tools/bin/sstabledump /home/kw217/cassandra/data/data/foo/bar-407c56f05e1a11e6835def64bf5c656e/mb-1-big-Data.db
      ERROR 10:26:07 Fatal error parsing partition: aye
      org.codehaus.jackson.JsonGenerationException: Can not start an object, expecting field name
      	at org.codehaus.jackson.impl.JsonGeneratorBase._reportError(JsonGeneratorBase.java:480) ~[jackson-core-asl-1.9.2.jar:1.9.2]
      	at org.codehaus.jackson.impl.WriterBasedGenerator._verifyValueWrite(WriterBasedGenerator.java:836) ~[jackson-core-asl-1.9.2.jar:1.9.2]
      	at org.codehaus.jackson.impl.WriterBasedGenerator.writeStartObject(WriterBasedGenerator.java:273) ~[jackson-core-asl-1.9.2.jar:1.9.2]
      	at org.apache.cassandra.tools.JsonTransformer.serializePartition(JsonTransformer.java:181) ~[main/:na]
      	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184) ~[na:1.8.0_77]
      	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175) ~[na:1.8.0_77]
      	at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_77]
      	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_77]
      	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_77]
      	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_77]
      	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151) ~[na:1.8.0_77]
      	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174) ~[na:1.8.0_77]
      	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_77]
      	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418) ~[na:1.8.0_77]
      	at org.apache.cassandra.tools.JsonTransformer.toJson(JsonTransformer.java:99) ~[main/:na]
      	at org.apache.cassandra.tools.SSTableExport.main(SSTableExport.java:237) ~[main/:na]
      [
        {
          "partition" : {
            "key" : [ "bee" ],
            "position" : 0,
            "deletion_info" : { "marked_deleted" : "2016-08-09T10:17:07.008101Z", "local_delete_time" : "2016-08-09T10:17:07Z" }
          }
        }
      ]org.codehaus.jackson.JsonGenerationException: Current context not an ARRAY but OBJECT
      	at org.codehaus.jackson.impl.JsonGeneratorBase._reportError(JsonGeneratorBase.java:480)
      	at org.codehaus.jackson.impl.WriterBasedGenerator.writeEndArray(WriterBasedGenerator.java:257)
      	at org.apache.cassandra.tools.JsonTransformer.toJson(JsonTransformer.java:100)
      	at org.apache.cassandra.tools.SSTableExport.main(SSTableExport.java:237)
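
      For comparison, once fixed, sstabledump would be expected to close the "bee" partition object after its rows and then emit the "aye" partition. Roughly (field layout approximate; timestamps converted from the -d dump above):

      ```json
      [
        {
          "partition" : {
            "key" : [ "bee" ],
            "position" : 0,
            "deletion_info" : { "marked_deleted" : "2016-08-09T10:17:07.008101Z", "local_delete_time" : "2016-08-09T10:17:07Z" }
          },
          "rows" : [
            {
              "type" : "row",
              "liveness_info" : { "tstamp" : "2016-08-09T10:17:12.405510Z" },
              "cells" : [ { "name" : "str", "value" : "beth" } ]
            }
          ]
        },
        {
          "partition" : {
            "key" : [ "aye" ],
            "position" : 31
          },
          "rows" : [
            {
              "type" : "row",
              "liveness_info" : { "tstamp" : "2016-08-09T10:16:24.401778Z" },
              "cells" : [ { "name" : "str", "value" : "alpha" } ]
            }
          ]
        }
      ]
      ```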
      

      If possible, please can this be fixed in the 3.0.x stream as well as trunk?
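      The two stack traces appear consistent with the partition object for "bee" never being closed: JsonTransformer.serializePartition opens a JSON object, the deletion path seems to return without the matching writeEndObject, so the next writeStartObject (for "aye") fires while still inside an OBJECT context, and the final writeEndArray in toJson then finds an OBJECT where it expects the top-level ARRAY. A minimal sketch of a streaming writer's context checks — an illustration of the failure mode, not Jackson's or Cassandra's actual code:

      ```java
      import java.util.ArrayDeque;
      import java.util.Deque;

      // Hypothetical model of a streaming JSON generator's context stack,
      // reproducing the two error messages from the bug report.
      public class ContextSketch {
          enum Ctx { ARRAY, OBJECT }
          static final Deque<Ctx> stack = new ArrayDeque<>();

          static void writeStartArray() { stack.push(Ctx.ARRAY); }

          static void writeStartObject() {
              // Inside an OBJECT, a value may only follow a field name.
              if (stack.peek() == Ctx.OBJECT)
                  throw new IllegalStateException("Can not start an object, expecting field name");
              stack.push(Ctx.OBJECT);
          }

          static void writeEndObject() {
              if (stack.peek() != Ctx.OBJECT)
                  throw new IllegalStateException("Current context not an OBJECT");
              stack.pop();
          }

          static void writeEndArray() {
              if (stack.peek() != Ctx.ARRAY)
                  throw new IllegalStateException("Current context not an ARRAY but " + stack.peek());
              stack.pop();
          }

          public static void main(String[] args) {
              writeStartArray();       // toJson: open the top-level array
              writeStartObject();      // serializePartition("bee"): open partition object
              // ... deletion path returns without writeEndObject() ...
              try {
                  writeStartObject();  // serializePartition("aye")
              } catch (IllegalStateException e) {
                  System.out.println(e.getMessage()); // first exception from the report
              }
              try {
                  writeEndArray();     // toJson cleanup
              } catch (IllegalStateException e) {
                  System.out.println(e.getMessage()); // second exception from the report
              }
          }
      }
      ```

      Adding the missing writeEndObject after the deletion branch keeps the stack balanced, which is why both errors disappear together.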


          People

            Assignee: dbrosius David Brosius
            Reporter: kw217 Keith Wansbrough
