We have a stress test in contrib/py_stress, and Paul has a tool that uses libcloud to automate running it against an ephemeral cluster of Rackspace cloud servers. But to really qualify as "performance regression tests" we need to:
- test a wide variety of data types (skinny rows, wide rows, different comparator types, different value sizes in bytes, etc.)
- produce pretty graphs. seriously.
- archive historical data somewhere for comparison (Rackspace can provide a VM to host a db for this, if the ASF doesn't already have something in place for this kind of thing)
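As a starting point, the "wide variety of data types" bullet amounts to enumerating a test matrix and running the stress tool once per combination. A minimal sketch of what that matrix could look like — the dimension names and values here are illustrative assumptions, not an existing py_stress interface:

```python
# Hypothetical test-matrix sketch; dimensions and values are examples,
# not actual py_stress options.
from itertools import product

ROW_SHAPES = ["skinny", "wide"]                        # few vs. many columns per row
COMPARATORS = ["BytesType", "UTF8Type", "TimeUUIDType"]
VALUE_SIZES = [34, 256, 4096]                          # value payload in bytes


def build_matrix():
    """Return one config dict per combination the harness should run."""
    return [
        {"row_shape": shape, "comparator": comp, "value_size": size}
        for shape, comp, size in product(ROW_SHAPES, COMPARATORS, VALUE_SIZES)
    ]


if __name__ == "__main__":
    # 2 shapes * 3 comparators * 3 sizes = 18 stress runs per revision
    print(len(build_matrix()))
```

Each dict would then be translated into command-line arguments for a single stress run, so adding a new dimension (e.g. replication factor) is just another axis in the product.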
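For the archiving bullet, the hosted db doesn't need to be anything elaborate: one table keyed by date and revision, with the test-matrix parameters and the measured throughput, is enough to diff runs across revisions and to feed a graphing tool. A sketch assuming SQLite and an invented schema (column names are placeholders, not an agreed format):

```python
# Hypothetical results archive; schema and column names are assumptions.
import sqlite3


def init_db(path=":memory:"):
    """Open (or create) the results database."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS stress_runs (
            run_date   TEXT,
            git_rev    TEXT,
            row_shape  TEXT,
            comparator TEXT,
            value_size INTEGER,
            ops_per_s  REAL
        )
    """)
    return conn


def record_run(conn, run):
    """Append one stress-run result to the archive."""
    conn.execute(
        "INSERT INTO stress_runs VALUES (?, ?, ?, ?, ?, ?)",
        (run["run_date"], run["git_rev"], run["row_shape"],
         run["comparator"], run["value_size"], run["ops_per_s"]),
    )
    conn.commit()


if __name__ == "__main__":
    conn = init_db()
    record_run(conn, {"run_date": "2010-05-01", "git_rev": "abc123",
                      "row_shape": "skinny", "comparator": "BytesType",
                      "value_size": 256, "ops_per_s": 12000.0})
    count, = conn.execute("SELECT COUNT(*) FROM stress_runs").fetchone()
    print(count)
```

A `SELECT ... GROUP BY git_rev` over this table is then all the "comparison" step needs, and the graphs fall out of plotting ops_per_s per revision for a fixed matrix cell.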