Computing value size
julien.genestoux at gmail.com
Fri Apr 26 06:08:38 EDT 2013
We (Superfeedr) are currently diving into Riak to make it the key value
store for our Google Reader API.
So far, so good. However, as we run tests, we're trying to estimate
the storage we'll need from Riak.
For that, we obviously need the average size of a stored value.
Of course we could run "estimates", but the data we store is actually quite
diverse and doesn't have a consistent
schema, so we thought it's probably easier/better to measure "real" data
rather than rely on theoretical figures.
We tried the top-down approach: total disk space / (n_val * number of keys),
but that's not ideal because
we also store different types of objects...
Is there any way to do this bottom-up, with a map/reduce job?
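For context, here is a minimal sketch of what such a job could look like: a map
phase that emits each object's value size and a reduce phase that sums them.
The `v.values[0].data` shape follows Riak's JavaScript object layout; the mock
objects and the final averaging step are illustrative assumptions, not output
from a real cluster.

```javascript
// Map phase: runs once per Riak object, emits the length of the
// first sibling's data. (Assumes no siblings need special handling.)
function mapSize(v) {
  return [v.values[0].data.length];
}

// Reduce phase: sums the sizes emitted by the map phase.
function reduceSum(values) {
  return [values.reduce(function (a, b) { return a + b; }, 0)];
}

// Local demonstration with mock objects shaped like Riak's JS values:
var objects = [
  { values: [{ data: "abc" }] },         // 3 bytes
  { values: [{ data: "hello world" }] }  // 11 bytes
];
var sizes = objects.map(function (o) { return mapSize(o)[0]; });
var total = reduceSum(sizes)[0];          // 14
var average = total / objects.length;     // 7
console.log(total, average);
```

Dividing the reduced total by the number of keys (which you already know, or
can get from a second counting phase) then gives the average value size.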
Got a blog? Make following it simple: https://www.subtome.com/
+1 (415) 830 6574
+33 (0)9 70 44 76 29