How to best store arbitrarily large Java objects

Henning Verbeek hankipanky at gmail.com
Thu Jul 21 09:36:16 EDT 2016


I have a Java class, which is being stored in Riak. The class contains
a `TreeMap` field, amongst other fields. Out of the box, Riak is
converting the object to/from JSON. Everything works fine.

However, depending on the size of the `TreeMap`, the serialization
output can become rather large, and this limits the usefulness of my
object. In our tests, dealing with Riak objects larger than 2 MB proved
significantly slower than dealing with objects under 200 kB.

So, in order to store/fetch instances of my class with arbitrary
sizes, but with reliable performance, I believe I need to split the
output into separate Riak-objects after serialization, and reassemble
before deserialization.

My idea was to use a converter that splits the serialized JSON into
chunks during _write_, and uses links to point from one chunk to the
next. During _fetch_ the links would be traversed, the JSON string
concatenated from chunks, deserialized and the object would be
returned. Looking at `com.basho.riak.client.api.convert.Converter`, it
seems this is not going to work, since a converter maps a domain object
to and from a single Riak object.
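For what it's worth, the split/reassemble part can be sketched without
touching the Riak client at all. The following is a minimal, hypothetical
illustration: it cuts the serialized JSON into fixed-size chunks stored
under predictable keys (`baseKey + ":" + index` plus a chunk count,
standing in for links), and concatenates them back on fetch. A plain
`Map` stands in for the Riak bucket; all names here are made up for the
example, and none of this uses the actual Riak API.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class ChunkedStore {

    // Keep each stored value well below the size where fetches slowed down.
    static final int CHUNK_SIZE = 200 * 1024;

    // Write: split the JSON into chunks and store each under its own key.
    // A ":count" entry records how many chunks to read back.
    static void store(Map<String, byte[]> bucket, String baseKey, String json) {
        byte[] data = json.getBytes(StandardCharsets.UTF_8);
        int chunks = (data.length + CHUNK_SIZE - 1) / CHUNK_SIZE;
        for (int i = 0; i < chunks; i++) {
            int from = i * CHUNK_SIZE;
            int to = Math.min(from + CHUNK_SIZE, data.length);
            bucket.put(baseKey + ":" + i, Arrays.copyOfRange(data, from, to));
        }
        bucket.put(baseKey + ":count",
                   Integer.toString(chunks).getBytes(StandardCharsets.UTF_8));
    }

    // Fetch: walk the chunks in order and concatenate them back into JSON.
    static String fetch(Map<String, byte[]> bucket, String baseKey) {
        int chunks = Integer.parseInt(
                new String(bucket.get(baseKey + ":count"), StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < chunks; i++) {
            sb.append(new String(bucket.get(baseKey + ":" + i), StandardCharsets.UTF_8));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, byte[]> bucket = new HashMap<>();
        String json = "{\"treeMap\":{\"a\":1,\"b\":2}}";
        store(bucket, "myObject", json);
        // round-trip check: fetched JSON equals the stored JSON
        System.out.println(json.equals(fetch(bucket, "myObject")));
    }
}
```

Using an index-plus-count key scheme avoids link walking entirely; the
trade-off is that writing the chunk set is not atomic, so a failed write
can leave partial chunks behind.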

I'm beginning to think that I'll need to remodel my data and use CRDTs
for individual fields such as the `TreeMap`. Would that be a better
way?

Any other recommendations would be much appreciated.

Thanks,
Henning
-- 
My other signature is a regular expression.
http://www.pray4snow.de

More information about the riak-users mailing list