Large buckets with Secondary Index

Yousuf Fauzan yousuffauzan at gmail.com
Wed Aug 22 01:22:51 EDT 2012


Hello,

So I tried populating data on 1.2

The way I store data is: for each new date, I create a bucket A-<date> and store
the main data there as (K1, V1), (K2, V2), ....
Then, in another bucket B, I add links for each of the above keys.
So K1 contains links for (A-<old date1>, K1), (A-<old date2>, K1), ...,
(A-<current date>, K1).
As time passes, the number of links associated with each key in B increases.
However, it will eventually stabilize at a certain number (i.e. it is not
going to grow indefinitely).
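
To make the layout concrete, here is a rough sketch using the Python riak
client; the host/port, bucket names, keys, values, and the 'day' link tag are
placeholders, and the method calls are from memory rather than my exact code:

import riak

client = riak.RiakClient(host='127.0.0.1', port=8098)

# One data bucket per date: A-<date> holds the main (key, value) pairs.
date = '2012-08-22'
data_bucket = client.bucket('A-' + date)
data_bucket.new('K1', {'value': 'V1'}).store()
data_bucket.new('K2', {'value': 'V2'}).store()

# Bucket B then holds, per key, one link back to each dated copy of that
# key, e.g. K1 -> (A-<old date1>, K1), ..., (A-<current date>, K1).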

The decrease in throughput occurs in bucket B.

I think this decrease occurs because I am fetching K1 from B, adding a new
link, and then storing it.
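
Roughly, each append is a read-modify-write like the sketch below (again the
Python client; treat the exact method names such as add_link() as my
assumption about the client API rather than gospel):

import riak

client = riak.RiakClient(host='127.0.0.1', port=8098)
date = '2012-08-22'  # placeholder current date

# Fetch K1 from bucket B, append one more link for today's dated bucket,
# and store the whole object back.
link_bucket = client.bucket('B')
obj = link_bucket.get('K1')
if not obj.exists():                      # first time we see K1 in B
    obj = link_bucket.new('K1', {})
obj.add_link(('A-' + date, 'K1', 'day'))  # link tuple: (bucket, key, tag)
obj.store()

# As the link set on K1 grows, every append re-reads and re-writes a
# progressively larger object, which is where I see the throughput drop.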

Is there a way to append a link to a key in a bucket without fetching it
first? If not, is it possible to add such a feature?

--
Yousuf
http://fauzism.com

On Thu, Aug 2, 2012 at 9:06 PM, Yousuf Fauzan <yousuffauzan at gmail.com> wrote:

> Unfortunately No.
>
> I am still using 1.1.
>
> However, once I push my current implementation out, I will look into 1.2.
> Will update you guys then.
>
> --
> Yousuf
> http://fauzism.com
>
> On Tue, Jul 31, 2012 at 12:25 PM, Matthew Tovbin <matthew at tovbin.com> wrote:
>
>> Yousuf,
>>
>> Thanks for the update! Did you try to reproduce with 1.2.X?
>>
>>
>> -Matthew
>>
>>
>> On Sun, Jul 1, 2012 at 1:08 PM, Mik Quinlan <mik.quinlan at gmail.com>
>> wrote:
>> > Hi, do the LevelDB buffer size settings write_buffer_size_min and
>> > write_buffer_size_max make a difference to point 7?
>> >
>>
>
>