Best practices for updates
dan.langevin at lifebooker.com
Thu Nov 18 16:34:46 EST 2010
We are considering using Riak and I am looking for some advice on best
practices for updating a large number of records. Our problem is that we
have a large number of unique sets of results, each of which represent a
schedule for multiple people on a particular day or set of days.
For example, we would have one result set for Mondays from 11/01 - 11/28,
another for 11/29, and then a third for Mondays from 11/30 onwards if one
person's schedule was edited for 11/29.
So when a person's schedule changes, we need to update the related result
sets. The issue is that a large number of result sets are likely to need
updating for each change to a person's schedule, and I would like to
process them in parallel. From what I have seen, the only way to do this is
to write a MapReduce (or map really, I guess) function in Erlang, though
the forum thread I found seemed to discourage this approach. Is that
correct, or am I missing something? This seems like a relatively common use
case for sharded data, so I am hoping that Riak offers a good solution for
us.
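[For context, one client-side alternative to an Erlang MapReduce job is to fan the updates out from the application itself: fetch each affected result-set key, apply the change, and store it back, using a worker pool for parallelism. A minimal sketch in Python follows; the `fetch`/`apply_change`/`store` callables and the key names are hypothetical placeholders for whatever Riak client calls you use (e.g. get, mutate, put), not Riak APIs themselves:]

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_update(keys, fetch, apply_change, store, workers=8):
    """Fetch, modify, and store each result set concurrently.

    fetch, apply_change, and store are placeholders for your own
    client calls: read the current result set, compute the edited
    version, and write it back.
    """
    def update_one(key):
        value = fetch(key)               # read current result set
        store(key, apply_change(value))  # write back the edited set
        return key

    # Each key is processed independently, so a simple thread pool
    # gives the parallelism without any server-side code.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_one, keys))
```

[The same fan-out shape works with any store; the trade-off versus server-side MapReduce is extra round-trips per key in exchange for not running custom Erlang on the cluster.]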
Thanks in advance,
More information about the riak-users mailing list