duplicate keys in Riak secondary index
chaimpeck at gmail.com
Fri Aug 22 11:51:45 EDT 2014
Could you please explain how that might be?
Just to give some more information: at this point, I am trying to simply purge the bucket and start fresh.
I am using the Python client, basically like this (deleting each key the index stream returns):

    for keys in streaming_bucket.stream_index('$bucket', bucket_name):
        for key in keys:
            streaming_bucket.delete(key)
I have been running this over and over, but the objects still persist in the bucket even hours after running it.
Last night I set "delete_mode" to "immediately" so these are probably not tombstones…
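For reference, this is roughly what that change looks like in the riak_kv section of app.config (a sketch; the value Riak actually expects is the atom immediate, and the default is a 3000 ms tombstone wait — the node needs a restart for the change to take effect):

```erlang
%% app.config -- riak_kv section (sketch of the delete_mode change)
{riak_kv, [
    %% Reap tombstones as soon as the delete is acknowledged,
    %% instead of the default 3-second (3000 ms) wait.
    {delete_mode, immediate}
]}
```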
I noticed that if I set the bucket's n_val to 1, each delete operation returns a 400 error, whereas with the default n_val the deletes report success. In either case, the keys do not seem to be removed from the index.
So, in addition to there being duplicates, it seems that I cannot delete items.
I should note that last night I tried deleting keys from a specific index (not the general "$bucket" index), and that appeared to work.
If anybody has some tips on how to effectively purge the bucket and start over, that would be greatly appreciated. (I cannot delete the data on the file system because we have other buckets that must be preserved.)
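In case it helps clarify what I'm attempting, here is a sketch of the purge loop, assuming the same client API as the snippet above (flatten_pages and purge_bucket are my own hypothetical helpers, not part of the client). Fetching the object before deleting means the client sends its vector clock with the delete, which I understand avoids leaving siblings behind:

```python
def flatten_pages(pages):
    # stream_index yields pages (lists) of keys; collapse them into one
    # ordered list, dropping any duplicates across pages.
    seen = set()
    flat = []
    for page in pages:
        for key in page:
            if key not in seen:
                seen.add(key)
                flat.append(key)
    return flat

def purge_bucket(client, bucket_name):
    # Hypothetical helper: delete every key listed by the $bucket index.
    # Fetch-then-delete so the delete carries the object's vector clock.
    bucket = client.bucket(bucket_name)
    for key in flatten_pages(bucket.stream_index('$bucket', bucket_name)):
        obj = bucket.get(key)
        obj.delete()
```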
On Aug 22, 2014, at 11:39 AM, Alex De la rosa <alex.rosa.box at gmail.com> wrote:
> Might be siblings?
> On Thu, Aug 21, 2014 at 10:29 PM, Chaim Peck <chaimpeck at gmail.com> wrote:
> I am looking for some clues as to why there might be duplicate keys in a Riak Secondary Index. I am using version 1.4.0.
> riak-users mailing list
> riak-users at lists.basho.com