riak locking and out of memory

Ron Yang b0b0b0b at gmail.com
Thu May 26 16:33:23 EDT 2011


Hi, I'm playing around with Riak and have set up a two-node cluster:
one node on a desktop running Ubuntu, and the other on a MacBook Pro
running OS X 10.5.

On the MacBook I looped over .gz files of roughly 400 MB each, using
bash and curl to upload them as documents into a bucket:
    for a in *.gz; do
        curl -v http://127.0.0.1:8098/riak/bub/"$a" --data-binary @"$a"
    done

While this was running I poked at the local Riak server on the
desktop using curl.  Curiously, when I issued this command on node 1:
    curl -v 'http://localhost:8098/riak/bub?keys=true'

it blocked, apparently waiting for the current file to finish
uploading on node 2.  I thought there wasn't any locking?  Was it
waiting for a quorum?
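
If listing keys is simply an expensive operation (I gather it has to
walk every key in the cluster), I suppose I could switch to the
streaming variant, assuming my version supports the keys=stream
parameter:
    # return keys in chunks rather than one giant response
    curl -v 'http://localhost:8098/riak/bub?keys=stream'
Even so, that wouldn't explain why the keys=true request appeared to
block behind the upload running on node 2.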

When I came back from lunch I found that node 2 had died not long
afterwards with:
    binary_alloc: Cannot allocate 438689176 bytes of memory (of type "binary").
(But I do see in the FAQ that 50 MB is the recommended maximum
document size.)
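
If the 50 MB guideline is the real constraint, I could presumably
split each file into smaller chunks before uploading, something along
these lines (just a sketch; the .part- key naming is my own invention
and I'd still need a way to stitch the chunks back together on read):
    # split each archive into 50 MB pieces: foo.gz.part-aa, foo.gz.part-ab, ...
    for a in *.gz; do
        split -b 50m "$a" "$a".part-
    done
    # upload each piece under its own key
    for p in *.part-*; do
        curl -v http://127.0.0.1:8098/riak/bub/"$p" --data-binary @"$p"
    done
Is that the usual approach for large binaries, or is there something
built in that I'm missing?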

Are there any tracing or debugging facilities that I can use to
diagnose latencies or execution plans?
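
For example, is riak-admin status the right place to look?  Assuming
the *_fsm_time_* values it reports are request latency numbers, I
could at least watch aggregate get/put timings with something like:
    # dump the request timing stats from the local node
    riak-admin status | grep fsm_time
but that wouldn't show me what an individual request is doing.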

Thanks.



