Error while importing data
sharmanitishdutt at gmail.com
Sat Nov 19 07:22:10 EST 2011
To give my Riak setup a good stress testing, I decided to import a large
dataset (around 160 million records). But before importing the whole
thing, I tested the import Python script (using protocol buffers) with
1 million records, which succeeded at ~2200 writes/sec. The script
essentially puts the data into a queue, and a couple of threads get
records from the queue and store them in Riak.
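For context, the producer/consumer pattern described here might look roughly like the sketch below. The actual script is not shown; `store_fn` is a stand-in for whatever call the Riak client makes, and the queue size, worker count, and sentinel scheme are all assumptions:

```python
import queue
import threading

def load_records(records, store_fn, num_workers=2):
    """Producer/consumer loader: the main thread puts records on a
    bounded queue, and worker threads pull them off and store each one
    via store_fn (a placeholder for the real Riak client call)."""
    q = queue.Queue(maxsize=1000)  # bounded, so the producer can't outrun the workers

    def worker():
        while True:
            record = q.get()
            if record is None:      # sentinel: no more work for this thread
                q.task_done()
                break
            store_fn(record)
            q.task_done()

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for record in records:
        q.put(record)               # blocks when the queue is full
    for _ in threads:
        q.put(None)                 # one sentinel per worker
    for t in threads:
        t.join()
```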
When started with the full dataset, after storing several million objects, I
get thread exceptions with timeout errors.
Following is the traceback:
  File "/usr/lib/python2.7/threading.py", line 552, in __bootstrap_inner
  File "/usr/lib/python2.7/threading.py", line 505, in run
  File "python_load_data.py", line 23, in worker
  line 296, in store
    Result = t.put(self, w, dw, return_body)
  line 188, in put
    msg_code, resp = self.recv_msg()
  line 370, in recv_msg
The cluster consists of 3 nodes (Ubuntu 10.04). The nodes have enough disk
space; the number of file handles used (~2500) is also within the limit
(32768); the number of concurrent ports is 32768. I can't figure out what
else could be causing the exceptions.
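One common workaround for transient timeouts under sustained load is to wrap each store call in a retry with exponential backoff, so a slow node doesn't kill the worker thread outright. A minimal sketch, assuming `store_fn` stands in for the real client call (the broad `except` is only for illustration; the real code should catch the specific timeout exception the client raises):

```python
import time

def store_with_retry(store_fn, record, max_attempts=5, base_delay=0.5):
    """Retry a store call on failure with exponential backoff.
    store_fn and the caught exception type are placeholders for the
    actual Riak client call and its timeout error."""
    for attempt in range(max_attempts):
        try:
            return store_fn(record)
        except Exception:
            if attempt == max_attempts - 1:
                raise              # give up after the last attempt
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
```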