I have a producer-consumer application. It is an HP Vertica UDTF in which I read data from the database and pass it to a Dispatcher for the next stage of processing. While passing the data, each batch needs to be copied into a new final list so it can be handed to the inner class and put on the queue.
Pseudo-code snippet:
do {
    // build model object t from the current row
    list.add(t);
    if (list.size() == BATCH_SIZE) {
        // copy the list into a new final ArrayList for the inner class
        final List<T> batch = new ArrayList<>(list);
        dispatcher.submit(new BatchProcessor() {
            public List<T> getBatch() { return batch; }
        });
        list.clear();
    }
} while (hasNextRow()); // a few million records in total
In the above snippet a race condition gets created, and I get the exception:
java.lang.OutOfMemoryError: GC overhead limit exceeded
Solutions tried:
Can you please suggest a solution for this?
Thanks!
EDIT
Some more info regarding the environment:
I do not have any custom JVM settings, but I guess I need some; I am not sure which ones to try. The main issue, I feel, is the copying of the data when passing it to the inner class. Can we avoid it? If yes, how?
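One way to avoid the copy is to hand ownership of the batch list to the task and allocate a fresh list for the next batch, instead of copying the shared list each time. Below is a minimal, self-contained sketch of that idea; the names (`BATCH_SIZE`, the integer "rows", the result list) are placeholders standing in for the real UDTF row model and dispatcher:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class BatchHandoff {
    static final int BATCH_SIZE = 3; // placeholder batch size

    public static void main(String[] args) throws Exception {
        ExecutorService dispatcher = Executors.newFixedThreadPool(2);
        List<Integer> processed = new CopyOnWriteArrayList<>(); // collect results for the demo

        List<Integer> batch = new ArrayList<>(BATCH_SIZE);
        for (int row = 0; row < 10; row++) {            // stand-in for the DB cursor
            batch.add(row);
            if (batch.size() == BATCH_SIZE) {
                final List<Integer> handoff = batch;    // hand over ownership, no copy
                dispatcher.submit(() -> processed.addAll(handoff));
                batch = new ArrayList<>(BATCH_SIZE);    // fresh list for the next batch
            }
        }
        if (!batch.isEmpty()) {                         // flush the final partial batch
            final List<Integer> handoff = batch;
            dispatcher.submit(() -> processed.addAll(handoff));
        }

        dispatcher.shutdown();
        dispatcher.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println(processed.size());
    }
}
```

Since the producer never touches a list after submitting it, no defensive copy is needed; the `new ArrayList<>(BATCH_SIZE)` allocation replaces the `new ArrayList<>(list)` copy.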
The problem described above was solved by limiting the size of the blocking queue.
The data submitted to the queue was definitely large, but in such a case the queue capacity should be small, roughly equal to the number of worker threads. This way we limit the number of idle Java objects: objects sitting in the queue are in a pipeline stage and do not participate in active processing.
I think the main reason this works is that the blocking queue's put method internally handles the wait/notify protocol, which pauses the producer and stops the creation of new objects.
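The backpressure described above can be sketched with `java.util.concurrent.ArrayBlockingQueue`: `put` blocks once the bounded queue is full, so the producer stalls until a consumer drains a slot. The capacity of 2, the sleep, and the `int[]` "batches" are illustrative placeholders, not values from the original application:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BoundedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // capacity ~ number of worker threads, so at most that many batches sit idle
        BlockingQueue<int[]> queue = new ArrayBlockingQueue<>(2);

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    int[] batch = queue.take();   // waits while the queue is empty
                    Thread.sleep(50);             // simulate slow batch processing
                    System.out.println("processed batch of " + batch.length);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        for (int i = 0; i < 5; i++) {
            // blocks once 2 batches are pending, throttling the producer
            queue.put(new int[1000]);
        }

        consumer.join();
        System.out.println("done");
    }
}
```

Because the producer can never get more than the queue capacity ahead of the consumers, heap usage stays bounded regardless of how many million rows flow through.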