I want to implement the producer-consumer pattern with a pool of bacon.js event streams. Here's the specific problem I'm trying to solve:
I have a list of 'n' urls. I want to create event streams that make HTTP requests for those urls, but I want to limit this to 'x' streams ('x' network requests) at a time. In the event handler for those streams, I create a new event stream that writes the HTTP response to a file, but I want to limit the number of streams writing to files to 'y' at a time.
In Gevent/Java, I'd create thread pools of the appropriate sizes and use threads from the appropriate pool. How do I do something similar with event streams?
Using flatMapWithConcurrencyLimit, you can control the number of spawned streams:
function fetchUsingHttp(url) { .. } // <- returns EventStream of http result
function writeToFile(data) { .. } // <- returns EventStream of file write result
var urls; // <- EventStream of urls
var maxRequests, maxWrites; // <- maximum concurrency limits
var httpResults = urls.flatMapWithConcurrencyLimit(maxRequests, fetchUsingHttp);
var fileWriteResults = httpResults.flatMapWithConcurrencyLimit(maxWrites, writeToFile);
fileWriteResults.log();
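For completeness, here's a minimal sketch of what those two helpers might look like. It assumes Node.js with a global fetch (Node 18+) and the built-in fs module; the url-derived file name and the sample urls are purely illustrative, so adapt them to your setup:

var Bacon = require('baconjs');
var fs = require('fs');

// Fetch a url and expose the response body as a single-event stream.
// The stream emits one { url, body } value and then ends; a rejected
// promise becomes a Bacon Error event.
function fetchUsingHttp(url) {
  return Bacon.fromPromise(
    fetch(url).then(function (res) {
      return res.text().then(function (body) {
        return { url: url, body: body };
      });
    })
  );
}

// Write the response body to a file and emit the chosen path on success.
function writeToFile(data) {
  var path = encodeURIComponent(data.url) + '.html'; // illustrative naming scheme
  return Bacon.fromNodeCallback(fs.writeFile, path, data.body)
    .map(function () { return path; });
}

var urls = Bacon.fromArray(['http://example.com/a', 'http://example.com/b']);
var maxRequests = 2, maxWrites = 1;

var httpResults = urls.flatMapWithConcurrencyLimit(maxRequests, fetchUsingHttp);
var fileWriteResults = httpResults.flatMapWithConcurrencyLimit(maxWrites, writeToFile);
fileWriteResults.log();

Failures from either step travel through the pipeline as Bacon Error events, so you can observe them with onError without disturbing the concurrency limits.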