Looks like ExecutorService internally uses a blocking queue.
I am assigning only 2 threads to the ExecutorService like so:
ExecutorService executor = Executors.newFixedThreadPool(2);
What happens if I keep submitting tasks endlessly? Will the blocking queue stop accepting new tasks once it is full? Will memory usage keep growing because of all the submitted tasks? What are the implications? I'm asking because I see this happening in my code and want to understand the side effects.
Future<?> future = executor.submit(callableTask);
You can see the implementation of the ExecutorService created by the static factory method Executors.newFixedThreadPool():
public static ExecutorService newFixedThreadPool(int nThreads) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                  0L, TimeUnit.MILLISECONDS,
                                  new LinkedBlockingQueue<Runnable>());
}
and the corresponding constructors for ThreadPoolExecutor and LinkedBlockingQueue are below:
public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue) {
    this(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue,
         Executors.defaultThreadFactory(), defaultHandler);
}
public LinkedBlockingQueue() {
    this(Integer.MAX_VALUE);
}
Note that the no-argument LinkedBlockingQueue constructor sets the capacity to Integer.MAX_VALUE, so the work queue is effectively unbounded. Consequently, if you keep submitting tasks faster than the two threads can process them (for example, because each task takes a significant amount of time), the pool never blocks or rejects a submission; the pending tasks simply accumulate on the heap, and this can eventually lead to an OutOfMemoryError (OOM).
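To illustrate, here is a minimal sketch (the task body, sleep duration, and task count are made up purely for demonstration) showing that submit() on the default fixed pool returns immediately every time and never applies back-pressure; the backlog just piles up in the unbounded queue:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;

public class UnboundedQueueDemo {
    public static void main(String[] args) {
        ExecutorService executor = Executors.newFixedThreadPool(2);

        // Submit far more tasks than two threads can drain.
        // submit() never blocks and never rejects here; each pending task
        // just sits in the unbounded LinkedBlockingQueue and occupies heap.
        for (int i = 0; i < 1_000_000; i++) {
            executor.submit(() -> {
                Thread.sleep(100); // simulate slow work
                return null;
            });
        }

        // The backlog keeps growing for as long as you keep submitting.
        System.out.println("Queued tasks: "
                + ((ThreadPoolExecutor) executor).getQueue().size());

        executor.shutdownNow();
    }
}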
However, without knowing your specific situation, I would not recommend replacing it with Executors.newCachedThreadPool(), since that pool keeps creating new threads whenever all existing ones are busy. Instead, I would advise you to create your own ThreadPoolExecutor with a bounded queue and an explicit rejection policy, tailored to your needs.
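For example, here is one possible sketch (the queue capacity of 1000 and the CallerRunsPolicy are illustrative choices, not a recommendation for your exact workload):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Same 2 threads as in your example, but with a bounded queue and an
// explicit rejection policy so the backlog cannot grow without limit.
ThreadPoolExecutor executor = new ThreadPoolExecutor(
        2,                                  // corePoolSize
        2,                                  // maximumPoolSize
        0L, TimeUnit.MILLISECONDS,          // keepAliveTime (unused when core == max)
        new ArrayBlockingQueue<>(1000),     // bounded work queue (illustrative size)
        new ThreadPoolExecutor.CallerRunsPolicy()); // run on the caller when full

With CallerRunsPolicy, once the queue is full the submitting thread executes the task itself, which acts as natural back-pressure on the producer; alternatives include AbortPolicy (throws RejectedExecutionException), DiscardPolicy, and DiscardOldestPolicy.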