Tags: playframework-2.0, akka, futures, spray, executioncontext

Does a Scala Future block anyway for a long operation?


Everywhere we read that, when performing a long-running or blocking operation (accessing a database, for instance), it is preferable to use a dedicated ExecutionContext. I understand why: it is to avoid thread starvation. We do not want the handful of available threads (say 8, one per core) all busy with blocking code that may eventually return or may keep blocking, which either seriously slows the application down or blocks it indefinitely.
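For concreteness, here is a minimal sketch (my own, not taken from Play or Spray) of what such a dedicated ExecutionContext for blocking work can look like; the pool size and the loadUser helper are made up for the example:

import java.util.concurrent.Executors
import scala.concurrent.{ExecutionContext, Future}

// A fixed pool reserved for blocking work, separate from the default pool
val blockingEc: ExecutionContext =
  ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(16))

// Hypothetical blocking database access wrapped in a Future on that pool
def loadUser(id: Long): Future[String] =
  Future {
    Thread.sleep(100) // stand-in for a blocking JDBC call
    s"user-$id"
  }(blockingEc)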

Meanwhile, I am wondering how things like Spray or Play are implemented. Take the client side: when a request is sent, we get a future response, i.e. the request is executed asynchronously, and it may well turn out to be a long-running operation. Yet nothing says that launching many requests could lead to thread starvation in that case, so I am wondering why it is not a problem there. Do they have a special thread pool?

I read in the book "Learning Concurrent Programming in Scala" that wrapping code in a `blocking {}` block inside a Future helps the scheduler spawn more threads automatically. Could that be how they handle it?
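The construct the book refers to is scala.concurrent.blocking. A minimal sketch of how it is used on the global ExecutionContext (the sleep just stands in for a blocking call):

import scala.concurrent.{Future, blocking}
import scala.concurrent.ExecutionContext.Implicits.global

val result: Future[String] = Future {
  blocking {
    // marks the enclosing section as blocking, so the global fork-join pool
    // may spawn a compensation thread instead of losing a worker
    Thread.sleep(1000)
    "done"
  }
}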

The same question applies to receiving requests: in Play we get to execute an async action. If one wants to access a database from that action, should one use a `blocking {}` block, or should the action be executed on a special thread pool / ExecutionContext?
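As an illustration only (not Play's documented recipe), a rough sketch of a Play 2.2+ style controller where the blocking query runs on its own pool; dbContext and findUser are hypothetical:

import java.util.concurrent.Executors
import play.api.mvc._
import scala.concurrent.{ExecutionContext, Future}

object UsersController extends Controller {

  // Hypothetical dedicated pool for blocking database work
  val dbContext: ExecutionContext =
    ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(10))

  // Hypothetical blocking query, standing in for real JDBC access
  private def findUser(id: Long): String = s"user-$id"

  def show(id: Long) = Action.async {
    // The blocking query runs on dbContext; Play's default pool is not tied up
    Future(findUser(id))(dbContext).map(user => Ok(user))(dbContext)
  }
}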

My assumption here is that they rely on the global ExecutionContext (`ExecutionContext.Implicits.global`), but maybe I am wrong. The bottom line is: making a request is a long operation by default, so how does using Spray, for instance, in your code avoid creating thread starvation?
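For reference, this is roughly what the Spray client side looks like (spray 1.x); the request goes through spray-can's non-blocking (NIO-based) connection layer, so no thread sits waiting on the socket while the response is outstanding. The host is of course just an example:

import akka.actor.ActorSystem
import spray.client.pipelining._
import spray.http.{HttpRequest, HttpResponse}
import scala.concurrent.Future

implicit val system = ActorSystem("client")
import system.dispatcher // the ActorSystem's dispatcher as the ExecutionContext

// sendReceive builds an asynchronous pipeline on top of spray-can
val pipeline: HttpRequest => Future[HttpResponse] = sendReceive

val response: Future[HttpResponse] = pipeline(Get("http://example.com/"))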

Are we supposed to use different ExecutionContexts?
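One common approach in Akka-based code is indeed to send blocking work to a dispatcher configured in application.conf and look it up as an ExecutionContext. The dispatcher name and pool size here are hypothetical:

import akka.actor.ActorSystem
import scala.concurrent.ExecutionContext

val system = ActorSystem("app")

// Hypothetical dispatcher; it would be defined in application.conf, e.g.
//   blocking-io-dispatcher {
//     type = Dispatcher
//     executor = "thread-pool-executor"
//     thread-pool-executor { fixed-pool-size = 16 }
//   }
val blockingEc: ExecutionContext =
  system.dispatchers.lookup("blocking-io-dispatcher")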

Edit: I just spotted this short presentation, Don't Block - How to Mess Up Akka and Spray, which happens to illustrate the problem I came here with rather well.

In any case, I would appreciate other opinions.

Edit: Here is what I learned about what happens when using a Future:

def apply[T](body: =>T): Future[T] = impl.Future(body)  // here I have omitted the implicit ExecutionContext

where impl.Future is the implementation of the Future trait:

def apply[T](body: =>T)(implicit executor: ExecutionContext): scala.concurrent.Future[T] =
{
  val runnable = new PromiseCompletingRunnable(body)
  executor.prepare.execute(runnable)
  runnable.promise.future
}

Where PromiseCompletingRunnable looks like this:

class PromiseCompletingRunnable[T](body: => T) extends Runnable {
  val promise = new Promise.DefaultPromise[T]()

  override def run() = {
    promise complete {
      try Success(body) catch { case NonFatal(e) => Failure(e) }
    }
  }
}

Taken from: Clarification needed about futures and promises in Scala. I read something simpler but similar in the book "Learning Concurrent Programming in Scala".

To me this means: a thread in a thread pool dequeues that task and tries to complete the promise's future with the result of executing the task's body. If that is correct, I do not see how that task making an I/O call avoids blocking that thread.
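That reading is correct: the body runs on a pool thread, and if the body blocks, that thread is blocked for the duration. A small experiment (my own sketch; the exact numbers depend on your core count) makes it visible: without blocking {}, 32 one-second sleeps on the global pool take several seconds because only roughly numCores tasks run at a time; wrapping the sleep in blocking {} lets the pool compensate so they overlap:

import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

val started = System.nanoTime()

// 32 tasks that each block for one second
val tasks = (1 to 32).map { i =>
  Future {
    Thread.sleep(1000) // try wrapping this in blocking { ... } and compare
    i
  }
}

Await.result(Future.sequence(tasks), 2.minutes)
println(s"took ${(System.nanoTime() - started) / 1e9} s")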


Solution

  • Taking I/O operations as an example, I think the only missing link is I/O multiplexing, which can be implemented with epoll or kqueue.

    By using epoll/kqueue, one thread can wait on many I/O events simultaneously. That thread sits idle if none of the I/O operations has responded yet, but note that only this single thread is waiting (see the sketch after this answer).

    Both nginx and Node.js work in this mode.
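To make the multiplexing point concrete, here is a rough sketch using Java NIO's Selector from Scala (on Linux it is backed by epoll, on BSD/macOS by kqueue); the hosts are just examples and error handling is omitted:

import java.net.InetSocketAddress
import java.nio.channels.{SelectionKey, Selector, SocketChannel}
import scala.collection.JavaConverters._

val selector = Selector.open()

// Register several non-blocking connection attempts with one selector
Seq("example.com", "example.org").foreach { host =>
  val ch = SocketChannel.open()
  ch.configureBlocking(false)
  ch.connect(new InetSocketAddress(host, 80))
  ch.register(selector, SelectionKey.OP_CONNECT)
}

// A single thread parks here until *any* registered channel has an event;
// no thread is burned per connection
while (selector.keys().asScala.exists(_.isValid)) {
  selector.select()
  val ready = selector.selectedKeys()
  ready.asScala.foreach { key =>
    if (key.isConnectable) {
      val ch = key.channel().asInstanceOf[SocketChannel]
      ch.finishConnect()
      println(s"connected: ${ch.getRemoteAddress}")
      key.cancel()
      ch.close()
    }
  }
  ready.clear()
}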