Tags: http, parallel-processing, httpresponse, httpserver, pipelining

HTTP 1.1 pipelining and processing order


I am currently implementing HTTP pipelining in my HTTP 1.1 server. The response order must correspond to the order in which the requests are received. But can/should the PROCESSING of the individual requests be parallel, or should it also follow the order in which the requests were received? I'm thinking of interdependent requests within the pipeline, where execution order would matter. What is the most sensible approach/best practice here? From the server's point of view, parallel processing with serialization of responses according to the request sequence is the best solution for performance. But this could lead to unexpected behavior if the client sends interdependent requests one after the other.
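The parallel-processing-with-serialized-responses idea from the question can be sketched roughly as below. This is a minimal illustration, not the asker's actual server: the `handle_request`, `serve_connection`, and `send` names are made up for the example, and real request parsing and socket I/O are omitted. The key point is that processing may overlap, but responses are written back in submission (request) order.

```python
# Sketch: process pipelined requests concurrently, but emit responses
# in request order. Names here are illustrative, not from the post.
from concurrent.futures import ThreadPoolExecutor

def handle_request(request):
    # Placeholder for the real handler; returns a canned response.
    return b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n"

def serve_connection(requests, send):
    # `requests`: parsed pipelined requests from one connection.
    # `send`: callable that writes bytes back to the client.
    with ThreadPoolExecutor(max_workers=4) as pool:
        # Submit everything immediately so handlers can run in parallel.
        futures = [pool.submit(handle_request, r) for r in requests]
        for f in futures:       # iterate in submission (request) order
            send(f.result())    # block until that response is ready

out = []
serve_connection(["GET /a", "GET /b", "GET /c"], out.append)
```

Iterating the futures list in order is what serializes the responses: even if request 3 finishes first, its response is not sent until responses 1 and 2 have gone out.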


Solution

  • I think the ordering/consistency concerns are exclusively in the application domain, which makes this mainly a matter of documenting that design choice.

    In my experience, the best way to handle interfaces that allow "surprise race conditions" for performance is to make that behavior explicitly clear in the documentation.

    The best way to enable your users to Do The Right Thing is usually to give them an explicit way to control execution order in case it matters.

    Advice?

    I think it's simpler to follow the principle of least surprise:

    • always process requests in order of arrival per connection

    That way, users can always achieve parallelization by opening independent connections, and unrelated sessions will already be parallel by default.
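The in-order-per-connection advice can be sketched as follows. This is a toy example with made-up names (`serve_connection`, `handler`, the `SET`/`GET` mini-protocol), not real HTTP handling; it only shows why strict arrival order makes dependent requests behave predictably.

```python
# Sketch: handle each request on a connection fully before starting
# the next, so per-connection arrival order is preserved.
def serve_connection(requests, send, handler):
    for request in requests:     # arrival order on this connection
        send(handler(request))   # finish one before starting the next

# A toy handler where ordering matters: a write followed by a read.
store = {}
def handler(req):
    parts = req.split()
    if parts[0] == "SET":
        store[parts[1]] = parts[2]
        return "OK"
    return store.get(parts[1], "NOT_FOUND")

out = []
serve_connection(["SET x 1", "GET x"], out.append, handler)
# out == ["OK", "1"] — the read sees the preceding write
```

With parallel processing and no ordering guarantee, the `GET` could race the `SET` and return `NOT_FOUND`; the sequential loop rules that out on a single connection, while independent connections remain free to run in parallel.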

    If independent users may create conflicting updates, you need atomic transactions.