I have a self-hosted WCF Data Service (OData) that I'm developing. While testing, I noticed that most client applications I'm using (Excel, browsers, etc.) time out when pulling a particular query from my service. The query returns about 140k records, and the applications simply hang or crash before the long-running request completes.
Right now, the only workaround is client-side paging, but if I can simply increase the limit, I would be most grateful for the answer.
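For reference, the service is set up roughly like this — the service and entity-container names below are placeholders, not my actual model:

```csharp
// Sketch of a self-hosted WCF Data Service. "CatalogService" and
// "CatalogEntities" are placeholder names, not the real model.
using System;
using System.Data.Services;
using System.Data.Services.Common;

public class CatalogService : DataService<CatalogEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Expose all entity sets read-only.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);

        // With no page size configured, every query serializes the full
        // result set (here ~140k rows) in a single response.
        // Server-driven paging would instead be enabled with:
        // config.SetEntitySetPageSize("*", 1000);

        config.DataServiceBehavior.MaxProtocolVersion =
            DataServiceProtocolVersion.V3;
    }
}

class Program
{
    static void Main()
    {
        using (var host = new DataServiceHost(
            typeof(CatalogService),
            new[] { new Uri("http://localhost:8080/odata/") }))
        {
            host.Open();
            Console.WriteLine("Service running; press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```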
Note that my entity model is mapped to database views rather than actual tables, in case that is related to the issue.
Cheers!
Do you really need to transfer such a large amount of data?
OData is not a protocol intended for data replication.
The main advantage of OData is the ability to query, and thus limit, the amount of data transferred.
In an application that handles a lot of data, a common approach is to present aggregations first and then refine the query (for example, based on successive choices made by the user).
The AdaptiveLINQ component I developed can help you implement this type of service. It is based on the notion of a cube: dimensions and measures are defined as C# expressions.
For example, one could imagine a service for browsing a product catalog (containing lots of data) as follows:
The list of product categories, with the number of available products in each:
http://.../catalogService?$select=Category,ItemQuantity
The list of available colors in the "shirt" category:
http://.../catalogService?$select=Color,ItemQuantity&$filter=Category eq 'shirt'
The list of green shirts:
http://.../catalogService?$select=ProductLabel,ProductID&$filter=Category eq 'shirt' and Color eq 'green'
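Behind such a service, each of these queries is essentially a grouped aggregation. As a rough illustration — this is plain LINQ over a hypothetical `IQueryable<Product>`, not AdaptiveLINQ's actual API — the first two queries correspond to:

```csharp
// Plain-LINQ equivalents of the example queries, over a hypothetical
// IQueryable<Product> named "products" (not AdaptiveLINQ's actual API).

// $select=Category,ItemQuantity
var byCategory = products
    .GroupBy(p => p.Category)
    .Select(g => new { Category = g.Key, ItemQuantity = g.Count() });

// $select=Color,ItemQuantity&$filter=Category eq 'shirt'
var shirtColors = products
    .Where(p => p.Category == "shirt")
    .GroupBy(p => p.Color)
    .Select(g => new { Color = g.Key, ItemQuantity = g.Count() });
```

The point is that only the aggregated rows (a handful of categories or colors, each with a count) cross the wire, rather than the 140k underlying records.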