We use SolrNet in our ASP.NET Core backend. In Startup.cs it is initialized like this:
public void ConfigureServices(IServiceCollection services)
{
    services.AddSolrNet<SolrPunt>(solrEndpointAddressForPunt, setup =>
    {
        setup.HttpClient.DefaultRequestHeaders.Authorization =
            new System.Net.Http.Headers.AuthenticationHeaderValue("Basic", credentialsBase64);
    });
}
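(For completeness: credentialsBase64 is just the usual Basic auth token. A minimal sketch of how it is built, with solrUser and solrPassword as placeholder names for wherever the credentials come from:)
// Standard Basic auth token: Base64 of "user:password".
// solrUser / solrPassword are placeholders for however the credentials are stored.
var credentialsBase64 = System.Convert.ToBase64String(
    System.Text.Encoding.UTF8.GetBytes($"{solrUser}:{solrPassword}"));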
The search code itself is quite simple. We inject an instance of ISolrOperations<SolrPunt> and use it accordingly:
var queryOptions = new QueryOptions
{
    FilterQueries = new List<ISolrQuery>()
};
// FilterQueries are added here
var solrResponse = await solrOperationsVoorPunten.QueryAsync("*:*", queryOptions);
This all works fine and we receive results correctly.
However, if we add a lot of filter queries (more than about 105), it throws an exception:
Data at the root level is invalid. Line 1, position 1
I'm wondering why this happens and how it can be fixed. It does not seem to be an XML/JSON parsing setting, because everything works perfectly with a small number of filter queries. My guess is that we are hitting the length limit of a GET request, but I read that SolrNet should normally switch to a POST automatically in that case. Can I tell SolrNet to always use POST, and if so, how do I do that in ASP.NET Core? Or is there another limitation causing this?
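For what it's worth, classic SolrNet apparently lets you force POST queries by wrapping the connection in a PostSolrConnection. Below is a minimal sketch of how I imagine that could be wired into ConfigureServices; it assumes the ISolrConnection registration made by AddSolrNet can simply be replaced (I have not verified this), and the Basic auth header would still have to be re-applied on the inner connection:
// Sketch only: wrap the plain connection in PostSolrConnection so queries are
// sent as POST requests. SolrConnection and PostSolrConnection live in the
// SolrNet.Impl namespace; Replace/ServiceDescriptor require
// using Microsoft.Extensions.DependencyInjection.Extensions.
services.AddSolrNet<SolrPunt>(solrEndpointAddressForPunt);
services.Replace(ServiceDescriptor.Singleton<ISolrConnection>(
    new SolrNet.Impl.PostSolrConnection(
        new SolrNet.Impl.SolrConnection(solrEndpointAddressForPunt),
        solrEndpointAddressForPunt)));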
Thanks for any advice!
I think I found it. The original code combined the field queries in a loop with SolrNet's || operator:
solrQueryByField = solrQueryByField || new SolrQueryByField(filterVeld, filterValue);
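Spelled out, the old loop looked roughly like this (a sketch with placeholder names; filters stands for whatever collection holds the field/value pairs):
// Old approach (sketch): OR every field/value pair into one combined query via
// SolrNet's || operator overload. 'filters' is a placeholder
// IEnumerable<KeyValuePair<string, string>>; AbstractSolrQuery is in the SolrNet namespace.
AbstractSolrQuery solrQueryByField = null;
foreach (var filter in filters)
{
    var byField = new SolrQueryByField(filter.Key, filter.Value);
    solrQueryByField = solrQueryByField == null ? byField : solrQueryByField || byField;
}
// ... the combined query was then added to queryOptions.FilterQueries.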
Instead, I now simply use SolrQueryInList, which combines all values for a field into a single filter query:
queryOptions.FilterQueries.Add(new SolrQueryInList(filterVeld, filterValues));
With this change, it started working, presumably because the resulting request now stays well within the GET URL length limit.
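In full, the working version looks roughly like this (again a sketch; filtersPerVeld is a placeholder for a Dictionary<string, List<string>> with the values grouped per field):
// Working approach (sketch): one SolrQueryInList per field, so each field
// contributes a single fq parameter instead of one per value.
foreach (var filter in filtersPerVeld)
{
    queryOptions.FilterQueries.Add(new SolrQueryInList(filter.Key, filter.Value));
}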