I regularly (but not always) get the error
There is already an open DataReader associated with this Command which must be closed first
when running the query shown in GetRecentClients below. After looking it up, I found that it supposedly has something to do with two connections being open at the same time, and that turning MARS on should help, but I would prefer not to do that.
Is there any other way around it, and where exactly does this second connection come from? There are no Includes or anything like that going on; Meeting is a pretty straightforward entity with only basic-type properties.
public static async Task<RecentClientsModel> GetRecentClients(int managerId, IUnitOfWork unitOfWork)
{
    // Latest "Viewed" meeting per client for this manager, newest 10 clients first
    var recentViews = (await unitOfWork.GetRepository<Meeting>().Get(
        source => source
            .Where(a => a.Type == StatusType.Viewed && a.ManagerId == managerId)
            .GroupBy(c => c.ClientId)
            .Select(gr => gr.OrderByDescending(g => g.Date).FirstOrDefault())
            .OrderByDescending(a => a.Date)
            .Take(10))).ToArray();
    //...
}
public class Meeting
{
    public int Id { get; set; }
    public DateTime Date { get; set; }
    public StatusType Type { get; set; }
    public int? ClientId { get; set; }
    public int? ManagerId { get; set; }
}
public async Task<IEnumerable<T>> Get(Func<IQueryable<T>, IQueryable<T>> queryBuilder)
{
    // Applies the caller-supplied query shaping and materializes the result in one call
    return await queryBuilder(_context.Set<T>()).ToListAsync();
}
Found this answer: Entity Framework: There is already an open DataReader associated with this Command
It says the error is typically triggered by nested queries or lazy loading, and it sounds like the only ways to fix it are to avoid lazy loading or to enable MARS (Multiple Active Result Sets).
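For reference, if you do end up enabling MARS anyway, it is just a flag on the SQL Server connection string (the server and database names below are placeholders, not your actual settings):

"Server=myServer;Database=myDb;Trusted_Connection=True;MultipleActiveResultSets=True"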
In your case, it's probably not an actual second connection, but another result set being opened on the same connection. I also wonder if it could be due to the ToListAsync() call.
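To illustrate what I mean by "another result set on the same connection", here is a minimal ADO.NET sketch of how this exact exception arises when MARS is off. The connection string, table and column names are made up for the example; this shows the underlying mechanism, not your code:

using System.Data.SqlClient;

class OpenReaderDemo
{
    static void Main()
    {
        // Placeholder connection string -- note that MultipleActiveResultSets=True is NOT set
        var connectionString = "Server=myServer;Database=myDb;Trusted_Connection=True";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            var cmd1 = new SqlCommand("SELECT Id FROM Meetings", conn);
            using (var reader1 = cmd1.ExecuteReader())   // first result set is open on conn
            {
                var cmd2 = new SqlCommand("SELECT Id FROM Clients", conn);
                // Opening a second reader on the same connection while the first is still open
                // throws InvalidOperationException: "There is already an open DataReader
                // associated with this Command which must be closed first."
                using (var reader2 = cmd2.ExecuteReader()) { }
            }
        }
    }
}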
Actually, now that I look at it again, it could be the gr.OrderByDescending(g => g.Date).FirstOrDefault() inside the Select: that per-group subquery may end up being enumerated while the outer result set is still open. Maybe try rewriting the query (even if the results aren't quite right) to avoid that shape, as sketched below, and see if the problem goes away.
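Here is one possible rewrite to test with. It is only a sketch: it projects to the client id and the latest viewed date instead of full Meeting entities (so the consuming code would need adjusting), and it assumes you can query the DbContext directly rather than going through the Get wrapper. The point is that an aggregate like Max generally translates to a single result set, without the per-group FirstOrDefault subquery:

// Sketch: same filter and grouping, but project to ClientId + latest date
// instead of picking a whole Meeting per group.
var recentViews = await _context.Set<Meeting>()
    .Where(a => a.Type == StatusType.Viewed && a.ManagerId == managerId)
    .GroupBy(c => c.ClientId)
    .Select(gr => new { ClientId = gr.Key, LastViewed = gr.Max(g => g.Date) })
    .OrderByDescending(x => x.LastViewed)
    .Take(10)
    .ToListAsync();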