Tags: c#, entity-framework-core, asp.net-core-webapi, automapper, ef-core-7.0

How do I make an EF Core 7 query with multiple navigations/levels efficient when AutoMapping to a DTO?


I'm using EF Core 7 for the first time, in the context of an endpoint in an ASP.NET Core Web API project.

I have a table of Venues and I am attempting to:

  • Query them
  • Pipe them through AutoMapper to a Dto/public model
  • Return them (serialized by framework)

The problem is that this is extremely slow when .UseLazyLoadingProxies() is used, presumably because AutoMapper accesses each property of each Venue in the IQueryable, and each access fires an individual navigation query.

I've tried to omit this and instead add .Entity<Venue>().Navigation(x => x.Location).AutoInclude(); to the configuration. However, Location is not the only navigation property, and I'd have to do this for all of them. That's also not particularly maintainable: it would be too easy to miss this configuration when adding a new navigation property, and the performance impact might not be noticed easily.

Is there a way to front-load, in a maintainable manner, all of a model's navigation properties (and their navigation properties) for a query I know will need them?

Query site:

using var db = new VenuesContext(Fetch.Lazy);
var query = db.Venues.AsQueryable();

if (manager != null)
    query = query.Where(v => v.Managers.Contains(manager));

if (dataCenter != null)
    query = query.Where(v => v.Location.DataCenter.ToLower() == dataCenter.ToLower());

if (world != null)
    query = query.Where(v => v.Location.World.ToLower() == world.ToLower());

// AsEnumerable() switches to client-side evaluation here; everything below runs in memory
var complexFiltered = query.AsEnumerable().Where(c => /* alt source ACLs filter */);
var mapped = complexFiltered.Select(this._modelMapper.Map<Venue>);

return mapped;

Venue model:

[Table("Venues", Schema = nameof(Entities.Venues))]
public class Venue : IEntity, ISecurityScoped
{
    [Key] 
    public string Id { get; init; }
    public string Name { get; set; }
    public DateTimeOffset Added { get; set; }

    public virtual Location Location { get; set; } = new ();
    public Uri Website { get; set; }

    [DeleteBehavior(DeleteBehavior.Cascade)] 
    public virtual List<Opening> Openings { get; set; } = new ();
    public virtual List<string> Managers { get; set; } = new ();    
}

My DbContext:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.HasDefaultSchema("Venues");
    modelBuilder.Entity<Venue>().Navigation(x => x.Location).AutoInclude();
    base.OnModelCreating(modelBuilder);
}

protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder.UseNpgsql("Meow");
    if (_fetchKind == Fetch.Lazy)
        optionsBuilder.UseLazyLoadingProxies();
}
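
For illustration, the "configure every navigation" approach described above could be written as a loop over the model metadata instead of one line per property. This is only a sketch: SetIsEagerLoaded is assumed here to be the metadata counterpart of the AutoInclude() builder call, so verify it against your EF Core version before relying on it.

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.HasDefaultSchema("Venues");

    // Sketch: auto-include every navigation declared on Venue, rather than
    // calling .Navigation(...).AutoInclude() once per property.
    // (SetIsEagerLoaded is assumed to be what AutoInclude() sets under the hood.)
    foreach (var navigation in modelBuilder.Entity<Venue>().Metadata.GetNavigations())
        navigation.SetIsEagerLoaded(true);

    base.OnModelCreating(modelBuilder);
}

Note that this is model-wide rather than per-query: every query for Venue would then pay for those includes, whether it needs them or not.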

Solution

  • What is tripping you up is whatever is justifying this code:

    .Where(c => /* alt source ACLs filter */)

    If you can work that filter out so it continues to operate as an IQueryable, then AutoMapper's ProjectTo will translate the mapping expression into SQL, and you can remove any and all eager loading without paying the cost of lazy-load hits.

    Otherwise you will need to ensure that every navigation property AutoMapper touches to build the destination class is eager loaded. This is less ideal because it couples the mapping configuration to each and every query that might use it (the mapping requires certain navigation properties to be eager loaded, or it falls back to expensive lazy-load calls), and the resulting JOINs can in turn produce expensive Cartesian products. You can mitigate some of the JOIN expense by adding .AsSplitQuery() before the AsEnumerable(); the DbContext then executes separate queries to retrieve the related entities, avoiding the Cartesian product of a single query. However, all of those related entities are still loaded and tracked by the DbContext so that they are available when mapper.Map() sifts through them.

    The best thing is to do whatever is possible to remove the `.AsEnumerable()`, even if the resulting expressions look "messy". As long as the query can be fed to SQL and remains understandable and correct, it will be many times faster and far less resource-costly than the alternative. A sketch of both routes follows below.
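
    As a rough sketch of the IQueryable route (not code from the question: the DTO name VenueDto, the Fetch.Eager value, and the translatable ACL predicate are assumptions for illustration), the query site could end up looking something like this:

    using AutoMapper.QueryableExtensions;
    using Microsoft.EntityFrameworkCore;

    // Sketch only: VenueDto, Fetch.Eager and the translatable ACL predicate are
    // assumptions, not part of the original question.
    using var db = new VenuesContext(Fetch.Eager); // no lazy-loading proxies needed
    var query = db.Venues.AsQueryable();

    if (manager != null)
        query = query.Where(v => v.Managers.Contains(manager));

    if (dataCenter != null)
        query = query.Where(v => v.Location.DataCenter.ToLower() == dataCenter.ToLower());

    if (world != null)
        query = query.Where(v => v.Location.World.ToLower() == world.ToLower());

    // If the ACL filter can be expressed as an expression tree, it stays server-side too.
    // query = query.Where(aclPredicateExpression);

    // ProjectTo folds the Venue -> VenueDto mapping into the SQL itself: only the columns
    // the mapping needs are selected, and nothing is tracked or lazy-loaded.
    var mapped = query.ProjectTo<VenueDto>(this._modelMapper.ConfigurationProvider);

    return mapped.ToList();

    If the ACL filter genuinely cannot be translated, the eager-loading fallback described above would look roughly like this (again a sketch, reusing the question's placeholder filter):

    var venues = query
        .Include(v => v.Location)
        .Include(v => v.Openings)
        .AsSplitQuery()   // one SQL statement per include instead of one large Cartesian JOIN
        .AsNoTracking()   // entities are only read to build DTOs, so skip change tracking
        .AsEnumerable()
        .Where(c => true /* alt source ACLs filter */);

    var mapped = venues.Select(v => this._modelMapper.Map<VenueDto>(v));

    Either way, the Venue -> VenueDto map (CreateMap<Venue, VenueDto>(), or CreateProjection<Venue, VenueDto>() for the ProjectTo route) is assumed to already exist in the AutoMapper profile.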