I have an ASP.NET Web API project with a background service that updates my DB, adding new data (~20,000 rows) every 8 hours. But even after the first iteration of queries my app is using 400 MB of memory (though the data in the DB is only ~25 MB), and memory usage keeps increasing after each iteration.
I have two questions:
1. Am I doing something wrong with DbContext so it doesn't dispose correctly?
2. If not, how can I decrease memory usage in my application?
I tried using GC.Collect() and it had some effect, but the app still consumes 660 MB after 3 calls to the DB. Here are screenshots from the memory profiler (the red line marks the end of one iteration):
Memory usage without GC: WithoutGC
Memory usage with GC: WithGC
Here is some code that reproduces the problem.
Program.cs
using EFMemoryExample;
using EFMemoryExample.Data;
using Microsoft.EntityFrameworkCore;
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddControllers();
// Learn more about configuring Swagger/OpenAPI at https://aka.ms/aspnetcore/swashbuckle
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
// Add database
builder.Services.AddDbContextPool<ApplicationDbContext>(options =>
{
    // Change the SQL connection string for your local DB
    options.UseMySQL("server = localhost; user = root; database = test_db; password =admin;");
});
// Add worker
builder.Services.AddHostedService<RefreshWorker>();
var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();
RefreshWorker.cs
public class RefreshWorker : BackgroundService
{
    private readonly IServiceProvider _provider;

    public RefreshWorker(IServiceProvider provider)
    {
        _provider = provider;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            try
            {
                // I am doing an HTTPS request here, but I'll generate some test data instead
                var testResponseData = new List<Item>();
                for (int i = 0; i < 20000; i++)
                {
                    var item = new Item()
                    {
                        Description = "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam",
                        Name = $"{i} Name",
                        Type = "postItem",
                    };
                    testResponseData.Add(item);
                }

                // All new objects from the HTTP request should be added to the DB
                using var scope = _provider.CreateScope();
                using var db = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();
                db.Items.AddRange(testResponseData);
                db.SaveChanges();
            }
            catch (Exception ex)
            {
                Console.Error.WriteLine("Error while refreshing Items: " + ex);
            }

            // It is supposed to be 8 hours, but for this example I left it at 5 seconds
            await Task.Delay(5000, stoppingToken);

            // Uncomment the next line if you want to try with GC
            //GC.Collect();
        }
    }
}
First of all, note that running in Debug mode can heavily affect the app's captured memory footprint - see for example this answer and the links there (especially about the JIT prolonging object lifetimes in Debug mode), so try running your app in Release mode.
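For example, if you launch the app from the command line:

dotnet run --configuration Release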
The next point to consider is that EF Core is not very well suited for bulk inserts. Modification in EF (usually) works via change tracking, so EF needs to keep a local collection of the entities in order to save them later. If you are running in a memory-constrained environment and/or don't want memory spikes for such relatively massive inserts, you can look at some workarounds: for example, creating a transaction, manually chunking the batch, calling SaveChanges per chunk, and clearing the change tracker (or recreating the context) after each chunk; or looking into 3rd-party tools like EFCore.BulkExtensions.
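As an illustration of the chunking approach, here is a minimal sketch based on the code from your question (assuming .NET 6+ for Enumerable.Chunk and EF Core 5+ for ChangeTracker.Clear(); the chunk size of 1000 is arbitrary):

using var scope = _provider.CreateScope();
using var db = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();
using var transaction = db.Database.BeginTransaction();
foreach (var chunk in testResponseData.Chunk(1000))
{
    db.Items.AddRange(chunk);
    // Flush this chunk to the DB, then drop the tracked entities
    // so they can be garbage collected before the next chunk
    db.SaveChanges();
    db.ChangeTracker.Clear();
}
transaction.Commit();

With EFCore.BulkExtensions the whole insert becomes something like db.BulkInsert(testResponseData), which bypasses the change tracker entirely.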