I'm debugging an existing Windows service (written in C#) that needs to be manually restarted every few months because it keeps eating memory.
The service is not very complicated. It requests a JSON file from an external server, which holds products. It then parses this JSON file into a list of products. For each of these products it checks whether the product already exists in the database; if not, the product is added, and if it does exist, its properties are updated.
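In outline, the fetch-and-parse stage looks something like this (a sketch with hypothetical names and a placeholder URL, not my actual code; the per-product database calls are shown further down):

using System.Collections.Generic;
using System.Net;
using Newtonsoft.Json;

// ProductDto is the NHibernate-mapped entity used in the queries below.
public static class ProductFeed
{
    public static List<ProductDto> Download()
    {
        using (var client = new WebClient())
        {
            // Request the JSON file holding the products from the external server...
            string json = client.DownloadString("https://example.com/products.json");

            // ...and parse it into a list of products:
            return JsonConvert.DeserializeObject<List<ProductDto>>(json);
        }
    }
}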
The database is a PostgreSQL database and we use NHibernate v3.2.0 as the ORM.
I've been using JetBrains DotMemory to profile the service when it runs:
The service starts and after 30 seconds it starts doing its work. Snapshot #1 was made before the first run and Snapshot #6 after the fifth run; the other snapshots were also made after a run. As you can see, after each run the number of objects increases by approx. 60k and the memory used grows by a few MB.
Looking closer at Snapshot #6 shows that the retained size is mostly taken up by NHibernate session objects:
Here's my OnStart code:
try
{
    // Trying to fix certificate errors:
    ServicePointManager.ServerCertificateValidationCallback += delegate
    {
        _logger.Debug("Cert validation work around");
        return true;
    };

    _timer = new Timer(_interval)
    {
        AutoReset = false // makes it fire only once, restart when work is done to prevent multiple runs
    };
    _timer.Elapsed += DoServiceWork;
    _timer.Start();
}
catch (Exception ex)
{
    _logger.Error("Exception in OnStart: " + ex.Message, ex);
}
And my DoServiceWork:
try
{
    // Call execute
    var processor = new SAPProductProcessor();
    processor.Execute();
}
catch (Exception ex)
{
    _logger.Error("Error in DoServiceWork", ex);
}
finally
{
    // Next round:
    _timer.Start();
}
In SAPProductProcessor I make two DB calls, both inside a loop. I loop through all products from the JSON file and check whether each product is already in the table using the product code:
ProductDto dto;
using (var session = SessionFactory.OpenSession())
{
    using (var transaction = session.BeginTransaction(IsolationLevel.ReadCommitted))
    {
        var criteria = session.CreateCriteria<ProductDto>();
        criteria.Add(Restrictions.Eq("Code", code));
        dto = criteria.UniqueResult<ProductDto>();
        transaction.Commit();
    }
}
return dto;
And when the productDto is updated I save it using:
using (var session = SessionFactory.OpenSession())
{
    using (var transaction = session.BeginTransaction(IsolationLevel.ReadCommitted))
    {
        session.SaveOrUpdate(item);
        transaction.Commit();
    }
}
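Put together, each run drives those two snippets from a loop over the parsed products, so every product opens one session for the lookup and possibly another for the save. Roughly (a sketch; CreateDtoFrom and CopyProperties are hypothetical helpers):

foreach (var product in productsFromJson)
{
    var dto = GetProductByCode(product.Code); // first snippet: opens its own session
    if (dto == null)
    {
        dto = CreateDtoFrom(product);  // map the JSON product to a new ProductDto
        Save(dto);                     // second snippet: opens another session
    }
    else
    {
        CopyProperties(product, dto);  // update the existing properties
        Save(dto);
    }
}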
I'm not sure how to change the code above to stop the memory and the number of objects from increasing.
I already tried using var session = SessionFactory.GetCurrentSession(); instead of using (var session = SessionFactory.OpenSession()), but that didn't stop the memory from increasing.
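Applied to the lookup, that attempt looked roughly like this (sketch):

ProductDto dto;
var session = SessionFactory.GetCurrentSession(); // instead of OpenSession()
using (var transaction = session.BeginTransaction(IsolationLevel.ReadCommitted))
{
    var criteria = session.CreateCriteria<ProductDto>();
    criteria.Add(Restrictions.Eq("Code", code));
    dto = criteria.UniqueResult<ProductDto>();
    transaction.Commit();
}
return dto;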
Update
In the constructor of my data access class a MultiSessionFactoryProvider sessionFactoryProvider is injected, and the base class is called with base(sessionFactoryProvider.GetFactory("data")). This base class has a method BeginSession:
ISession session = _sessionFactory.GetCurrentSession();
if (session == null)
{
    session = _sessionFactory.OpenSession();
    ThreadLocalSessionContext.Bind(session);
}
And an EndSession:
ISession session = ThreadLocalSessionContext.Unbind(_sessionFactory);
if (session != null)
{
    session.Close();
}
In my data access class I call base.BeginSession at the start and base.EndSession at the end.
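So a typical data access method is shaped like this (a simplified sketch of how I use it, assuming the base class exposes _sessionFactory):

public ProductDto GetProductByCode(string code)
{
    base.BeginSession();  // bind a session to the current thread
    try
    {
        var session = _sessionFactory.GetCurrentSession();
        var criteria = session.CreateCriteria<ProductDto>();
        criteria.Add(Restrictions.Eq("Code", code));
        return criteria.UniqueResult<ProductDto>();
    }
    finally
    {
        base.EndSession();  // unbind and close it again
    }
}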
The suggestion about the Singleton made me have a closer look at my data access class.
I thought that creating this class anew with every run would free the NHibernate memory once it went out of scope. I even added a dispose call in the class's destructor, but that didn't work, or more likely I wasn't doing it correctly. I now keep my data access class in a static field and re-use it. Now my memory doesn't increase anymore and, more importantly, the number of objects stays the same. I just ran the service under DotMemory again for over an hour, triggering around 150 runs, and the memory of the last snapshot is still around 105 MB, the number of objects is still 117k, and my SessionFactory dictionary is now just 4 MB instead of 150 * 4 MB.
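In code, the fix amounts to something like this (sketch; ProductDataAccess stands in for my actual data access class):

public class SAPProductProcessor
{
    // Re-use one data access instance across runs instead of constructing
    // a new one on every timer tick:
    private static readonly ProductDataAccess DataAccess = new ProductDataAccess();

    public void Execute()
    {
        // ... use DataAccess for the per-product lookup and SaveOrUpdate ...
    }
}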