Tags: wpf, entity-framework, optimization, inotifypropertychanged

Application of Entity Framework to a scientific (numerical) processing program


I have a thought-provoking question about using Entity Framework (EF) to persist data. My intended application is not a typical LOB scenario where standard CRUD operations are performed on individual records.

Instead, I would like to use the data stored in my entities to build some matrices by combining data from several entities, and then do some intensive numerical math. Throughout this intensive processing, properties on the entities will be continually accessed and updated.

My concern is that accessing and updating properties on EF entities will severely slow the whole operation compared to plain C# objects, because of lazy loading, the INotifyPropertyChanged/INotifyPropertyChanging event plumbing (the PropertyChanged and PropertyChanging calls), and the SaveChanges calls on the EF context object.

Any thoughts on how to mitigate the speed issues, at the expense of some of the niceties that EF offers?

Regards, LiamV


Solution

  • Don't prematurely optimize. Test it and see. Lazy loading can be turned off, and change tracking isn't a huge overhead (see the first sketch below). Yes, you can use POCOs if need be, but it would be a huge mistake to make such a decision on the basis of imagined performance problems.

    That said, I think it's a good decision from a dependency-management point of view not to make business logic dependent on persistent storage. You don't need to use POCO entities to do this, though; you can project onto business types with any kind of entity, as in the second sketch below.
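
A minimal sketch of how to dial those features back, assuming EF6's DbContext API; the Measurement entity and ExperimentContext are hypothetical names standing in for your own model:

    using System.Data.Entity;

    // Hypothetical entity standing in for the scientific data.
    public class Measurement
    {
        public int Id { get; set; }
        public double Value { get; set; }
    }

    public class ExperimentContext : DbContext
    {
        public DbSet<Measurement> Measurements { get; set; }

        public ExperimentContext()
        {
            // Disable lazy loading and dynamic proxies so reading a property
            // on a loaded entity is a plain field access, not a proxy call.
            Configuration.LazyLoadingEnabled = false;
            Configuration.ProxyCreationEnabled = false;

            // Skip automatic change detection during the intensive loop;
            // call ChangeTracker.DetectChanges() once before SaveChanges().
            Configuration.AutoDetectChangesEnabled = false;
        }
    }

For purely read-only loads, AsNoTracking() bypasses the change tracker entirely, e.g. context.Measurements.AsNoTracking().ToList().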
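
And a sketch of projecting onto a plain business type, reusing the hypothetical context above; the numeric code then works on objects with no EF machinery attached:

    using System.Collections.Generic;
    using System.Linq;

    // Plain business type: no proxies, no tracking, just data.
    public class SampleVector
    {
        public int Id { get; set; }
        public double Value { get; set; }
    }

    public static class Projections
    {
        // EF materializes SampleVector instances directly from the query,
        // so the matrix math never touches a tracked entity.
        public static List<SampleVector> LoadSamples(ExperimentContext context)
        {
            return context.Measurements
                          .Select(m => new SampleVector { Id = m.Id, Value = m.Value })
                          .ToList();
        }
    }

If the results need to be persisted, copy the updated values back onto tracked entities in one pass before calling SaveChanges(), rather than updating tracked entities throughout the computation.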