I'd like to know whether it's bad design to use repositories when converting DTOs to their domain-object counterparts.
I'm building an n-tier web app that has a repository layer and a service layer, with EF4 (Entity Framework 4) as the ORM. The service layer exposes DTO versions of the domain objects. When the service receives a DTO from a consumer, it uses AutoMapper to convert the DTO to the domain object. Now, some of the member properties on the domain object will need to be loaded from the database; for example, I have the classes below:
DTO Version:
public class LogonEventDto
{
    public DateTime Time { get; set; }
    public Guid UserId { get; set; }
}
Domain Version:
public class LogonEvent
{
    public DateTime Time { get; set; }
    public User User { get; set; }
}
Now, when it comes to converting the DTO to the domain version, I need to call the GetById() method on the UserRepository and set the LogonEvent.User property with the result.
Just so you know, I'm currently doing all the conversion logic manually in the service layer.
So, as I asked above: is this a bad design decision, and if so, why?
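For illustration, the manual conversion described above could look like the sketch below. The IUserRepository interface, the ToDomain method name, and the constructor injection are my assumptions, not necessarily how your code is organized:

```csharp
// Hypothetical repository abstraction; the real UserRepository is EF4-backed.
public interface IUserRepository
{
    User GetById(Guid id);
}

public class LogonEventService
{
    private readonly IUserRepository _userRepository;

    public LogonEventService(IUserRepository userRepository)
    {
        _userRepository = userRepository;
    }

    // Manual DTO-to-domain conversion: the UserId carried by the DTO
    // is resolved to a full User entity via the repository.
    public LogonEvent ToDomain(LogonEventDto dto)
    {
        return new LogonEvent
        {
            Time = dto.Time,
            User = _userRepository.GetById(dto.UserId)
        };
    }
}
```

Depending on the repository abstraction, the service only needs the lookup, not the full repository, so a narrower interface could also be injected.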
I think it's common sense to do it like this: you decouple your internal state representation (the domain model) from your service contract (DTOs / data contracts). This way you don't expose any internals, and you can refactor your implementation without affecting your public service contract (assuming the mapping is still possible).
We use this pattern all the time in our SOA(-ish) customer projects. We even have a tool to help generate the mapping code.
I'm not a fan of AutoMapper (although the code is very cool) because it requires you to specify the mapping at runtime (you have to write code to build up the mapping definitions). In my view, the mapping definition is a design-time concern; that is why we built a code-generator tool.
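To make the runtime-configuration point concrete, here is roughly what the AutoMapper setup for the poster's classes would look like, using the classic static Mapper.CreateMap API. Calling a repository inside MapFrom is a sketch for illustration; how userRepository gets into scope is an assumption:

```csharp
// AutoMapper mappings are defined at runtime, in code, typically at startup:
Mapper.CreateMap<LogonEventDto, LogonEvent>()
      // The DTO's UserId is resolved to a full User entity.
      // userRepository is assumed to be available here (e.g. via DI).
      .ForMember(dest => dest.User,
                 opt => opt.MapFrom(src => userRepository.GetById(src.UserId)));

// Later, in the service layer:
LogonEvent domainEvent = Mapper.Map<LogonEvent>(dto);
```

This is exactly the "mapping definition written as code" that a design-time code generator would produce for you instead.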
Hope it helps. Grtx, Marc