For a project, I need to change the communication between the server and the client.
It's a WCF service, hosted in a console application, that sends the whole table content to the client; the client then replaces its entire local cache with the new content.
The client application can work offline (we have a local cache, persisted to disk through serialization, and a transaction list that stores all the changes we have to send to the server once connected).
The problem is that this sync will now happen over a GSM connection, and sending the whole table content is far too heavy.
So either I implement all of this myself (set a modification date, store the last sync date, fetch only updated fields, update my cache), or I find a standard way to do it. I think a dedicated library would do a better job, because its authors have already thought of a lot of scenarios that I certainly haven't.
The sync layer is loosely coupled, so changing the way it works is not a big deal.
My project requirements are:

- WCF
- Should have incremental sync
- Can push changes to the server
- Access/store information offline
- Possibility to do a full resync (corrupted cache, or new computer)
- The data provider on the server side is Entity Framework
- I need to reuse my current custom authentication
- Bonus: some types have a special sync (e.g. I have a database containing small files, but on the client side, files must be put directly in a folder)
I've looked a little around to see what exists.
I found some things about the Microsoft Sync Framework, but it seems very ADO.NET oriented, and I'm not sure it can take data from an entity service and put it in a serialized cache.
So:
Can you recommend a framework or library to do the job? Or do you think that, given my needs, I should implement this myself?
I have used the Microsoft Sync Framework in the past for a similar N-tier, occasionally connected client and it worked a treat. It has moved on since I last used it, but here is how I see it answering your requirements.
WCF
Works fine over WCF, this is how we used it. (How to: Configure N-Tier Synchronization)
Should have incremental sync
Sync Services does this very well, but you may need to add timestamps to your sync tables. Your client database (which in my case was a SQL Server CE database) holds the last timestamp used when you last synced, and it uses this during the next sync to get everything that changed afterwards.
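As a sketch of how the server side can be pointed at those timestamp columns, Sync Services ships a `SqlSyncAdapterBuilder` that generates the incremental SELECT/INSERT/UPDATE/DELETE commands for you. The table and column names below are hypothetical, and deletion tracking assumes you keep a tombstone table:

```csharp
using System.Data.SqlClient;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.Server;

// Sketch only: build a server-side sync adapter for one table.
var connection = new SqlConnection(serverConnectionString);
var builder = new SqlSyncAdapterBuilder(connection)
{
    TableName = "Orders",                      // hypothetical table
    SyncDirection = SyncDirection.Bidirectional,
    CreationTrackingColumn = "CreationDate",   // when rows were inserted
    UpdateTrackingColumn = "LastUpdate",       // the timestamp column you add
    DeletionTrackingColumn = "DeletionDate",   // lives in the tombstone table
    TombstoneTableName = "Orders_Tombstone"
};
SyncAdapter adapter = builder.ToSyncAdapter();
```

The generated commands compare the tracking columns against the last anchor the client sent, which is exactly the "only what changed since last time" behaviour described above.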
Can push changes to the server
Again we did this. There are plenty of hooks to provide custom logic on the server to validate the data.
Access/store information offline
Works fine in a completely disconnected scenario (providing the user has already synced their data first). See Offline Scenarios.
Possibility to do a full resync (corrupted cache, or new computer)
It is the client database that holds all the sync information (you can put knowledge onto the server about what the client has synced if you want to). If you delete the local database then the client will do a full synchronization.
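Because the anchors live in the client database, a full resync can be forced by simply discarding the local store. A minimal sketch, with a hypothetical path:

```csharp
using System.IO;

// Sketch: all sync anchors live in the local SQL CE file, so deleting it
// resets the client's sync knowledge.
string dbPath = Path.Combine(cacheFolder, "client.sdf");
if (File.Exists(dbPath))
    File.Delete(dbPath);

// The next Synchronize() call finds no anchors and downloads everything.
```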
The data provider on the server side is Entity Framework
This is not how I have used it, but the sync providers are fully customizable. We were going to look at writing an NHibernate one; however, ask yourself why you want to do this. Out of the box, Sync Services will let you push and pull data using stored procedures or direct table queries. This is very easy to set up and, because you could be moving lots of data, it is fast to run, and the data is simple to synchronize across the WCF boundary (although we did switch from the XML formatter to the binary formatter for better performance).
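The XML-to-binary switch is a WCF configuration concern. One way to do it (a sketch, not necessarily how we did it at the time) is a custom binding that swaps the text encoder for the binary one:

```csharp
using System.ServiceModel.Channels;

// Sketch: binary message encoding over HTTP instead of text XML.
var binding = new CustomBinding(
    new BinaryMessageEncodingBindingElement(),  // binary framing of messages
    new HttpTransportBindingElement());

// Use this binding on both the service host and the client channel factory
// so both ends agree on the encoding.
```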
What we found was that just because we used entities on the server, they did not necessarily make sense on the client, so we had a whole new set of entities on the client. This also meant we stripped out the data that was not needed on the client in the stored procedures. Again, this was very easy and you do not need to get dirty with ADO.NET. Once the data was on the client, we used NHibernate to read and write to the local database.
Building Custom Sync Providers for the Microsoft Sync Framework
I need to reuse my current custom authentication
If you mean WCF custom authentication, then yes: we had our own custom WCF security token, which worked fine without any impact.
Bonus: Some types have a special sync (e.g. I have a database containing small files, but on the client side, files must be put directly in a folder)
Short answer: I don't know, as there is a file sync provider in the newer framework which I have not used, but you have two other options:
1. During synchronization on the client, you can hook into the point where data is about to be put into the tables; there you have an opportunity to pull the binary data out and write it to the file system instead of the database.
2. Exclude the binary file data from the synchronization and pull it down afterwards using another process. We did this because the packages we were pulling down were quite large, so we used the initial sync to pull down the "metadata" and then used BITS (Background Intelligent Transfer Service, part of Windows) to pull down the files asynchronously.
Introduction to Microsoft Sync Framework File Synchronization Provider
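The first of the two options (intercepting data on the client before it lands in the tables) can be sketched with the client provider's apply-changes hook. The table and column names here are hypothetical:

```csharp
using System.Data;
using System.IO;
using Microsoft.Synchronization.Data.SqlServerCe;

// Sketch: divert incoming binary columns to the file system instead of
// storing them in the local database.
var clientProvider = new SqlCeClientSyncProvider(clientConnectionString);
clientProvider.ApplyingChanges += (sender, e) =>
{
    if (!e.Changes.Tables.Contains("Documents")) return;
    foreach (DataRow row in e.Changes.Tables["Documents"].Rows)
    {
        var bytes = (byte[])row["Content"];
        File.WriteAllBytes(
            Path.Combine(docFolder, (string)row["FileName"]), bytes);
        row["Content"] = new byte[0];  // keep the blob out of the local table
    }
};
```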
[UPDATE]
In response to the questions raised in the comments.
My application should work for different users on the same computer. In my case, I was using isolated storage to guarantee that they work in different locations; is this possible with SQL Server CE?
Our application was deployed via ClickOnce, which provides a separate installation of the application for each user, but I don't think that is what is being asked. SQL CE is just an in-process database: you point the SqlCeEngine at the database file you want to load, so isolated storage is perfect.
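Per-user separation does not even require isolated storage; any per-user folder works, since the database is just a file. A minimal sketch, with a hypothetical file name:

```csharp
using System;
using System.IO;

// Sketch: a per-user database location using the user's local
// application data folder.
string folder = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
    "MyApp");
Directory.CreateDirectory(folder);
string connectionString = "Data Source=" + Path.Combine(folder, "client.sdf");
```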
As far as I know, SQL Server CE is like SQLite; how do you manage schema creation?
You can let Sync Services create the database schema for you if you want, and that will be good enough to get you going, but in the long run you are probably going to have to change the schema at some point. This will probably be during an upgrade, so you are better off thinking about it early on. I dealt with this by not thinking of the database as belonging to Sync Services, but as something Sync Services is told it can use.

When our application started up, it did some housekeeping, like creating the database if it didn't exist or running database scripts if the application had just been upgraded.
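That start-up housekeeping can be sketched as follows; the schema-version check and script-running helpers are hypothetical, only `SqlCeEngine.CreateDatabase` comes from the SQL CE API:

```csharp
using System.IO;
using System.Data.SqlServerCe;

// Sketch: create or upgrade the local database before any sync runs.
string cs = "Data Source=" + dbPath;
if (!File.Exists(dbPath))
{
    new SqlCeEngine(cs).CreateDatabase();   // empty .sdf file
    RunScript(cs, "create_schema.sql");     // your own script runner
}
else if (ReadSchemaVersion(cs) < CurrentSchemaVersion)
{
    RunScript(cs, "upgrade.sql");           // run upgrade scripts
}
```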
Do you know if there is an example of a class implementation for this N-tier sync somewhere? I need to see which interfaces I have to implement.
The current stable release is 2.1, and I was using it just as 2.0 came out, so all my work was in 1.0. Here is a link to the Microsoft Sync Framework 2.1 API on MSDN. I had to use the 1.0 docs to find the examples that drove out our WCF interface, so I don't know how far things have changed, but you can start with this, which defines the interface as:
```csharp
[ServiceContract]
public interface IServiceForSync
{
    [OperationContract()]
    SyncContext ApplyChanges(SyncGroupMetadata groupMetadata, DataSet dataSet, SyncSession syncSession);

    [OperationContract()]
    SyncContext GetChanges(SyncGroupMetadata groupMetadata, SyncSession syncSession);

    [OperationContract()]
    SyncSchema GetSchema(Collection<string> tableNames, SyncSession syncSession);

    [OperationContract()]
    SyncServerInfo GetServerInfo(SyncSession syncSession);
}
```
I suppose it's one service per data type?
No, as you can see from the above, there is only one service. What is synced depends on what you expose on the server and what the client wants to participate in. For example, we had two clients: one was only interested in a small subset of the data and only participated in syncing a couple of tables (called a SyncTable), whereas the other synced all the tables.
There is also the concept of a SyncGroup: these are sets of related changes that should be persisted transactionally because they are all related, i.e. if one fails, they all fail. You can also sync the groups individually without having to sync everything.
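Putting the pieces together, the client-side wiring looks roughly like this; the WCF proxy variable and table names are hypothetical:

```csharp
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServerCe;

// Sketch: a SyncAgent syncing two tables as one transactional group
// through the WCF service shown earlier.
var agent = new SyncAgent
{
    LocalProvider = new SqlCeClientSyncProvider(clientConnectionString),
    RemoteProvider = new ServerSyncProviderProxy(syncServiceClient) // WCF proxy
};

var orderGroup = new SyncGroup("Orders");   // all-or-nothing unit
var orders = new SyncTable("Orders")
{
    SyncDirection = SyncDirection.Bidirectional,
    SyncGroup = orderGroup
};
var orderLines = new SyncTable("OrderLines")
{
    SyncDirection = SyncDirection.Bidirectional,
    SyncGroup = orderGroup
};
agent.Configuration.SyncTables.Add(orders);
agent.Configuration.SyncTables.Add(orderLines);

SyncStatistics stats = agent.Synchronize();
```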
Absolutely. When you do a synchronization you pass a SyncParameter, which contains values that you can use to filter the data being returned to the client. See How to: Filter Rows and Columns.
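A minimal sketch of that filtering; the parameter name must match what the server-side commands expect, and `"@UserId"` is hypothetical:

```csharp
using Microsoft.Synchronization.Data;

// Sketch: pass a filter value that the server-side queries can reference.
agent.Configuration.SyncParameters.Add(
    new SyncParameter("@UserId", currentUserId));

// The server-side SELECT commands can then include: WHERE UserId = @UserId
```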