I have a Tag model in my application:
public class Tag
{
    public Tag() { }

    public Tag(string t)
    {
        Name = t;
        Pictures = new List<Picture>();
    }

    public int Id { get; set; }

    [MaxLength(96), Index(IsUnique = true)]
    public string Name { get; set; }

    public virtual List<Picture> Pictures { get; set; }
}
Is there any way to tell the framework to do ON DUPLICATE DO NOTHING when a record with the same Name already exists? Right now this always throws a duplicate-key exception, and I'd like it to just ignore the duplicate and move on.
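EF6 has no built-in INSERT IGNORE, but the usual approximation is to look the tag up first and only add it when it is missing. A minimal sketch, assuming the `PicsContext` and `Tag` types from the question (the `GetOrAddTag` helper name is illustrative):

```csharp
// Sketch: reuse an existing row instead of inserting a duplicate.
// Assumes an EF6-style PicsContext with a Tags DbSet.
Tag GetOrAddTag(PicsContext pc, string name)
{
    // Reuse the existing row if one already matches the unique Name column.
    var existing = pc.Tags.FirstOrDefault(t => t.Name == name);
    if (existing != null)
        return existing;

    // Otherwise stage a new tag. Note: SaveChanges can still throw if
    // another thread inserts the same name between the check and the save.
    return pc.Tags.Add(new Tag(name));
}
```

Note that this check-then-insert pattern is not safe under concurrent writers; the race is discussed below.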
Edit: I've changed the EF code, which runs across many threads, to this:
async Task SaveQueue()
{
    for (;;)
    {
        try
        {
            await Task.Delay(7500);
            dbUploadProgress.Visibility = Visibility.Visible;
            dbUploadProgress.Value = 0;
            var q = new ObservableCollection<Picture>(manager.Queue.ToList().AsEnumerable());
            manager.Queue.Clear();
            var imgcount = q.Count();
            for (int i = 0; i < imgcount; i++)
            {
                using (var pc = new PicsContext())
                {
                    var tags = q[i].GetTags();
                    pc.Sites.Attach(q[i].Site); // I'm actually not sure how to use this.
                    var curUrl = q[i].Url;
                    if (pc.Pictures.FirstOrDefault(o => o.Url == curUrl) != null)
                        continue;
                    foreach (var t in tags)
                    {
                        var tag = pc.Tags.Where(o => o.Name == t).FirstOrDefault();
                        if (tag == null)
                        {
                            tag = new ViewModel.Tag(t);
                        }
                        q[i].Tags.Add(tag);
                        try
                        {
                            await pc.SaveChangesAsync();
                        }
                        catch { }
                    }
                    pc.Pictures.Add(q[i]);
                    try
                    {
                        await pc.SaveChangesAsync();
                    }
                    catch { }
                    dbUploadProgress.Value = (i + 1) / (double)imgcount;
                }
            }
            q.Clear();
            MainWindow.AddLog(string.Format("Saved {0} pictures.", imgcount));
            dbUploadProgress.Visibility = Visibility.Collapsed;
            await Task.Delay(1500);
        }
        catch { }
        if (isStopped)
            break;
    }
}
The problem still exists, though.
Why don't you run dbContext.Tags.Any(x => x.Name == tag.Name) and skip adding the new tag based on that?
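That check could be dropped into the tag loop from the question like this (a sketch against the question's `PicsContext`; note it is still racy if several threads run it concurrently):

```csharp
// Skip the insert when a tag with this name already exists.
// Assumes the loop variable t and context pc from the question's code.
if (!pc.Tags.Any(x => x.Name == t))
{
    pc.Tags.Add(new ViewModel.Tag(t));
}
```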
I think the problem is that your queue processes messages asynchronously, so you can have two threads trying to insert the same tag: the exists check runs before the other thread's insert has been committed.
You can fix the problem by processing just one queue message at a time (having a single worker/thread process the queue).
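A single-consumer setup could look like this sketch; `BlockingCollection<Picture>` is an assumed replacement for the `ObservableCollection`-based queue in the question:

```csharp
// Sketch: one consumer drains the queue, so only one thread ever
// inserts tags and the check-then-insert race disappears.
var queue = new BlockingCollection<Picture>();

// Producers elsewhere call queue.Add(picture).
Task.Run(() =>
{
    // GetConsumingEnumerable blocks until items arrive and exits
    // when CompleteAdding() is called on the queue.
    foreach (var picture in queue.GetConsumingEnumerable())
    {
        using (var pc = new PicsContext())
        {
            // ...attach tags and save exactly as before; no other
            // thread races this insert because there is one consumer.
        }
    }
});
```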
Another workaround might be to isolate the tag saving in a try/catch and handle the duplicate exception properly (reload the tag from the db and use that one when setting it on the new object).
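The catch-and-reload approach might be sketched as follows, assuming the question's `PicsContext` and loop variable `t` (in EF6 a unique-index violation surfaces as a `DbUpdateException` wrapping the provider error):

```csharp
// Sketch: optimistically insert the tag, and fall back to reloading
// it when the unique index on Name rejects the duplicate.
var tag = new ViewModel.Tag(t);
pc.Tags.Add(tag);
try
{
    pc.SaveChanges();
}
catch (DbUpdateException)
{
    // Another thread won the race: detach our copy and use theirs.
    pc.Entry(tag).State = EntityState.Detached;
    tag = pc.Tags.First(o => o.Name == t);
}
```

Catching only `DbUpdateException` (rather than a bare `catch { }` as in the question's code) keeps genuine errors visible.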