I have a large table in MySQL, currently 10,180 rows, and it grows continuously by another 200-300 rows per day. In my C# app a BackgroundWorker fetches the data into a DataTable, and that DataTable is then loaded into a ListView which displays it.
The problem is that this takes roughly one minute, which is too long for production. I want to fix that and optimize the code.
The code inside the DoWork method:
try
{
    connection.Close();

    if (lvValidate.InvokeRequired)
    {
        lvValidate.BeginInvoke(new MethodInvoker(delegate
        {
            lvValidate.Items.Clear();
        }));
    }
    else
    {
        lvValidate.Items.Clear();
    }

    System.Data.DataTable dt = DataTransferDA("SELECT * FROM workhours ORDER BY ID desc");
    foreach (System.Data.DataRow row in dt.Rows)
    {
        ListViewItem tempLv = new ListViewItem(row["ID"].ToString());
        tempLv.SubItems.Add(row["Date"].ToString());
        tempLv.SubItems.Add(row["Name"].ToString());
        tempLv.SubItems.Add(row["WorkCode"].ToString());
        tempLv.SubItems.Add(row["Assembly"].ToString());
        tempLv.SubItems.Add(row["Tech"].ToString());
        tempLv.SubItems.Add(row["Beginning"].ToString());
        tempLv.SubItems.Add(row["Ending"].ToString());
        tempLv.SubItems.Add(row["Validated"].ToString());
        tempLv.SubItems.Add(row["Validated name"].ToString());

        // one BeginInvoke per row
        lvValidate.BeginInvoke(new MethodInvoker(delegate
        {
            lvValidate.Items.Add(tempLv);
        }));
    }

    if (lvValidate.InvokeRequired)
    {
        lvValidate.BeginInvoke(new MethodInvoker(delegate
        {
            lvValidate.TopItem = lvValidate.Items[topIndex];
        }));
    }
    else
    {
        lvValidate.TopItem = lvValidate.Items[topIndex];
    }

    connection.Close();
}
catch (Exception ex)
{
    MessageBox.Show("Error happened: " + ex.Message);
    connection.Close();
}
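One common way to cut the UI cost here (a sketch against the code above, not a tested drop-in) is to build all the ListViewItems on the background thread and then marshal to the UI thread once, adding them in a single bulk call wrapped in BeginUpdate/EndUpdate so the control repaints only once instead of once per row:

```csharp
// Sketch: same column names and the same "lvValidate"/"DataTransferDA" identifiers
// as the original code; only the marshalling pattern changes.
System.Data.DataTable dt = DataTransferDA("SELECT * FROM workhours ORDER BY ID desc");

var items = new List<ListViewItem>(dt.Rows.Count);
foreach (System.Data.DataRow row in dt.Rows)
{
    var item = new ListViewItem(row["ID"].ToString());
    item.SubItems.Add(row["Date"].ToString());
    item.SubItems.Add(row["Name"].ToString());
    item.SubItems.Add(row["WorkCode"].ToString());
    item.SubItems.Add(row["Assembly"].ToString());
    item.SubItems.Add(row["Tech"].ToString());
    item.SubItems.Add(row["Beginning"].ToString());
    item.SubItems.Add(row["Ending"].ToString());
    item.SubItems.Add(row["Validated"].ToString());
    item.SubItems.Add(row["Validated name"].ToString());
    items.Add(item);
}

// One cross-thread call for the whole batch instead of ~10,000 BeginInvokes.
lvValidate.BeginInvoke(new MethodInvoker(delegate
{
    lvValidate.BeginUpdate();                    // suspend repainting
    lvValidate.Items.Clear();
    lvValidate.Items.AddRange(items.ToArray());  // single bulk add
    lvValidate.EndUpdate();                      // repaint once
}));
```

The per-row BeginInvoke in the original queues thousands of separate messages to the UI thread, and each Items.Add can trigger layout and repaint work; batching removes both costs.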
If I had to guess, it's the ListView that is causing the issue more than the data size.
I ran a small test on a database with 50,284 rows. With a ListView it took forever to show up and my laptop fan kicked into full tilt, even though iterating through the data only took 876 ms. When I switched the test form to a basic ListBox, it displayed within a couple of seconds after loading the data in the constructor.
I used EF Core for the data connection/handling, but I can't see that causing a huge performance difference compared to a raw query.
Data(50284) took 876
var watch = System.Diagnostics.Stopwatch.StartNew();
int i = 0;
using (EFContext ctx = new EFContext())
{
    foreach (data dt in ctx.data.AsNoTracking())
    {
        i++;
        listBox.Items.Add(dt.epc);
    }
}
watch.Stop();
System.Diagnostics.Debug.WriteLine($"Data({i}) took {watch.ElapsedMilliseconds}");
[edit] The EFContext is an Entity Framework Core DbContext, but all it does is wrap the database model. I traced it and it just issued a plain SELECT against the table, the same as your raw query. It simply materializes the rows into a class so you can iterate over and interact with them like normal objects.
I'm currently in Entity Framework mode and already had all the mock-up code I needed for this test; otherwise I would have used a basic SQL connection. The viewer is the issue here, not the data size.
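For row counts in the tens of thousands, a virtual-mode ListView sidesteps the viewer cost entirely: the control only requests the items it actually draws, so item creation no longer scales with row count. A minimal sketch, assuming the `data`/`epc` names from the test above and a hypothetical `listView1` control:

```csharp
// Sketch: virtual-mode ListView. "rows" is assumed to be filled elsewhere
// (e.g. from EF Core as in the test snippet); "listView1" is a placeholder name.
List<data> rows = new List<data>();

listView1.VirtualMode = true;
listView1.VirtualListSize = 0;
listView1.RetrieveVirtualItem += (s, e) =>
{
    // Called only for rows currently visible on screen.
    data d = rows[e.ItemIndex];
    e.Item = new ListViewItem(d.epc); // add SubItems here for more columns
};

// After loading the data (on the UI thread):
rows = ctx.data.AsNoTracking().ToList();
listView1.VirtualListSize = rows.Count; // tells the control how many rows exist
```

In virtual mode you cannot use the Items collection directly; the control asks for items on demand through RetrieveVirtualItem, which is why 50,000 rows render as fast as 50.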