The following web method is used in our legacy web service.
[WebMethod(Description = "Get Bulk Data")]
public string[] getBulkData(string[] arrInput)
{
    string[] arrResults = new string[arrInput.Length];
    using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["mydb"].ToString()))
    {
        conn.Open();
        //some db operations...
        for (int arrIndex = 0; arrIndex < arrInput.Length; arrIndex++) //arrInput size can be very large
        {
            arrResults[arrIndex] = getResult();
        }
    }
    return arrResults;
}
private string getResult()
{
    string r = null; //initialized so the method compiles; the real code assigns it below
    using (SqlConnection thisConn = new SqlConnection(ConfigurationManager.ConnectionStrings["mydb2"].ToString()))
    {
        //some db operations
        //assign values to r
    }
    return r;
}
The web method, getBulkData, takes an array of strings, queries the database, and returns an array of results of the same size. The results come from a private method, getResult(), which also queries the database.
I am not sure whether this is the right implementation, but I tend to think it's not, because the database is queried multiple times, which probably hurts getBulkData's performance. The more elements the input array has, the longer the processing takes.
If I create the thisConn object in getBulkData() and pass it into getResult(), would that make any difference in terms of performance? And what would be the best way to handle a large array in this case?
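To be clear about the first idea, I mean opening the mydb2 connection once in getBulkData() and handing it down, roughly like this (just a sketch, keeping the same placeholder comments as above):

//inside getBulkData(), replacing the current loop:
using (SqlConnection thisConn = new SqlConnection(ConfigurationManager.ConnectionStrings["mydb2"].ToString()))
{
    thisConn.Open();
    for (int arrIndex = 0; arrIndex < arrInput.Length; arrIndex++)
    {
        arrResults[arrIndex] = getResult(thisConn);
    }
}

private string getResult(SqlConnection thisConn)
{
    string r = null;
    //some db operations reusing the passed-in connection instead of opening a new one
    //assign values to r
    return r;
}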
Database transactions take time to process. If you're querying the database hundreds or thousands of times in a loop, your performance will definitely take a hit. If possible, try to modify the GetResult method to only fire once, but save the results in an property of type IEnumerable. That puts all of the records in your computer's memory and will allow you to run queries against it with LINQ. You'll see a substantial performance increase using this approach. You can see an example here.
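Roughly, the idea looks like this. The table and column names (Results, InputKey, ResultValue) are placeholders for whatever getResult() actually queries, and the sketch assumes the usual System.Linq, System.Collections.Generic, System.Data.SqlClient and System.Configuration namespaces:

[WebMethod(Description = "Get Bulk Data")]
public string[] getBulkData(string[] arrInput)
{
    //fetch everything in one query instead of one query per array element
    List<KeyValuePair<string, string>> allRows = new List<KeyValuePair<string, string>>();

    using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["mydb2"].ConnectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT InputKey, ResultValue FROM dbo.Results", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                allRows.Add(new KeyValuePair<string, string>(reader.GetString(0), reader.GetString(1)));
            }
        }
    }

    //with the rows in memory, each lookup is a LINQ query instead of a database call
    string[] arrResults = new string[arrInput.Length];
    for (int arrIndex = 0; arrIndex < arrInput.Length; arrIndex++)
    {
        arrResults[arrIndex] = allRows
            .Where(row => row.Key == arrInput[arrIndex])
            .Select(row => row.Value)
            .FirstOrDefault();
    }
    return arrResults;
}

If the lookups are by key, loading the rows into a Dictionary<string, string> instead of a list would make each lookup constant-time, but even the simple LINQ version replaces thousands of round trips with a single query.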