
In a `SqlCommand`, what timespan does the `CommandTimeout` actually represent?


I am querying a Microsoft SQL database using SqlConnection, SqlCommand and SqlDataReader, much as described in https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/retrieving-data-using-a-datareader

Now, a SqlCommand allows you to set the CommandTimeout, and I am doing that like so (simplified):

using (SqlConnection connection = GetConnection()) {
    connection.Open();
    using (SqlCommand command = connection.CreateCommand()) {

        command.CommandText = query; //My custom SQL query
        command.CommandType = CommandType.Text;

        //Set Timeout
        command.CommandTimeout = timeout.Value; //My custom timeout

        using (SqlDataReader reader = command.ExecuteReader()) {
            while (reader.Read()) {
                //Read row by row and do stuff
            }
        }
    }
}

My question is: what does the timeout actually apply to? Is it

  • the time spent in ExecuteReader()
  • the time spent in Read()
  • both of those
  • something else?

There seems to be no documentation of this specifically, either on MSDN or elsewhere on the web.


Solution

  • It applies to the time spent executing the command against the database, i.e. command.ExecuteReader(), which has completed by the time you start reading the records. You can test that by putting a breakpoint on the while loop after ExecuteReader() and sitting there: you won't hit your timeout.
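
For reference, here is a minimal sketch of that kind of test. The connection string is a placeholder for your environment, and the WAITFOR DELAY query is just a convenient way to make the server take longer than the timeout so you can see where the exception is raised (CommandTimeout is specified in seconds):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;

class TimeoutTest {
    static void Main() {
        //Placeholder connection string - replace with your own
        string connectionString = "Server=.;Database=master;Integrated Security=true;";

        using (SqlConnection connection = new SqlConnection(connectionString)) {
            connection.Open();
            using (SqlCommand command = connection.CreateCommand()) {

                //WAITFOR DELAY makes the server take longer than the timeout,
                //so the failure happens inside ExecuteReader()
                command.CommandText = "WAITFOR DELAY '00:00:10'; SELECT 1;";
                command.CommandType = CommandType.Text;
                command.CommandTimeout = 2; //seconds

                Stopwatch sw = Stopwatch.StartNew();
                try {
                    using (SqlDataReader reader = command.ExecuteReader()) {
                        while (reader.Read()) {
                            //Never reached: the timeout fires during ExecuteReader()
                        }
                    }
                }
                catch (SqlException ex) {
                    Console.WriteLine($"Timed out after {sw.Elapsed.TotalSeconds:F1}s: {ex.Message}");
                }
            }
        }
    }
}

With the delay placed before the SELECT, no rows are available until the delay completes, so the timeout exception surfaces from ExecuteReader() rather than from Read().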