I'm working on a tool that generates and runs SQL against a MySQL database. I'm trying to add the ability to limit the size of an INSERT statement based on a user-specified setting. I found some info about the max_allowed_packet
setting, but I haven't found anything that explains how to calculate a packet's size. The MySQL docs say that "A communication packet is a single SQL statement sent to the MySQL server," but I don't understand how the size of the SQL statement is determined.
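(For what it's worth, I can at least read the server's limit to compare against. This sketch assumes an ADO.NET MySQL driver like Connector/NET; @@max_allowed_packet is MySQL's system variable for the limit:)

cmd.CommandText = "SELECT @@max_allowed_packet";
long serverLimit = Convert.ToInt64(cmd.ExecuteScalar()); // limit in bytes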
I'm passing a SQL statement and some parameters to my driver:
cmd.CommandText = "INSERT INTO myTable VALUES(@int, @date, @text)";
cmd.Parameters.AddWithValue("@int", 1);
cmd.Parameters.AddWithValue("@date", DateTime.Now);
cmd.Parameters.AddWithValue("@text", "how big is this???");
cmd.ExecuteNonQuery();
Is the packet size just the byte count of the SQL string plus the string representations of all the parameter values? Or should I work out each parameter's size from the storage space it would occupy in the database and add that to the byte count of the SQL string? Or something else? I'm not really sure how database drivers work, so any context on that would be helpful too.
Turns out that max_allowed_packet is compared to the size of the resulting query as it is sent across the wire, in ASCII encoding. I substituted the parameter values into the SQL statement in place of the parameter names and took the size of the resulting string. That byte count was a spot-on predictor of whether or not MySQL would reject the statement, so I could decide when to start a new statement based on that number.
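In case it's useful to anyone else, here's a minimal sketch of the estimate. The helper names are mine, the literal formatting is deliberately simplified, and it assumes Connector/NET-style named parameters (@int, @date, @text):

using System;
using System.Collections.Generic;
using System.Text;

static class PacketSizeEstimator
{
    // Render a parameter value roughly the way it would appear in the final SQL text.
    static string ToSqlLiteral(object value)
    {
        if (value == null) return "NULL";
        if (value is string s) return "'" + s.Replace("'", "''") + "'"; // naive escaping, for sizing only
        if (value is DateTime dt) return "'" + dt.ToString("yyyy-MM-dd HH:mm:ss") + "'";
        return Convert.ToString(value);
    }

    // Substitute literals for parameter names, then measure the ASCII byte count.
    // Note: a plain Replace can misfire if one parameter name is a prefix of another.
    public static int EstimateBytes(string sql, IDictionary<string, object> parameters)
    {
        foreach (var p in parameters)
            sql = sql.Replace(p.Key, ToSqlLiteral(p.Value));
        return Encoding.ASCII.GetByteCount(sql);
    }
}

With the example from the question:

int size = PacketSizeEstimator.EstimateBytes(
    "INSERT INTO myTable VALUES(@int, @date, @text)",
    new Dictionary<string, object>
    {
        { "@int", 1 },
        { "@date", DateTime.Now },
        { "@text", "how big is this???" }
    });
// Start a new INSERT once the running total approaches max_allowed_packet.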