Tags: mysql, alter

How to simulate a long-running ALTER query in MySQL


I have a system that executes ALTER queries, and I want to test how it behaves ("panic", "resource usage", "concurrency", ...) while such a query is running.

Is there a way to simulate a long-running ALTER query?

I'm using gh-ost to execute the ALTER.


Solution

  • Here's what I do when I want to test a long-running ALTER TABLE:

    1. Create a table.

    2. Fill it with a few million rows of random data, until it's large enough that ALTER TABLE takes a few minutes. The number of rows required depends on the speed of your machine.

    3. Run ALTER TABLE on it.

    I have not found a better solution, and I've been using MySQL since 2001.

    Here's a trick for filling lots of rows without needing a client app or script:

    mysql> create table mytable (id int auto_increment primary key, t text);
    Query OK, 0 rows affected (0.05 sec)
    
    mysql> insert into mytable (t) select repeat(sha1(rand()), 255) from dual;
    Query OK, 1 row affected (0.02 sec)
    Records: 1  Duplicates: 0  Warnings: 0
    
    mysql> insert into mytable (t) select repeat(sha1(rand()), 255) from mytable;
    Query OK, 1 row affected (0.03 sec)
    Records: 1  Duplicates: 0  Warnings: 0
    
    mysql> insert into mytable (t) select repeat(sha1(rand()), 255) from mytable;
    Query OK, 2 rows affected (0.02 sec)
    Records: 2  Duplicates: 0  Warnings: 0
    
    mysql> insert into mytable (t) select repeat(sha1(rand()), 255) from mytable;
    Query OK, 4 rows affected (0.03 sec)
    Records: 4  Duplicates: 0  Warnings: 0
    
    mysql> insert into mytable (t) select repeat(sha1(rand()), 255) from mytable;
    Query OK, 8 rows affected (0.03 sec)
    Records: 8  Duplicates: 0  Warnings: 0
    
    mysql> insert into mytable (t) select repeat(sha1(rand()), 255) from mytable;
    Query OK, 16 rows affected (0.03 sec)
    Records: 16  Duplicates: 0  Warnings: 0
    

    Now I have 32 rows (16+8+4+2+1+1). I can continue the same query as many times as I want, which doubles the size of the table each time. It doesn't take long before I have a table several gigabytes in size.
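    To estimate the size: SHA1() returns a 40-character hex string, so REPEAT(SHA1(RAND()), 255) produces 10,200 bytes per row, and a million rows is roughly 10 GB of text data. Once the table is big enough, here's a sketch of step 3 (assuming the `mytable` schema above; the index name is made up for illustration):

    ```sql
    -- Check that the table is big enough first:
    SELECT COUNT(*) FROM mytable;

    -- Force a full table rebuild -- a classic long-running ALTER:
    ALTER TABLE mytable ENGINE=InnoDB;

    -- Or add a prefix index on the TEXT column, which requires a full scan:
    ALTER TABLE mytable ADD INDEX idx_t (t(100));
    ```

    Since the question mentions gh-ost: the same clause can be supplied through gh-ost's `--alter` option instead of running the statement directly.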