Tags: mysql, database, indexing, query-optimization

Adding an index to a table with 50 million rows takes too long


I was working on a table with roughly 50 million rows (about 2 GB in size) and needed to optimize its performance. When I added an index on a column through the phpMyAdmin panel, the table got locked, which held up all queries against it in a queue and ultimately forced me to kill/restart them. (And yeah, I forgot to mention I was doing this on production. My bad!)
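For context, the phpMyAdmin action boils down to a plain ALTER TABLE like the one below (table and column names here are made up); depending on the MySQL version and the algorithm used, such a statement can block writes for the whole duration of the index build:

```sql
-- Hypothetical names; on older MySQL versions (or with ALGORITHM=COPY)
-- this rebuilds the table and can block queries until it finishes.
ALTER TABLE orders ADD INDEX idx_customer_id (customer_id);
```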

When I did some research, I found a solution that involves creating a duplicate table. Is there any alternative method?


Solution

  • You may follow these steps (a SQL sketch of the whole procedure follows at the end of this answer):

    1. Create a temp table with the same structure plus the new index.
    2. Create triggers on the original table (for inserts, updates, and deletes) so that changes are replicated to the temp table.
    3. Migrate the data in small batches.
    4. When done, rename the temp table to the original name and drop the old table.

    But as you said you are doing this in production, you need to account for live traffic while dropping one table and swapping in the other.
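Here is a minimal sketch of those steps, assuming a hypothetical table `orders` (columns `id` as primary key, `customer_id`, `amount`) and a new index wanted on `customer_id`:

```sql
-- Step 1: temp table with the same structure plus the new index.
CREATE TABLE orders_new LIKE orders;
ALTER TABLE orders_new ADD INDEX idx_customer_id (customer_id);

-- Step 2: triggers replicate live changes into the temp table.
-- REPLACE handles both fresh rows and rows the batch copy already moved.
CREATE TRIGGER orders_sync_ins AFTER INSERT ON orders FOR EACH ROW
    REPLACE INTO orders_new (id, customer_id, amount)
    VALUES (NEW.id, NEW.customer_id, NEW.amount);

CREATE TRIGGER orders_sync_upd AFTER UPDATE ON orders FOR EACH ROW
    REPLACE INTO orders_new (id, customer_id, amount)
    VALUES (NEW.id, NEW.customer_id, NEW.amount);

CREATE TRIGGER orders_sync_del AFTER DELETE ON orders FOR EACH ROW
    DELETE FROM orders_new WHERE id = OLD.id;

-- Step 3: copy rows in small primary-key ranges; repeat with the next
-- range until the whole table is covered. IGNORE skips rows the
-- triggers have already replicated.
INSERT IGNORE INTO orders_new (id, customer_id, amount)
SELECT id, customer_id, amount
FROM orders
WHERE id BETWEEN 1 AND 10000;

-- Step 4: atomically swap the tables, then clean up. Dropping the old
-- table also drops the triggers attached to it.
RENAME TABLE orders TO orders_old, orders_new TO orders;
DROP TABLE orders_old;
```

This is essentially what tools like Percona's pt-online-schema-change automate, including chunk sizing and throttling based on replication lag, so on production it is usually safer to reach for such a tool than to script the swap by hand.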