Tags: mysql, sql, indexing, query-optimization

Fetching a row by a 'text' column field from a very large MySQL DB


I have a use case where I have to fetch a row from a large table (millions of entries) with a filter on a 'text' column field. The problem is that, in normal operation, the following simple query on this table is timing out:

select * from st_overflow_tbl where uniue_txt_id = '123456'

I ran EXPLAIN and found that there is no index on unique_txt_id. Output below:

"id","select_type","table","type","possible_keys","key","key_len","ref","rows","Extra"
1,"SIMPLE","st_overflow_tbl","ALL",NULL,NULL,NULL,NULL,12063881,"Using where"

I then tried to create an index on this table, but it fails with the following error message:

BLOB/TEXT column 'uniue_txt_id' used in key specification without a key length

The command I was running is this:

alter table st_overflow_tbl add index uti_idx (uniue_txt_id)

Given the scenario above, my questions are:

  1. Is there a better way (than just creating an index on the table) to fetch data when filtering on a 'text' field in a very large table?
  2. How can I create an index on this table? What am I doing wrong?
  3. More specific to my use case: if I cannot create an index on this table, can I still fetch data quickly from it with the same SELECT query (mentioned above)? Let's say the unique_txt_id field will always be numeric (well, it has been so far).

Solution

  • If the column only has a handful of characters, then change the column to an appropriate type, such as varchar(32) (see the conversion sketch at the end of this answer).

    If you don't want to do that, you can specify an initial length to index:

    alter table st_overflow_tbl add index uti_idx (uniue_txt_id(32))
    

    And, finally, if it really is a text column that contains words and so on, then you can investigate a full-text index, sketched below.
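
    A minimal sketch of that full-text route, assuming InnoDB on MySQL 5.6 or later (the index name uti_ft_idx and the IN BOOLEAN MODE choice are mine, not taken from your schema):

    alter table st_overflow_tbl add fulltext index uti_ft_idx (uniue_txt_id);

    select * from st_overflow_tbl
    where match(uniue_txt_id) against ('123456' in boolean mode);

    Keep in mind that MATCH ... AGAINST does word-based matching rather than exact equality, so it only fits if the column really holds free text and not a single identifier.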
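
    If you go with the first suggestion instead, the conversion plus index might look like this (a sketch that assumes no existing value is longer than 32 characters; check with select max(length(uniue_txt_id)) from st_overflow_tbl before converting, and note that the ALTER rebuilds the table, which can take a while on ~12 million rows):

    alter table st_overflow_tbl modify uniue_txt_id varchar(32);

    alter table st_overflow_tbl add index uti_idx (uniue_txt_id);

    With the column stored as varchar, your original equality query can use the index directly and no key length is required.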