I am developing a web-based PHP application (MySQL database) that includes a table of all cities around the world, about 2.5 million entries, all in one table. When I try to fetch details from it, the server (localhost, WAMP) takes a very long time to search and return the data. I have searched the internet for the most efficient way to handle such a large data set. One idea, I think, is to split this table into separate tables per country to make the search easier.
Could anyone suggest how to handle this and share your knowledge?
EDIT:
The table has two columns: country and city. I use WHERE country=XX to narrow the search, but it still takes a long time to process.
DATABASE (table name: cities):
country city
US Lakewood Estates
US Lakewood Estates Mobile Home Park
US Lakewood Falls
US Lakewood Forest
US Lakewood Gardens
US Lakewood Harbor
US Lakewood Heights
US Lakewood Highlands
US Lakewood Hills
.
.
.
PHP CODE:
// $q is assumed to hold the lowercased search term entered by the user.
$data = mysql_query("SELECT * FROM cities WHERE country='" . mysql_real_escape_string($_GET['country']) . "' ORDER BY country")
    or die(mysql_error());

$arr = array();
$i = 0;
while ($row = mysql_fetch_array($data)) {
    // Filter the city names in PHP, keeping at most 10 matches.
    if (strpos(strtolower($row['city']), $q) !== false) {
        $i++;
        $arr[] = $row['city'];
        if ($i == 10) { break; }
    }
}

// Build the JSON response by hand.
echo "[";
for ($x = 0; $x < count($arr); $x++) {
    echo '{"name":"' . $arr[$x] . '"}';
    if ($x != count($arr) - 1) { echo ','; }
}
echo "]";
2.5 million entries is actually not that much. You need to add proper indexes based on how you retrieve your data (the WHERE clause); then it will be much faster.
Add an index on country like this (first you need to change the column's data type to varchar):
ALTER TABLE cities MODIFY COLUMN country varchar(255);
ALTER TABLE cities ADD INDEX country_idx (`country`);
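Once the index exists, you can verify that MySQL actually uses it by running EXPLAIN on the same query the PHP code issues ('US' is just an example value from the data sample above):
EXPLAIN SELECT * FROM cities WHERE country = 'US';
If the key column of the output shows country_idx, the lookup no longer scans all 2.5 million rows but only the rows for the selected country.
Separately, the PHP code fetches every row for the selected country and then filters the city names in PHP. You can also let MySQL do that filtering and stop after 10 matches. Below is a minimal sketch using mysqli with a prepared statement; the q GET parameter, the connection credentials, and the LIKE-based matching are assumptions, not part of the original code:
<?php
// Connection details are placeholders; adjust them to your WAMP setup.
$mysqli = new mysqli('localhost', 'root', '', 'world');

// Assumed inputs: the country code and the city search term.
$country = $_GET['country'];
$q       = '%' . strtolower($_GET['q']) . '%';

// Filter and limit in SQL instead of fetching every row for the country.
$stmt = $mysqli->prepare(
    "SELECT city FROM cities WHERE country = ? AND LOWER(city) LIKE ? LIMIT 10"
);
$stmt->bind_param('ss', $country, $q);
$stmt->execute();
$result = $stmt->get_result(); // requires the mysqlnd driver

$arr = array();
while ($row = $result->fetch_assoc()) {
    $arr[] = array('name' => $row['city']);
}

// json_encode replaces the hand-built JSON string and handles escaping.
echo json_encode($arr);
The country index still does the heavy lifting here: the LIKE on city is only evaluated against the rows of the selected country. Note also that the mysql_* functions used in the question are deprecated and were removed in PHP 7, so mysqli or PDO is the usual replacement.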