Very often I need to prevent a query from being executed when a value, like an email address, already exists.
Until now I searched for the value like this:
$checkemailexist = $X['db']->prepare("SELECT uid FROM userdata WHERE uid = :uid LIMIT 1");
$checkemailexist->execute(array(
    ':uid' => $uid
));
if (!$checkemailexist->fetch()) {
    // INSERT QUERY ..
}
...
The problem is that on a big database with many rows, a string search, even on a VARCHAR column, can cost a lot of time and performance.
So I made the uid column unique and tried something like this:
try {
    $insertuser = $X['dbh']->prepare("
        INSERT INTO user (uid) VALUES (:uid)
    ");
    $insertuser->execute(array(
        ':uid' => $mail
    ));
} catch (PDOException $e) {
    header("Location: ...");
    exit();
}
It's working fine, but could the performance be even worse?
After making the uid column a unique index, you made both of your queries faster. Either query, SELECT or INSERT, has to check the index, and both take roughly the same time to perform.
Adding an index to the column you search on is the real answer to your question. Whether to use a SELECT query or to catch an exception during INSERT is a matter of taste.
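For reference, a unique index like the one you describe can be created with a one-time schema change along these lines (a sketch assuming MySQL, with the table and column names taken from your second snippet; the index name is just illustrative):

// One-time schema change (assuming MySQL): enforce uniqueness on uid
// so the database itself rejects duplicate values.
// "uid_unique" is a hypothetical index name, pick any you like.
$X['dbh']->exec("ALTER TABLE user ADD UNIQUE INDEX uid_unique (uid)");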
However, your second example is rather wrong: you shouldn't handle every PDOException the same way, but only the specific exception related to this very case, as shown in my PDO tutorial.
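For example, a minimal sketch of that idea, assuming MySQL (where a duplicate entry raises SQLSTATE 23000 with driver error code 1062), could look like this:

try {
    $insertuser = $X['dbh']->prepare("INSERT INTO user (uid) VALUES (:uid)");
    $insertuser->execute(array(':uid' => $mail));
} catch (PDOException $e) {
    // 1062 is MySQL's "duplicate entry" driver error code
    if (isset($e->errorInfo[1]) && $e->errorInfo[1] == 1062) {
        header("Location: ...");
        exit();
    }
    // anything else is unexpected, so let it bubble up
    throw $e;
}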
The best way would be to keep the unique index but add the IGNORE keyword to the query and then check the number of affected rows:
$insertuser = $X['dbh']->prepare("INSERT IGNORE INTO user (uid) VALUES (:uid)");
$insertuser->execute([':uid' => $mail]);
if (!$insertuser->rowCount()) {
    header("Location: ...");
    exit();
}
Adding IGNORE suppresses the unique index error, and you can tell whether such a value already exists by simply checking the number of affected rows.