Currently, I'm getting a two-dimensional PHP array of data by downloading a CSV file from a URL and parsing it with str_getcsv().
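For context, the download/parse step looks roughly like this (a simplified sketch; the URL is a placeholder and the exact line splitting is an assumption):

$csv = file_get_contents('https://example.com/ranking.csv'); // placeholder URL
$rows = explode("\n", trim($csv));       // one array entry per CSV line
$array = array_map('str_getcsv', $rows); // parse each line into fields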
After that, I loop through each line and insert the values into the corresponding SQL columns:
foreach ($array as $a => $value) {
    // defining variables, e.g. (assumed mapping from the row's fields):
    // list($ra, $na, $sc, $kd, $wi, $ki, $de, $sh, $ti) = $value;
    $sql = "INSERT INTO Ranking (ra, na, sc, kd, wi, ki, de, sh, ti)
            VALUES ('$ra','$na','$sc','$kd','$wi','$ki','$de','$sh','$ti')";
    $conn->query($sql);
}
$conn->close();
I have about 16,000 lines of data, but weirdly, the insertion stops after exactly 10,999 rows. That's about 929 KB.
I can't find a reason, since my hosting provider states that there is no row limit. I don't think the PHP script stops due to an execution time limit, because after this insertion command the array also gets stored locally, and that happens about 4 seconds after the file is downloaded.
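(Side note: the script's own limit can be checked and, if the host allows it, raised; a minimal sketch, assuming the host doesn't lock these settings:)

echo ini_get('max_execution_time'); // current limit in seconds; 0 means unlimited
set_time_limit(300);                // try to raise it for this request; shared hosts may ignore this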
Does the foreach() loop have some kind of time limit? Does it stop after a certain number of iterations? Would it work if I connected only once, with just ONE SQL query?
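(By ONE query I mean a single multi-row INSERT; the values below are made up for illustration:)

INSERT INTO Ranking (ra, na, sc, kd, wi, ki, de, sh, ti)
VALUES ('1','PlayerA','2000','1.2','10','120','100','800','3600'),
       ('2','PlayerB','1800','0.9','8','90','100','750','3400');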
UPDATE:
Turns out the SQL import time was exceeded:
Fatal error: Maximum execution time of 10 seconds exceeded
in /home/mainhost/public_html/phpMyAdmin/libraries/import/sql.php on line 118
(output of a manual insertion attempt in phpMyAdmin)
What do I do? Split the input into two queries?
Came up with a solution!
Since looping through the rows one by one turned out not to be the right way, and querying the whole array at once exceeded the import time, I combined the two approaches so that batches of e.g. 6,000 rows get inserted at once. This worked just fine for me!
$i = 0;
$sql = "INSERT INTO Ranking (rank, name, score, kd, wins, kills, deaths, shots, time) VALUES ";
foreach ($array as $a => $value) {
    $i++;
    // every 6000 rows: drop the trailing comma, run the batch, start a new one
    if ($i % 6000 == 0) {
        $sql = substr($sql, 0, -1);
        $conn->query($sql);
        $sql = "INSERT INTO Ranking (rank, name, score, kd, wins, kills, deaths, shots, time) VALUES ";
    }
    // defining variables from the current row, as above
    $sql .= "('$ra','$na','$sc','$kd','$wi','$ki','$de','$sh','$ti'),";
}
// run the last (partial) batch
$sql = substr($sql, 0, -1);
$conn->query($sql);
Have a good day!