When I execute an action (an upload POST request) on my website from my home network (slow, less than 1 Mb/s upload), the Chrome dev tools show several XHR requests:
Whereas when I execute the same action from my 4G phone network (faster, around 10 Mb/s upload), the Chrome dev tools show only one (sometimes two) XHR requests:
How and why does it decide to split the request like that? Is it linked to the fact that the transfer encoding is chunked?
Or maybe because of the keep-alive timeout of 5?
The problem is that when there are several XHR requests and I reload the page before all of them have finished with code 200 (which can sometimes take several minutes), I lose some data.
Thanks.
EDIT:
As asked in the comments, here is the code I use for this part (I tried to clean it up a bit by removing the parts not related to the problem, hope it helps).
For example, if I update or add a "tag" to one of my notes (my code is an "Evernote"-type note-taking app), the change is triggered here in my index.php file:
<div class="name_tags"><span><input onfocus="updateidtags(this);" id="tags'.$row['id'].'" type="text" placeholder="Tags ?" value="'.$row['tags'].'"></input></span></div>
then the JavaScript file where all my functions are defined is called:
var editing = 0;      // set to 1 elsewhere in the full code (edit handlers removed here for brevity)
var lastudpdate;      // timestamp of the last edit, also set in the removed handlers
var editingnote = -1; // id of the note currently being edited (-1 = none)

$( document ).ready(function() {
    // poll every second to check whether the current note needs saving
    setInterval(function(){ checkedit(); }, 1000);
});
function updateidtags(el)
{
    // the input id is "tagsXX": keep only the note id after the "tags" prefix
    editingnote = el.id.substr(4);
}
function checkedit(){
    if(editingnote == -1) return;
    var curdate = new Date();
    var curtime = curdate.getTime();
    // save only if a note is being edited and the last change is more than 1s old
    if(editing == 1 && curtime - lastudpdate > 1000){
        updatenote();
    }
}
function updatenote(){
    var headi = document.getElementById("inp" + editingnote).value;
    var ent = $("#entry" + editingnote).html();
    var entcontent = $("#entry" + editingnote).text();
    var doss = document.getElementById("dossier" + editingnote).value;
    var sousdoss = document.getElementById("sousdoss" + editingnote).value;
    var tags = document.getElementById("tags" + editingnote).value;
    $.post("updatenote.php", {pass: app_pass, id: editingnote, dossier: doss, sousdossier: sousdoss, tags: tags, heading: headi, entry: ent, entrycontent: entcontent, now: (new Date().getTime() / 1000) - new Date().getTimezoneOffset() * 60})
        .done(function(data){
            editing = 0;
            if(data == '1'){
                $('#lastupdated' + editingnote).html('Last Saved Today');
            }
            else{
                $('#lastupdated' + editingnote).html(data);
            }
        });
    $('#newnotes').hide().show(0);
}
and finally the updatenote.php file is called (I removed the part where it sends data to the database):
<?php
[...]
$id = $_POST['id'];
$heading = $_POST['heading'];
$entry = $_POST['entry'];
$entrycontent = $_POST['entrycontent'];
$now = $_POST['now'];
$seconds = $now;
$dossier = $_POST['dossier'];
$sousdossier = $_POST['sousdossier'];
$tags = $_POST['tags'];
$filename = "entries/".$dossier."/".$id.".html";
[...]
$str = fread($handle, filesize($filename)); // $handle is opened in the part removed above
if ($entry != ''){
    if (!fwrite($handle, $entry)){
        die("Error while writing to html file");
    }
}
fclose($handle);
?>
If it helps to see what the entire code looks like, I can provide the GitHub repo, but I wasn't sure whether I could do that here.
@Daniel Farrell was right: the problem comes from the fact that I send a new request every second, which is too short for the previous request to finish when the network is slow. Here is the problematic part of the code:
$( document ).ready(function() {
    setInterval(function(){ checkedit(); }, 1000);
});
It executes the checkedit() function every second, which launches updatenote.php to write the note to a file. On a slow network this operation can take more than a second for big files, so setInterval calls updatenote.php again before the previous request has finished, and that creates a new request.
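For reference, here is a minimal sketch of how I could avoid the overlap (the saving flag and the .always() handler are new to this sketch, the rest of the code is assumed unchanged): skip the 1-second tick while a save is still in flight and clear the flag when the request completes:

var saving = false; // true while a $.post to updatenote.php is still in flight

function checkedit(){
    if(editingnote == -1 || saving) return; // skip this tick if a save is still running
    var curtime = new Date().getTime();
    if(editing == 1 && curtime - lastudpdate > 1000){
        saving = true;
        updatenote();
    }
}

// and inside updatenote(), clear the flag once the request finishes (success or error):
$.post("updatenote.php", { /* same fields as above */ })
    .always(function(){
        saving = false;
    });

This way there is never more than one update request pending at a time, even when the upload takes several seconds.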