I'm currently experimenting with prefetching pages to improve the perceived performance of our website, using the code below (requires jQuery).
Only 0.5% of our visitors use dial-up. I'm excluding querystrings (good old times), external links (http) and PDFs (our large files are in this format). On a production site, what other possible negative scenarios apply to prefetching that I haven't considered?
<script type="text/javascript">
$(document).ready(function() {
    // Prefetch the target page when the user hovers over a link.
    $("a").bind("mouseover", function() {
        var href = $(this).attr('href');
        if (href &&                            // skip anchors without an href
            href.indexOf('?') == -1 &&         // skip querystrings
            href.indexOf('http:') == -1 &&     // skip external links
            href.indexOf('.pdf') == -1 &&      // skip large PDF files
            !$(this).hasClass('nopreload')) {
            $.ajax({ url: href, cache: true, dataType: "text" });
        }
    });
    // Navigate on mousedown rather than waiting for the full click.
    $("a").bind("mousedown", function(e) {
        // e.which is 1 for the left button; ignore middle/right clicks.
        if (e.which == 1 && !$(this).hasClass('nopreload')) {
            window.location.href = $(this).attr('href');
            return false;
        }
    });
});
</script>
For certain links, hovering will preload the target page, and mousedown will navigate immediately (rather than after the button is released).
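Individual links can opt out of both behaviours via the nopreload class, e.g. (hypothetical markup):

<a href="/large-download.pdf" class="nopreload">Download</a>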
A right-click will trigger a mousedown event too, so you might want to check the event's data.
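A minimal sketch of that check, using jQuery's normalized event object (e.which is 1/2/3 for left/middle/right):

$("a").bind("mousedown", function(e) {
    if (e.which !== 1) return; // let middle/right clicks behave normally
    // ... left-button handling only ...
});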
I'd guess the speed gain for an HTML source of 20-30 KB is rather low. Your function doesn't preload any image, CSS or JS files, only the raw HTML.
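A rough sketch of extending the prefetch to images referenced by the fetched page (assumes jQuery 1.8+ for $.parseHTML, which parses markup without executing scripts; the callback and selector are illustrative, not part of the original code):

$.ajax({ url: href, cache: true, dataType: "text" }).done(function(html) {
    // Parse the fetched markup into a detached fragment, then request
    // each referenced image so it lands in the browser cache.
    // Note: relative src values resolve against the current page here.
    $("<div>").append($.parseHTML(html)).find("img[src]").each(function() {
        new Image().src = $(this).attr("src");
    });
});

Alternatively, browsers that support resource hints can be told to fetch a page themselves via <link rel="prefetch">.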