I'm using the following plugin to upload large files directly from the browser to AWS S3, without routing them through any server: https://github.com/blueimp/jQuery-File-Upload/wiki/Upload-directly-to-S3
It works fine for small files, but when I use the "maxChunkSize" option for large files, as described in the documentation, each chunk replaces the previously uploaded one. For example, with a 100 MB file uploaded in 10 MB chunks, the upload reports success but only 10 MB (the last chunk) ends up in my S3 bucket.
Please help me with this issue.
Here is the JS code I'm using:
$('#file_upload').fileupload({
    autoUpload: false,
    maxChunkSize: 10000000, // 10 MB => 10000000
    add: function (e, data) {
        $("#upload_btn").off('click').on('click', function (evt) {
            evt.preventDefault();
            data.submit();
        });
    },
    send: function (e, data) {
        // show a loading spinner because now the form will be submitted to Amazon,
        // and the file will be directly uploaded there, via an iframe in the background.
        $('#loading').show();
    },
    fail: function (e, data) {
        console.log('fail');
        console.log(data);
    },
    done: function (event, data) {
        // here you can perform an ajax call to get your documents to display on the screen.
        //alert('complete');
        // hide the loading spinner that we turned on earlier.
        $('#loading').hide();
    },
    progress: function (e, data) {
        var progress = parseInt(data.loaded / data.total * 100, 10);
        $('.progress').css('width', progress + '%');
    }
});
As far as I can tell from the blueimp documentation available online, the plugin is not able to upload files from the browser to AWS S3 in chunks. A plain S3 POST/PUT does not understand Content-Range, so each chunk is stored as a complete object under the same key and overwrites the previous one; real chunked uploads need the S3 Multipart Upload API.
There is a better library for uploading files directly from the browser to AWS S3 in chunks, built on that Multipart Upload API:
https://github.com/TTLabs/EvaporateJS
I've tested it with a 2 GB file and it works fine.
I'll add a working example in a few days.
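In the meantime, here is a minimal sketch of EvaporateJS (v1) usage based on its README. The bucket name, the /sign_auth signer endpoint, and the element IDs are placeholders you'd replace with your own. Note that EvaporateJS does need one small server-side endpoint to sign its requests, because the AWS secret key must never be sent to the browser:

var evaporate = new Evaporate({
    signerUrl: '/sign_auth',        // placeholder: your endpoint that signs requests
    aws_key: 'YOUR_AWS_ACCESS_KEY', // the public access key id, never the secret
    bucket: 'your-bucket-name'      // placeholder bucket name
});

$('#file_upload').on('change', function () {
    var file = this.files[0];
    $('#loading').show();
    evaporate.add({
        name: file.name, // the S3 object key to create
        file: file,
        progress: function (fraction) {
            // fraction is a decimal between 0 and 1
            $('.progress').css('width', parseInt(fraction * 100, 10) + '%');
        },
        complete: function () {
            $('#loading').hide();
        },
        error: function (msg) {
            console.log(msg);
        }
    });
});

The signerUrl endpoint receives the string to sign in a to_sign query parameter and should return the base64-encoded HMAC-SHA1 of that string, computed with your AWS secret key (the AWS Signature Version 2 scheme that EvaporateJS v1 uses). A minimal Node/Express sketch, assuming the secret is in an AWS_SECRET environment variable:

var crypto = require('crypto');
var express = require('express');
var app = express();

app.get('/sign_auth', function (req, res) {
    // Sign the exact string EvaporateJS sends with the AWS secret key.
    var signature = crypto.createHmac('sha1', process.env.AWS_SECRET)
        .update(req.query.to_sign)
        .digest('base64');
    res.send(signature);
});

app.listen(3000);

You will also still need a CORS configuration on the bucket that allows PUT and POST from your origin and exposes the ETag header, which the multipart upload relies on.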