I'm using a modified version of the jQuery multi-file uploader from Railscast #383 (http://railscasts.com/episodes/383-uploading-to-amazon-s3) in a Rails 3 app, and I need to tweak it so that it checks whether a file already exists on S3 and skips re-uploading it if so.
Some background: my users need to upload large batches of files. For instance, one might select 500 4 MB files to upload. Inevitably, their internet connection breaks partway through, and rather than expecting the user to figure out which files uploaded and which didn't, I want them to be able to just select those same 500 files again and have the app be smart enough not to start over from the very beginning.
The most preferable solution would be to include an option in the S3 POST that says not to overwrite an existing file. Next most preferable would be to fire off a GET to S3 to see if the file exists and skip it if so.
Least preferably, I've implemented a solution that synchronously fires off a GET to my Rails app (because I create a database record when each upload completes), but I'm having trouble throttling those requests, and my user says her browser keeps crashing (I'm guessing because it fires all 500 checks at once).
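For the middle option (checking S3 directly before uploading), I was picturing something along these lines. This is only a sketch: checkS3Object and the hard-coded bucket/key prefix are placeholders, and since my bucket uses acl: "private" and has no CORS rule, I assume the request would have to be signed (or the bucket opened up) before it would actually tell me anything useful:

// Sketch only: ask S3 itself whether the object is already there by sending a
// HEAD request to the object's URL. "my-bucket" and the key prefix are
// placeholders; with a private ACL and no CORS configuration this will just
// error out in the browser.
function checkS3Object(fileName, callback) {
  var url = "https://my-bucket.s3.amazonaws.com/uploaded_photos/" + encodeURIComponent(fileName);
  $.ajax({
    url: url,
    type: "HEAD",
    success: function() { callback(true); },  // 200: the object already exists
    error: function() { callback(false); }    // 404/403: treat it as not uploaded yet
  });
}

The idea would be to call checkS3Object(file.name, ...) from the add callback and only data.submit() when it comes back false.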
Relevant application.js:
//= require jquery
//= require jquery_ujs
//= require jquery.ui.all
//= require jquery-fileupload/basic
//= require jquery-fileupload/vendor/tmpl
My form:
<%= s3_uploader_form post: uploaded_photos_path, as: "uploaded_photo[image_url]", photo_shoot_id: @photo_shoot.id do %>
  <%= file_field_tag :file, multiple: true %>
  <%= button_tag 'Upload Photos', id: 'upload_photo_button', type: 'button' %>
<% end %>
My JavaScript:
$(function() {
  $('#s3_uploader').fileupload({
    limitConcurrentUploads: 5,
    add: function(e, data) {
      var file, record_exists, photo_check_url;
      file = data.files[0];
      photo_check_url = "/my_route/has_photo_been_uploaded/" + encodeURIComponent(file.name);
      // THIS IS MY NON-THROTTLING HACK THAT NEEDS REPLACEMENT/IMPROVEMENT.
      // THE CONTROLLER THAT HANDLES THE REQUEST JUST RENDERS AN INLINE STRING OF 'true' OR 'false'.
      $.ajax({
        url: photo_check_url,
        async: false,
        success: function(result) {
          record_exists = result;
        }
      });
      if (record_exists == 'false') {
        data.context = $(tmpl("template-upload", file));
        $('#s3_uploader').append(data.context);
        data.submit();
      }
    },
    progress: function(e, data) { /* irrelevant */ },
    done: function(e, data) { /* irrelevant; it POSTs the object to my database */ },
    fail: function(e, data) { /* irrelevant */ }
  });
});
My Helper:
module S3UploaderHelper
  def s3_uploader_form(options = {}, &block)
    uploader = S3Uploader.new(options)
    form_tag(uploader.url, uploader.form_options) do
      uploader.fields.map do |name, value|
        hidden_field_tag(name, value)
      end.join.html_safe + capture(&block)
    end
  end

  class S3Uploader
    def initialize(options)
      @options = options.reverse_merge(
        id: "s3_uploader",
        aws_access_key_id: ENV["S3_ACCESS_KEY"],
        aws_secret_access_key: ENV["S3_SECRET_ACCESS_KEY"],
        bucket: S3_BUCKET_NAME,
        acl: "private",
        expiration: 10.hours.from_now.utc,
        max_file_size: 20.megabytes,
        as: "file"
      )
    end

    def form_options
      {
        id: @options[:id],
        method: "post",
        authenticity_token: false,
        multipart: true,
        data: {
          post: @options[:post],
          as: @options[:as]
        }
      }
    end

    def fields
      {
        :key => key,
        :acl => @options[:acl],
        :policy => policy,
        :signature => signature,
        "AWSAccessKeyId" => @options[:aws_access_key_id],
      }
    end

    def key
      @key ||= "uploaded_photos/${filename}"
    end

    def url
      "https://#{@options[:bucket]}.s3.amazonaws.com/"
    end

    def policy
      Base64.encode64(policy_data.to_json).gsub("\n", "")
    end

    def policy_data
      {
        expiration: @options[:expiration],
        conditions: [
          ["starts-with", "$utf8", ""],
          ["starts-with", "$key", ""],
          ["content-length-range", 0, @options[:max_file_size]],
          {bucket: @options[:bucket]},
          {acl: @options[:acl]}
        ]
      }
    end

    def signature
      Base64.encode64(
        OpenSSL::HMAC.digest(
          OpenSSL::Digest::Digest.new('sha1'),
          @options[:aws_secret_access_key], policy
        )
      ).gsub("\n", "")
    end
  end
end
After learning more about AJAX (once the idea occurred to me in my second comment), it turns out an acceptable solution was indeed to make the AJAX call asynchronous and place the S3 POST code inside its success callback. That solved my browser responsiveness issues.
$.ajax({
  url: my_route_to_ask_if_photo_was_already_uploaded,
  success: function(result) {
    if (result == 'false') {
      // ...other code
      data.submit();
    }
  }
});
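For reference, this is roughly what the whole add callback looks like with that change folded in (same route, template, and throttling option as in the code above):

add: function(e, data) {
  var file = data.files[0];
  var photo_check_url = "/my_route/has_photo_been_uploaded/" + encodeURIComponent(file.name);

  // Fire the "has this been uploaded?" check asynchronously and only submit
  // the S3 POST from its success callback, so the browser never blocks.
  $.ajax({
    url: photo_check_url,
    success: function(result) {
      if (result == 'false') {
        data.context = $(tmpl("template-upload", file));
        $('#s3_uploader').append(data.context);
        data.submit();
      }
    }
  });
}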