ruby-on-rails | amazon-s3 | shrine | amazon-elastic-transcoder

Amazon Elastic Transcoder with Shrine


I am working on an app that requires uploading videos. I added Shrine with S3 storage.

Up to this point everything is working. Now I need to transcode the videos, so I added the following code to the video_uploader file:

class VideoUploader < Shrine

  plugin :processing
  plugin :versions

  process(:store) do |io|

    transcoder = Aws::ElasticTranscoder::Client.new(
      access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
      secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
      region:            'us-east-1',
    )

    pipeline = transcoder.create_pipeline(
      name:          "name",
      input_bucket:  "bucket",
      output_bucket: "bucket",
      role:          "arn:aws:iam::XXXXX:role/Elastic_Transcoder_Default_Role",
    )

    pipeline_id = pipeline.pipeline.id

    # transcode the cached upload, writing the result under the same
    # name into the store folder
    transcode_hd = transcoder.create_job(
      pipeline_id: pipeline_id,
      input: {
        key:          "cache/" + io.id,
        frame_rate:   "auto",
        resolution:   "auto",
        aspect_ratio: "auto",
        container:    "auto",
      },
      outputs: [{
        key:       "store/" + io.id,
        preset_id: "1351620000001-000010",
      }]
    )

  end

end

The transcoding works: it takes the new file uploaded to the cache folder and puts the transcoded result in the store folder under the same name.

The issue now is attaching this file to the record in the database. As it stands, the record is updated with a different name, and a new 0 MB file is created in the store folder.

How can I attach the result of the processing as Shrine's uploaded file for storage?


Solution

  • The process(:store) block expects you to return a file for Shrine to upload to permanent storage, so this flow won't work with Amazon Elastic Transcoder, because Elastic Transcoder is now the one that uploads the cached file to permanent storage.
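
    For contrast, the usual process(:store) contract looks something like this (a minimal illustrative sketch, not the code above): whatever the block returns is what Shrine uploads.

    class VideoUploader < Shrine
      plugin :processing

      process(:store) do |io, context|
        io.download # returns a Tempfile; this return value gets uploaded to :store
      end
    end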

    You can delay the transcoding request into a background job, poll the transcoding job every N seconds, and once it finishes build a Shrine::UploadedFile from the result and update the record. Something like the following should work:

    # superclass for all uploaders that use Amazon Elastic Transcoder
    class TranscoderUploader < Shrine
      plugin :backgrounding
      Attacher.promote { |data| TranscodeJob.perform_async(data) }
    end
    
    class VideoUploader < TranscoderUploader
      plugin :versions
    end
    
    class TranscodeJob
      include Sidekiq::Worker
    
      def perform(data)
        attacher = TranscoderUploader::Attacher.load(data)
        cached_file = attacher.get #=> #<Shrine::UploadedFile>
    
        # create transcoding job, use `cached_file.id`
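        # a sketch of this step; it assumes the pipeline already exists,
        # and PIPELINE_ID and the preset ID are placeholders for your own values
        transcoder = Aws::ElasticTranscoder::Client.new(region: "us-east-1")
        job = transcoder.create_job(
          pipeline_id: PIPELINE_ID,
          input:       { key: "cache/#{cached_file.id}" },
          outputs:     [{ key: "store/#{cached_file.id}",
                          preset_id: "1351620000001-000010" }]
        ).job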
    
        transcoder.wait_until(:job_complete, id: job.id)
        response = transcoder.read_job(id: job.id)
        output = response.job.output
    
        versions = {
          video: attacher.shrine_class::UploadedFile.new(
            "id" => cached_file.id,
            "storage" => "store",
            "metadata" => {
              "width" => output.width,
              "height" => output.height,
              # ...
            }
          ),
          # ...
        }
    
        attacher.swap(versions)
      end
    end
    

    If you're by any chance interested in making a Shrine plugin for Amazon Elastic Transcoder, take a look at shrine-transloadit, which provides an integration for Transloadit. Transloadit uses practically the same flow as Amazon Elastic Transcoder, except that it works with webhooks rather than polling for the response.
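
    With webhooks, the polling step disappears: the transcoding service notifies your app when the job finishes, and the notification handler does the swap. A rough sketch of that shape (the route, params, and payload here are hypothetical, not Elastic Transcoder's actual notification format, which goes through SNS):

    # config/routes.rb: post "/transcoding/complete" => "transcoding#complete"
    class TranscodingController < ApplicationController
      def complete
        # assumes the attacher data was round-tripped through the job's
        # user-defined metadata when the transcoding job was created
        attacher = TranscoderUploader::Attacher.load(JSON.parse(params[:attacher_data]))
        cached_file = attacher.get

        versions = {
          video: attacher.shrine_class::UploadedFile.new(
            "id"       => cached_file.id,
            "storage"  => "store",
            "metadata" => {},
          )
        }

        attacher.swap(versions)
        head :ok
      end
    end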