Tags: android, caching, storage, diskspace, android-assets

Simulating Camera with JPEGs in Android


I have an application where we want the user to be able either to run it using the camera on their phone (feeding the camera bytes into our library) or to use a JPEG sequence and feed those images in instead. We plan to ship with a few different image sequences, which will be packaged into the assets folder on the device. Our API supports loading images either from a file path or from the actual bytes of the image.

What is the best way to go about feeding in the JPEG images?

As I see it, I have two choices:

  1. Load the bytes of each JPEG directly out of the AssetManager and pass them to my API as bytes. This means I would constantly be pulling the bytes of each file out of the AssetManager. (I intend to do some profiling of what this costs, but maybe someone already knows that the cost is too high; see the sketch after this list.)
  2. Copy all of the images into the cache folder and then load them by filename. It seems to me that this approach is better performance-wise, but it requires more overhead up front and will obviously take up much more space on the user's device. (For now we have a limitation that each image sequence is less than 30 seconds, so 30 fps * 30 seconds = 900 images.)
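
For option #1, a minimal sketch of pulling a single frame's bytes straight out of the AssetManager might look like the following. The class name, asset path scheme, and buffer size are assumptions for illustration; the returned bytes would then be handed to whatever byte-based entry point the library exposes.

```java
import android.content.Context;
import android.content.res.AssetManager;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class AssetFrameSource {

    // Reads one JPEG frame out of the assets folder and returns its raw bytes,
    // e.g. readFrame(context, "sequences/demo/frame_0001.jpg").
    // The folder and file-naming scheme are assumptions for this sketch.
    public static byte[] readFrame(Context context, String assetPath) throws IOException {
        AssetManager assets = context.getAssets();
        try (InputStream in = assets.open(assetPath);
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            return out.toByteArray();
        }
    }
}
```

Timing this read over a full 900-frame sequence should answer the profiling question in option #1 directly.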

Thanks for your help.


Solution

  • Ended up using approach #2 (copying the images into the cache folder and loading them by filename), which worked out fine in the end; a sketch of the copy step is below.
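
A hedged sketch of that copy step, assuming the frames live under an assets subdirectory such as "sequences/demo" (a hypothetical name) and that the library accepts plain file paths:

```java
import android.content.Context;
import android.content.res.AssetManager;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;

public class AssetCacheCopier {

    // Copies every file under the given assets directory into the app's cache
    // directory and returns the copied files in name order, ready to be passed
    // to the library by file path. The directory name is an assumption.
    public static File[] copySequenceToCache(Context context, String assetDir) throws IOException {
        AssetManager assets = context.getAssets();
        File outDir = new File(context.getCacheDir(), assetDir);
        if (!outDir.exists() && !outDir.mkdirs()) {
            throw new IOException("Could not create " + outDir);
        }
        String[] names = assets.list(assetDir);
        if (names == null) {
            return new File[0];
        }
        for (String name : names) {
            File outFile = new File(outDir, name);
            if (outFile.exists()) {
                continue; // already copied on a previous run
            }
            try (InputStream in = assets.open(assetDir + "/" + name);
                 OutputStream out = new FileOutputStream(outFile)) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
        }
        File[] frames = outDir.listFiles();
        if (frames == null) {
            return new File[0];
        }
        Arrays.sort(frames);
        return frames;
    }
}
```

Copying only when a file does not already exist keeps the up-front cost to a single pass per sequence, at the price of the extra disk space noted in the question.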