Tags: ios, animation, video-streaming, http-live-streaming

Interactive video in iOS: Is it possible to trigger specific actions in code by tapping discrete parts of the video?


I am asking this because I couldn't find the answer anywhere, at least with the keywords I could think of. The most relevant question/answer I've found is (Create interactive videos in iPad - An app for product demo). The user Jano replied:

The easiest way to create interactive videos for iOS is to use Apple's HTTP Live Streaming technology. You have to create a video, embed metadata, play it using MPMoviePlayerController or AVPlayerItem, and then display clickable areas in response to metadata notifications.

Metadata should contain coordinates for the element you are tracking, e.g. a dress, and an identifier for the product. You overlay this info with a clickable subview that reveals more information about the product. There are several applications of this kind in iTunes; here is one.

Once you have a working product and weeks' worth of video, the most difficult part is performing the motion tracking with as little human interaction as possible. One approach is to use Adobe After Effects; another is to code your own solution based on OpenCV.
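To make sure I understand the quoted approach, here is a rough Swift sketch of how I imagine the playback side would look: play the stream with AVPlayer, receive timed metadata through an AVPlayerItemMetadataOutput, and drop a tappable overlay at the coordinates carried in the metadata. The stream URL and the payload shape (x, y, id encoded as JSON) are assumptions on my part; they depend entirely on how the video is authored.

```swift
import UIKit
import AVFoundation

// Minimal sketch, assuming the HLS stream carries timed metadata whose data
// payload is JSON with the on-screen position and an element identifier.
final class InteractiveVideoController: UIViewController, AVPlayerItemMetadataOutputPushDelegate {

    private var player: AVPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Hypothetical HLS URL; replace with the real stream.
        let url = URL(string: "https://example.com/stream/index.m3u8")!
        let item = AVPlayerItem(url: url)

        // Ask AVFoundation to push timed metadata groups to us as they arrive.
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
        metadataOutput.setDelegate(self, queue: .main)
        item.add(metadataOutput)

        player = AVPlayer(playerItem: item)
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)
        player.play()
    }

    // Called whenever the stream delivers a batch of timed metadata.
    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        for group in groups {
            for metadataItem in group.items {
                // Assumed convention: JSON payload with view-space coordinates.
                guard let data = metadataItem.dataValue,
                      let payload = try? JSONDecoder().decode(HotspotPayload.self, from: data) else { continue }
                showHotspot(at: CGPoint(x: payload.x, y: payload.y), identifier: payload.id)
            }
        }
    }

    private func showHotspot(at point: CGPoint, identifier: String) {
        // Overlay a clickable subview centered on the tracked coordinates.
        let button = UIButton(frame: CGRect(x: point.x - 22, y: point.y - 22, width: 44, height: 44))
        button.accessibilityIdentifier = identifier
        button.backgroundColor = UIColor.red.withAlphaComponent(0.3)
        button.addTarget(self, action: #selector(hotspotTapped(_:)), for: .touchUpInside)
        view.addSubview(button)
    }

    @objc private func hotspotTapped(_ sender: UIButton) {
        // React to the tap, e.g. show product info or switch videos.
        print("Tapped element \(sender.accessibilityIdentifier ?? "?")")
    }
}

// Hypothetical payload shape; adjust to whatever the encoder actually embeds.
private struct HotspotPayload: Decodable {
    let x: Double
    let y: Double
    let id: String
}
```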

The example I've found concerning this technology (http://vimeo.com/16455248) showed NSButtons being added automatically when the video reaches the embedded meta-tags. My client wants an interactive video of the human body that pauses at a specific time (maybe using the meta-tags) and reacts to the user tapping an element in the video (e.g. imagine a pill inside the stomach; tapping the pill triggers another pre-rendered video, in a way that is not transparent to the user). I have thought about animations using Cocos2D or OpenGL ES, but I lack people who master these technologies. A rough sketch of what I have in mind follows below.
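For reference, here is how I imagine the pause-and-tap part could be sketched with plain AVFoundation, without Cocos2D or OpenGL ES. The pause time, the hit rectangle, and the file names are made up for illustration; the hit rectangle would normally come from whatever tracking data is embedded.

```swift
import UIKit
import AVFoundation

// Minimal sketch: pause the main clip at a known time, accept a tap inside a
// predefined region, and swap in a second pre-rendered clip.
final class PillSceneController: UIViewController {

    private let player = AVPlayer()
    private var playerLayer: AVPlayerLayer!
    private var timeObserver: Any?

    // Region of the view where the pill is visible while the video is paused
    // (assumed values for illustration).
    private let pillHitRect = CGRect(x: 140, y: 300, width: 80, height: 80)

    override func viewDidLoad() {
        super.viewDidLoad()

        playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)

        // Hypothetical main clip bundled with the app.
        if let url = Bundle.main.url(forResource: "body_main", withExtension: "mp4") {
            player.replaceCurrentItem(with: AVPlayerItem(url: url))
        }

        // Pause automatically when playback reaches the interactive moment (12 s here).
        let pauseTime = CMTime(seconds: 12, preferredTimescale: 600)
        timeObserver = player.addBoundaryTimeObserver(forTimes: [NSValue(time: pauseTime)],
                                                      queue: .main) { [weak self] in
            self?.player.pause()
        }

        view.addGestureRecognizer(UITapGestureRecognizer(target: self,
                                                         action: #selector(handleTap(_:))))
        player.play()
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        // Only react while paused and when the tap lands on the pill.
        guard player.timeControlStatus == .paused,
              pillHitRect.contains(recognizer.location(in: view)) else { return }

        // Swap in the second pre-rendered clip and resume playback.
        if let url = Bundle.main.url(forResource: "pill_detail", withExtension: "mp4") {
            player.replaceCurrentItem(with: AVPlayerItem(url: url))
            player.play()
        }
    }

    deinit {
        if let observer = timeObserver { player.removeTimeObserver(observer) }
    }
}
```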

I didn't quite understand the "motion tracking" reference in the quote above. Jano mentions Adobe After Effects and OpenCV. Is this motion tracking something like a UIGestureRecognizer? Does it track parts of the video itself, or motions initiated by the user, such as taps?

I hope I've stated the question as clearly as possible. Thank you in advance.


Solution

  • This question is a year old, but I can give you insight into the After Effects part. AE has a feature where you can define an area in a video frame and the software will track that area across the timeline, logging its coordinates at specific intervals. For example, in a video of a person riding a mountain bike, you could select an area around their helmet and AE will log the coordinates of the helmet throughout the timeline.

    Since Flash was the most likely target for interactive video, the typical workflow would encode this coordinate data into a Flash video as cue point events (this is the only method I have personally experienced). According to some googling, the data is stored in key frames and can be extracted using scripts.

    More info: http://helpx.adobe.com/after-effects/using/tracking-stabilizing-motion-cs5.html

    Here's a manual method for extracting the data:

    In the timeline panel, select the footage and press the U key; all track point keyframes will show up. Here's the magic: select the Feature Center property of each track point and copy it (Cmd+C on Mac or Ctrl+C on PC).

    Now open any text editor, such as TextMate or Notepad, and paste the data (Cmd+V on Mac or Ctrl+V on PC). A sketch of how that pasted data could be consumed on the iOS side follows below.
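    If you then need those coordinates in the app, a small parser like the sketch below would do. It assumes each data row reduces to whitespace-separated frame/x/y values and simply skips header or footer lines that don't parse; the exact clipboard layout varies between AE versions, so verify it against what you actually get. The 30 fps figure in the usage example is also an assumption.

```swift
import Foundation

// A point tracked by After Effects at a given frame.
struct TrackPoint {
    let frame: Int
    let x: Double
    let y: Double
}

// Minimal sketch: turn pasted keyframe text into usable coordinates,
// assuming data rows of the form "frame x y" and ignoring everything else.
func parseTrackPoints(from pasted: String) -> [TrackPoint] {
    var points: [TrackPoint] = []
    for line in pasted.components(separatedBy: .newlines) {
        let fields = line.split(whereSeparator: { $0 == " " || $0 == "\t" })
        guard fields.count >= 3,
              let frame = Int(fields[0]),
              let x = Double(fields[1]),
              let y = Double(fields[2]) else { continue }
        points.append(TrackPoint(frame: frame, x: x, y: y))
    }
    return points
}

// Usage: convert a tracked frame to a playback time at an assumed 30 fps,
// which is what you need to line an overlay up with the video.
let sample = """
0\t412.0\t233.5
1\t413.2\t234.1
2\t414.9\t235.0
"""
for point in parseTrackPoints(from: sample) {
    let seconds = Double(point.frame) / 30.0
    print("t=\(seconds)s -> (\(point.x), \(point.y))")
}
```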