I have an MP4 URL with only video, plus a separate audio track for it. I can play one or the other by changing the "main" stream URL and the corresponding content type, but obviously I want both, not one or the other.
There is a core URL at (silly video) https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd
You can get the video-only MP4 at https://v.redd.it/3hyw7hwoajn21/DASH_720, and its corresponding audio track is at https://v.redd.it/3hyw7hwoajn21/audio
If I cast the MP4 with the iOS Cast SDK, it plays fine, but with no audio:
let url = URL(string: "https://v.redd.it/3hyw7hwoajn21/DASH_720")!
let mediaInfoBuilder = GCKMediaInformationBuilder(contentURL: url)
mediaInfoBuilder.contentID = url.absoluteString
mediaInfoBuilder.streamType = .buffered
mediaInfoBuilder.streamDuration = TimeInterval(75)
mediaInfoBuilder.contentType = "video/mp4"
mediaInfoBuilder.metadata = metadata
let mediaInfo = mediaInfoBuilder.build()
So I try to add the audio track before calling build(), attempting to follow the documentation here:
mediaInfoBuilder.mediaTracks = [GCKMediaTrack(identifier: 98911, contentIdentifier: nil, contentType: "audio/mp4", type: GCKMediaTrackType.audio, textSubtype: GCKMediaTextTrackSubtype.unknown, name: "Fun time fun", languageCode: "en", customData: nil)]
But the result is the same: no audio.
Am I doing this wrong?
The audio and video streams have to be in the same manifest for the SDK to support them; otherwise this is not supported. In general, Chromecast hardware is limited to a single media element, so you cannot play two separate streams at once. Some apps have managed to add sound effects while displaying a book, possibly using WebAudio, but that is done entirely inside the receiver app.
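Given that constraint, the practical fix is to cast the DASH manifest itself, since it already references both the video and audio representations in one place. A minimal sketch (assuming the v.redd.it server sends the CORS headers that adaptive streaming on Chromecast requires, which you would need to verify):

let url = URL(string: "https://v.redd.it/3hyw7hwoajn21/DASHPlaylist.mpd")!
let mediaInfoBuilder = GCKMediaInformationBuilder(contentURL: url)
mediaInfoBuilder.streamType = .buffered
// Point the receiver at the manifest, not an individual representation,
// so it can fetch and mux both the video and audio streams itself.
mediaInfoBuilder.contentType = "application/dash+xml"
mediaInfoBuilder.metadata = metadata
let mediaInfo = mediaInfoBuilder.build()

Note that the default receiver plays DASH only when the origin allows cross-origin requests; if v.redd.it does not set those headers, you would need a custom receiver or a proxy that adds them.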