I need to use MediaCodec to decode online streamed audio (e.g. Shoutcast). The problem is that I have no information about the stream's format: the only thing I can get from the stream's response headers is the MIME/Content-Type.
MediaCodec has to be configured before MediaCodec::start() is invoked, which means a MediaFormat object has to be filled out somehow. Is it possible to make MediaCodec configure itself from the stream data? Or what should I do instead?
If you think the question is too broad, please leave a comment and let me know what exactly I should change. Just marking it "too broad" tells me nothing.
Streamed media, audio or video, is typically delivered in a 'container' such as MP4, AVI, or MP3.
These containers include header information that describes the individual streams within the container, including the codecs they are encoded with.
If you are familiar with ffmpeg, you can use the associated probe tool, ffprobe, to look at an mp4 and view the streams. An example output for a video file is:
ffprobe version 3.3.1 Copyright (c) 2007-2017 the FFmpeg developers
  built with llvm-gcc 4.2.1 (LLVM build 2336.11.00)
  configuration: --prefix=/Volumes/Ramdisk/sw --enable-gpl --enable-pthreads --enable-version3 --enable-libspeex --enable-libvpx --disable-decoder=libvpx --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-avfilter --enable-libopencore_amrwb --enable-libopencore_amrnb --enable-filters --enable-libgsm --enable-libvidstab --enable-libx265 --disable-doc --arch=x86_64 --enable-runtime-cpudetect
  libavutil      55. 58.100 / 55. 58.100
  libavcodec     57. 89.100 / 57. 89.100
  libavformat    57. 71.100 / 57. 71.100
  libavdevice    57.  6.100 / 57.  6.100
  libavfilter     6. 82.100 /  6. 82.100
  libswscale      4.  6.100 /  4.  6.100
  libswresample   2.  7.100 /  2.  7.100
  libpostproc    54.  5.100 / 54.  5.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'BigBuckBunny_320x180.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: mp41
    creation_time   : 1970-01-01T00:00:00.000000Z
    title           : Big Buck Bunny
    artist          : Blender Foundation
    composer        : Blender Foundation
    date            : 2008
    encoder         : Lavf52.14.0
  Duration: 00:09:56.46, start: 0.000000, bitrate: 867 kb/s
    Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 320x180 [SAR 1:1 DAR 16:9], 702 kb/s, 24 fps, 24 tbr, 24 tbn, 48 tbc (default)
    Metadata:
      creation_time   : 1970-01-01T00:00:00.000000Z
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 159 kb/s (default)
    Metadata:
      creation_time   : 1970-01-01T00:00:00.000000Z
      handler_name    : SoundHandler
You can see both the audio and video codecs in their respective streams.
The easiest way to play an audio or video stream on Android is to use MediaPlayer, as it takes care of inspecting the container and selecting the correct codec for you: https://developer.android.com/reference/android/media/MediaPlayer.html
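As a rough sketch of that approach (the stream URL here is a placeholder, and error handling is omitted), playing a remote stream with MediaPlayer might look like:

```java
import android.media.AudioManager;
import android.media.MediaPlayer;

// Minimal sketch: play a remote audio stream with MediaPlayer.
// MediaPlayer probes the container itself, so no MediaFormat is needed.
public class StreamPlayer {
    public void play() throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.setDataSource("http://example.com/stream"); // placeholder URL
        // prepareAsync() buffers off the UI thread; start once prepared
        player.setOnPreparedListener(MediaPlayer::start);
        player.prepareAsync();
    }
}
```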
I am guessing that this does not meet your needs for some reason, so you will most likely want to use MediaExtractor and then MediaCodec.
MediaExtractor 'extracts' the track from the container so you can do whatever you want to do with it. There is a good example on the documentation page at the time of writing (https://developer.android.com/reference/android/media/MediaExtractor.html), reproduced here:
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(...);
int numTracks = extractor.getTrackCount();
for (int i = 0; i < numTracks; ++i) {
    MediaFormat format = extractor.getTrackFormat(i);
    String mime = format.getString(MediaFormat.KEY_MIME);
    if (weAreInterestedInThisTrack) {
        extractor.selectTrack(i);
    }
}
ByteBuffer inputBuffer = ByteBuffer.allocate(...);
while (extractor.readSampleData(inputBuffer, ...) >= 0) {
    int trackIndex = extractor.getSampleTrackIndex();
    long presentationTimeUs = extractor.getSampleTime();
    ...
    extractor.advance();
}
extractor.release();
extractor = null;
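This also answers the original question: you don't have to build the MediaFormat by hand, because MediaExtractor derives it from the container and you can pass it straight to MediaCodec. A minimal sketch (synchronous API, error handling omitted; the helper name is mine):

```java
import java.io.IOException;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;

// Sketch: configure a decoder directly from the extractor's track format.
public static MediaCodec createDecoderForTrack(MediaExtractor extractor, int trackIndex)
        throws IOException {
    MediaFormat format = extractor.getTrackFormat(trackIndex);
    String mime = format.getString(MediaFormat.KEY_MIME); // e.g. "audio/mp4a-latm"
    MediaCodec codec = MediaCodec.createDecoderByType(mime);
    // null surface and null crypto, flags = 0: plain audio decoding
    codec.configure(format, null, null, 0);
    codec.start();
    return codec;
}
```

After start() you feed the buffers returned by extractor.readSampleData() into the codec's input buffers and drain its output buffers to an AudioTrack.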