Tags: gstreamer, http-live-streaming

Adding audio to an appsrc video pipeline


I'm using appsrc to generate an HLS stream; this is my working pipeline:

appsrc->videoconvert->openh264enc->h264parse->mpegtsmux->hlssink

However, I'd like to add some audio generated by audiotestsrc before mpegtsmux, which would look like the following:

audiotestsrc->lamemp3enc->mpegtsmux

audiotestsrc and lamemp3enc have "always" pads, so I link the two just as I do the video elements.

When it comes to linking lamemp3enc's "always" src pad to mpegtsmux's "request" sink_%d pad, the return values report no problem:

// Returns GST_PAD_LINK_OK (0)
gst_pad_link(h264ParsePad, mpegtsmuxSinkPad);

// Returns GST_PAD_LINK_OK (0)
gst_pad_link(audioEncPad, mpegtsmuxSinkPad);

// Returns GST_PAD_LINK_OK (0)
gst_pad_link(mpegtsmuxSrcPad, hlssinkPad);

But running the app results in pipeline failure with

"Internal data stream error."

Removing the audioEncPad link makes the stream work again, but of course without audio. How should I go about this?


Solution

  • A few things needed to be done:

    1. Use aacparse

    2. Clean the solution

    3. Link voaacenc with aacparse

    #2 caused me a lot of torment since everything theoretically should've worked. D'oh.