Tags: ios, streaming, rtmp, videocore

How to stream (RTMP) video from iOS Device (not from its own camera)


I want to stream external video from my iOS device. The video is being received from a live stream: an RTSP server or an HLS URL (not from the iPhone camera).

Currently I can stream my iPhone's camera video using VideoCore (which internally uses CameraSource and MicSource), but now the video I want to stream comes from a URL, similar to how Periscope streams video from a GoPro camera.

Problem 1: I don't know how to extract the audio and video from an RTSP URL.

Problem 2: I don't know how to create a CameraSource or MicSource from this extracted video and audio.

Do you know where to find an example, or could you help me with this technical challenge?


Solution

  • I found a first approach for the first problem:

      AVPlayerItem *item = [AVPlayerItem playerItemWithURL:URL];
      AVAsset *asset = [item asset];

      [asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
          if ([asset statusOfValueForKey:@"tracks" error:nil] == AVKeyValueStatusLoaded) {
              NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
              NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

              // VIDEO
              // videoOutput is an AVPlayerItemVideoOutput * property
              [item addOutput:self.videoOutput];

              // AUDIO
              AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[audioTracks objectAtIndex:0]];
              MTAudioProcessingTapCallbacks callbacks;
              callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
              callbacks.clientInfo = (__bridge void *)self;
              callbacks.init = tap_InitCallback;
              callbacks.finalize = tap_FinalizeCallback;
              callbacks.prepare = tap_PrepareCallback;
              callbacks.unprepare = tap_UnprepareCallback;
              callbacks.process = tap_ProcessCallback;

              MTAudioProcessingTapRef tap;
              OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                                        kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
              if (err == noErr) {
                  inputParams.audioTapProcessor = tap;
                  AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
                  audioMix.inputParameters = @[inputParams];
                  item.audioMix = audioMix;
              }
          }
      }];
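
    The tap callbacks referenced above have fixed signatures defined by the MediaToolbox framework. A minimal skeleton might look like this; the actual hand-off into your streaming pipeline (marked with placeholder comments) is up to you:

        #import <MediaToolbox/MediaToolbox.h>

        void tap_InitCallback(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
            // Pass the clientInfo pointer through so the other callbacks can reach it.
            *tapStorageOut = clientInfo;
        }

        void tap_FinalizeCallback(MTAudioProcessingTapRef tap) {}

        void tap_PrepareCallback(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                                 const AudioStreamBasicDescription *processingFormat) {
            // Record processingFormat (sample rate, channels) here so the
            // streamer knows how to interpret the buffers it will receive.
        }

        void tap_UnprepareCallback(MTAudioProcessingTapRef tap) {}

        void tap_ProcessCallback(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                                 MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                                 CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
            // Pull the decoded PCM out of the player item...
            OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                              flagsOut, NULL, numberFramesOut);
            if (err == noErr) {
                // ...and push bufferListInOut into your audio source here.
            }
        }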
    

    Then create a CADisplayLink that fires at every vsync and calls your displayLinkCallback: method.

     self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkCallback:)];
     [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
     [self.displayLink setPaused:YES];
    

    In that method, get the pixel buffer from the video output and send it on. For audio, do the analogous work in the tap's prepare callback using an AURenderCallbackStruct.
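
    A sketch of that display-link callback, assuming self.videoOutput is the AVPlayerItemVideoOutput attached earlier and that pushVideoBuffer: is a hypothetical method of yours that forwards the frame to the streamer:

        - (void)displayLinkCallback:(CADisplayLink *)sender {
            // Translate the display link's host time into the item's timeline.
            CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
            if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
                CVPixelBufferRef pixelBuffer =
                    [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
                if (pixelBuffer != NULL) {
                    [self pushVideoBuffer:pixelBuffer]; // hypothetical hand-off to the encoder
                    CVPixelBufferRelease(pixelBuffer);  // copyPixelBuffer... returns a +1 reference
                }
            }
        }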