Tags: ios, swift, scenekit, arkit, realitykit

SceneKit support for HEVC with alpha channel


I am trying to use an HEVC video file with an alpha channel in SceneKit. I have tried two different video files: one from Apple (the puppet video included in the assets of this Apple demo) and one from another resource (video 2).

Sample code:

    guard let videoURL = Bundle.main.url(forResource: "puppets_with_alpha_hevc", withExtension: "mov") else {
        print("Failed to load movie URL")
        return
    }
    let videoPlayer = AVPlayer(url: videoURL)
    let videoNode = SKVideoNode(avPlayer: videoPlayer)
    let spriteKitScene = SKScene(size: CGSize(width: 360.0 / 2.0, height: 480.0 / 2.0))
    spriteKitScene.scaleMode = .aspectFit
    videoNode.position = CGPoint(x: spriteKitScene.size.width / 2.0, y: spriteKitScene.size.height / 2.0)
    videoNode.size = spriteKitScene.size
    spriteKitScene.addChild(videoNode)
    let material = SCNMaterial()
    material.diffuse.contents = spriteKitScene
    material.transparent.contents = spriteKitScene
    guard let plane = container.childNode(withName: "video", recursively: true) else { return }
    plane.geometry?.materials = [material]
    plane.scale = SCNVector3(x: Float(referenceImage.physicalSize.width), y: Float(referenceImage.physicalSize.height), z: 1.0)
    videoNode.play()

The video is played against a reference image in AR.

I get this exception when trying to play the puppet video in SceneKit:

    validateFunctionArguments:3838: failed assertion `Fragment Function(FastSingle_FragFunc): missing sampler binding at index 0 for u_texture_sampler[0].'

For the second video, the app doesn't crash, but only a black opaque background with audio is visible.

Using Xcode 14.2 and iOS 14.8.


Solution

  • High Efficiency Video Coding with Alpha

    SceneKit version

    There's no need to route the RGBA video through an intermediate SpriteKit scene; you can use the AVPlayer directly as the contents of a SceneKit material. (A sketch adapting this to the question's AR setup follows the code below.)

    (Tested on iOS 14.7 and iOS 17.0)

    import UIKit
    import AVFoundation
    import SceneKit
    
    class GameViewController: UIViewController {
        
        var player: AVPlayer? = nil
    
        override func viewDidLoad() {
            super.viewDidLoad()
            
            let sceneView = self.view as! SCNView
            sceneView.scene = SCNScene()
            sceneView.backgroundColor = .black
            sceneView.isPlaying = true
            
            // HEVC (H.265) with a premultiplied Alpha channel
            if let url = Bundle.main.url(forResource: "puppets",
                                         withExtension: "mov") {
                player = AVPlayer(url: url)
                player?.play()
            }
            let model = SCNNode(geometry: SCNPlane(width: 0.3, height: 0.45))
            model.geometry?.materials[0].diffuse.contents = player
            sceneView.scene?.rootNode.addChildNode(model)
        }
    }
    

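    Applied to the question's AR setup, the same idea means assigning the AVPlayer directly to the plane's material instead of going through a SpriteKit scene. Here is a minimal sketch of that; the container node, its child named "video", and referenceImage are taken from the question's code, while the helper function itself is hypothetical.

    import ARKit
    import AVFoundation
    import SceneKit

    // Hypothetical helper; `container` and its "video" child node come from the question.
    func attachAlphaVideo(to container: SCNNode, referenceImage: ARReferenceImage) {
        guard let videoURL = Bundle.main.url(forResource: "puppets_with_alpha_hevc",
                                             withExtension: "mov"),
              let plane = container.childNode(withName: "video", recursively: true) else { return }

        let player = AVPlayer(url: videoURL)

        // As in the answer above, the player itself is the material's contents
        // (no SpriteKit scene in between).
        let material = SCNMaterial()
        material.diffuse.contents = player
        material.isDoubleSided = true    // optional: show the video from both sides

        plane.geometry?.materials = [material]
        plane.scale = SCNVector3(x: Float(referenceImage.physicalSize.width),
                                 y: Float(referenceImage.physicalSize.height),
                                 z: 1.0)
        player.play()
    }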

    RealityKit version

    The implementation of a video material in RealityKit under visionOS looks similar. In visionOS you can also use VideoPlayerComponent (a sketch of that follows the code below).

    import SwiftUI
    import AVKit
    import RealityKit
    
    struct ContentView: View {
        
        @State var player: AVPlayer?
        
        var body: some View {
            RealityView { content in
                let url = Bundle.main.url(forResource: "puppets",
                                          withExtension: "mov")!

                player = AVPlayer(url: url)
                let material = VideoMaterial(avPlayer: player!)

                let mesh = MeshResource.generatePlane(width: 0.32,
                                                      height: 0.45)
                let model = ModelEntity(mesh: mesh,
                                        materials: [material])
                content.add(model)
                player?.play()
                player?.volume = 0.2
            }
        }
    }
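
    As mentioned above, on visionOS you can also use VideoPlayerComponent instead of VideoMaterial. A minimal sketch, reusing the "puppets" resource name from the example above and an arbitrary scale factor (both assumptions):

    import SwiftUI
    import AVKit
    import RealityKit

    struct VideoComponentView: View {

        var body: some View {
            RealityView { content in
                let url = Bundle.main.url(forResource: "puppets",
                                          withExtension: "mov")!
                let player = AVPlayer(url: url)

                // VideoPlayerComponent draws the player's output on a mesh
                // that RealityKit generates for the video.
                let entity = Entity()
                entity.components.set(VideoPlayerComponent(avPlayer: player))
                entity.scale *= 0.4    // arbitrary scale for this sketch

                content.add(entity)
                player.play()
            }
        }
    }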