
ChromaKey video in ARKit

Stack Overflow user
Asked on 2018-04-21 21:13:52
2 answers · 2.1K views · 0 following · Score 3

I am trying to play a video in ARKit. What I am doing is very similar to what @Felix did here: GPUImageView inside SKScene as SKNode material - Playing transparent video on ARKit.

However, when the video is supposed to appear (in this case, when an AR reference image is detected), I get a [SceneKit] Error: Cannot get pixel buffer (CVPixelBufferRef) error and the video no longer plays. It did play before I implemented the ChromaKeyMaterial. Here is my code, starting right after the AR reference image is detected:

```swift
DispatchQueue.main.async {
    let filePath = Bundle.main.path(forResource: "wigz", ofType: "mp4")
    let videoURL = NSURL(fileURLWithPath: filePath!)
    let player = AVPlayer(url: videoURL as URL)

    let spriteKitScene = SKScene(size: CGSize(width: 640, height: 480))
    let videoSpriteKitNode = SKVideoNode(avPlayer: player)
    let videoNode = SCNNode()
    videoNode.geometry = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                                  height: imageAnchor.referenceImage.physicalSize.height)
    videoNode.eulerAngles = SCNVector3(-Float.pi/2, 0, 0)

    // Use a SpriteKit scene with the video node inside
    spriteKitScene.scaleMode = .aspectFit
    videoSpriteKitNode.position = CGPoint(x: spriteKitScene.size.width / 2,
                                          y: spriteKitScene.size.height / 2)
    videoSpriteKitNode.size = spriteKitScene.size
    videoSpriteKitNode.yScale = -1.0
    videoSpriteKitNode.play()

    // Loop video
    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: .main) { _ in
        player.seek(to: kCMTimeZero)
        player.play()
    }

    spriteKitScene.addChild(videoSpriteKitNode)

    videoNode.geometry?.firstMaterial?.diffuse.contents = spriteKitScene
    videoNode.geometry?.firstMaterial?.isDoubleSided = true
    let chromaKeyMaterial = ChromaKeyMaterial()
    chromaKeyMaterial.diffuse.contents = player
    videoNode.geometry!.materials = [chromaKeyMaterial]

    node.addChildNode(videoNode)

    self.imageDetectView.scene.rootNode.addChildNode(node)
}
```

In the ChromaKeyMaterial.swift file, I changed these lines to:

```glsl
float maskY = 0.0 * c_colorToReplace.r + 1.0 * c_colorToReplace.g + 0.0 * c_colorToReplace.b;
float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

float Y = 0.0 * textureColor.r + 1.0 * textureColor.g + 0.0 * textureColor.b;
float Cr = 0.7132 * (textureColor.r - Y);
float Cb = 0.5647 * (textureColor.b - Y);
```

in an attempt to chroma key out pure green, but I am not sure whether this is the correct approach.
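As a point of reference (not part of the original question): the coefficients above zero out the R and B contributions to luma, whereas the standard BT.601 weights are 0.2989/0.5866/0.1145. A quick sketch of how the mask values differ for pure green under the two sets of weights:

```swift
// Sketch: compare the key-color mask values for pure green (r: 0, g: 1, b: 0)
// under standard BT.601 luma weights vs. the zeroed-out weights above.
let (r, g, b): (Float, Float, Float) = (0.0, 1.0, 0.0)

// Standard BT.601 weights
let y601  = 0.2989 * r + 0.5866 * g + 0.1145 * b   // 0.5866
let cr601 = 0.7132 * (r - y601)                    // ≈ -0.4184
let cb601 = 0.5647 * (b - y601)                    // ≈ -0.3313

// Zeroed weights from the question
let yZeroed  = 0.0 * r + 1.0 * g + 0.0 * b         // 1.0
let crZeroed = 0.7132 * (r - yZeroed)              // ≈ -0.7132
let cbZeroed = 0.5647 * (b - yZeroed)              // ≈ -0.5647

print(y601, cr601, cb601)
print(yZeroed, crZeroed, cbZeroed)
```

Since the key color and the texture color go through the same transform, the distance in (Cr, Cb) still varies smoothly either way, but the zeroed weights no longer correspond to a standard YCrCb space, so threshold values tuned elsewhere will not carry over.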

Any help would be greatly appreciated!


2 Answers

Stack Overflow user

Accepted answer

Posted on 2018-04-22 13:11:35

Figured it out. I had my colors set incorrectly (and even flipped in the wrong places), and there seems to be a bug that prevents the video from playing unless you delay it slightly. That bug was supposedly fixed, but it appears it is not.

Here is my corrected and cleaned-up code, if anyone is interested (edited to include a tip from @mnuages):

```swift
// Get the video URL and create an AVPlayer
let filePath = Bundle.main.path(forResource: "VIDEO_FILE_NAME", ofType: "VIDEO_FILE_EXTENSION")
let videoURL = NSURL(fileURLWithPath: filePath!)
let player = AVPlayer(url: videoURL as URL)

// Create the SceneKit node that will show the video
let videoNode = SCNNode()

// Set the geometry of the SceneKit node to be a plane, and rotate it to be flat with the image
videoNode.geometry = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                              height: imageAnchor.referenceImage.physicalSize.height)
videoNode.eulerAngles = SCNVector3(-Float.pi/2, 0, 0)

// Set the video AVPlayer as the contents of the video node's material
videoNode.geometry?.firstMaterial?.diffuse.contents = player
videoNode.geometry?.firstMaterial?.isDoubleSided = true

// Alpha transparency stuff
let chromaKeyMaterial = ChromaKeyMaterial()
chromaKeyMaterial.diffuse.contents = player
videoNode.geometry!.materials = [chromaKeyMaterial]

// The video does not start without delaying the player.
// Playing it before just results in [SceneKit] Error: Cannot get pixel buffer (CVPixelBufferRef)
DispatchQueue.main.asyncAfter(deadline: .now() + 0.001) {
    player.seek(to: CMTimeMakeWithSeconds(1, 1000))
    player.play()
}
// Loop video
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: .main) { _ in
    player.seek(to: kCMTimeZero)
    player.play()
}

// Add videoNode to the ARAnchor node
node.addChildNode(videoNode)

// Add the ARAnchor node to the root node of the scene
self.imageDetectView.scene.rootNode.addChildNode(node)
```
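As an aside (not from the original answer), an alternative to a fixed delay is to wait until the AVPlayerItem reports that it is ready to play. A sketch using Swift key-value observation, assuming the `player` set up above:

```swift
// Sketch: start playback once the item is ready, instead of an arbitrary delay.
// `player` is assumed to be the AVPlayer created above.
var statusObservation: NSKeyValueObservation?
statusObservation = player.currentItem?.observe(\.status, options: [.initial, .new]) { item, _ in
    if item.status == .readyToPlay {
        player.play()
        statusObservation?.invalidate()
        statusObservation = nil
    }
}
```

Whether this avoids the pixel-buffer error in this particular SceneKit setup is untested; it simply replaces the magic 1 ms delay with an explicit readiness check.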

And here is the ChromaKeyMaterial:

```swift
import SceneKit

public class ChromaKeyMaterial: SCNMaterial {

    public var backgroundColor: UIColor {
        didSet { didSetBackgroundColor() }
    }

    public var thresholdSensitivity: Float {
        didSet { didSetThresholdSensitivity() }
    }

    public var smoothing: Float {
        didSet { didSetSmoothing() }
    }

    public init(backgroundColor: UIColor = .green, thresholdSensitivity: Float = 0.50, smoothing: Float = 0.001) {

        self.backgroundColor = backgroundColor
        self.thresholdSensitivity = thresholdSensitivity
        self.smoothing = smoothing

        super.init()

        didSetBackgroundColor()
        didSetThresholdSensitivity()
        didSetSmoothing()

        // chroma key shader is based on GPUImage
        // https://github.com/BradLarson/GPUImage/blob/master/framework/Source/GPUImageChromaKeyFilter.m

        let surfaceShader =
        """
        uniform vec3 c_colorToReplace;
        uniform float c_thresholdSensitivity;
        uniform float c_smoothing;

        #pragma transparent
        #pragma body

        vec3 textureColor = _surface.diffuse.rgb;

        float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
        float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
        float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

        float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
        float Cr = 0.7132 * (textureColor.r - Y);
        float Cb = 0.5647 * (textureColor.b - Y);

        float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

        _surface.transparent.a = blendValue;
        """

        shaderModifiers = [
            .surface: surfaceShader,
        ]
    }

    required public init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // setting background color to be keyed out
    private func didSetBackgroundColor() {
        // getting pixel from background color
        //let rgb = backgroundColor.cgColor.components!.map{Float($0)}
        //let vector = SCNVector3(x: rgb[0], y: rgb[1], z: rgb[2])
        let vector = SCNVector3(x: 0.0, y: 1.0, z: 0.0)
        setValue(vector, forKey: "c_colorToReplace")
    }

    private func didSetSmoothing() {
        setValue(smoothing, forKey: "c_smoothing")
    }

    private func didSetThresholdSensitivity() {
        setValue(thresholdSensitivity, forKey: "c_thresholdSensitivity")
    }
}
```
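For illustration (not part of the original answer), the class exposes per-video configuration; a hypothetical use with a different key color might look like this:

```swift
// Sketch: configuring the material for a magenta background instead of green.
// `player` and `videoNode` are assumed to be the AVPlayer and SCNNode from above.
let material = ChromaKeyMaterial(backgroundColor: .magenta,
                                 thresholdSensitivity: 0.45,
                                 smoothing: 0.002)
material.diffuse.contents = player
videoNode.geometry?.materials = [material]
```

Note that as posted, `didSetBackgroundColor()` hardcodes pure green and ignores the `backgroundColor` property; the commented-out lines extracting RGB from the color would need to be restored for a custom key color to actually take effect.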
Score: 5

Stack Overflow user

Posted on 2021-11-25 11:46:01

Using RealityKit 2 - iOS 14

I believe that with RealityKit you would need to use Metal to create a chroma key shader.

I do not know Metal well enough yet to say how to create one, but I found another way to play chroma-keyed video in AR with RealityKit.

Since iOS 14, a video material can be used as the texture of a ModelEntity.

For chroma keying, a few extra steps are needed:

  • First, we process the video asset to remove the chroma key color.
  • Then we load this asset into a player and use the new video material property of ModelEntity (iOS 14).

We start by importing this excellent package by YuAo:

https://github.com/MetalPetal/MetalPetal/issues/289

Don't forget to import the package: import MetalPetal

Here is the code:

```swift
// In the view model: process the asset and create the player
let context = try! MTIContext(device: MTLCreateSystemDefaultDevice()!)
let chromaKeyBlendFilter = MTIChromaKeyBlendFilter()
let color = MTIColor(red: 0.998, green: 0.0, blue: 0.996, alpha: 1)
//let backgroundColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
let backgroundColor = MTIColor(red: 0.0, green: 0.0, blue: 0.0, alpha: 0)
chromaKeyBlendFilter.color = color
chromaKeyBlendFilter.smoothing = 0.001
chromaKeyBlendFilter.thresholdSensitivity = 0.4 //0.475
chromaKeyBlendFilter.inputBackgroundImage = MTIImage(color: backgroundColor, sRGB: false, size: videoSize)
let composition = MTIVideoComposition(asset: asset, context: context, queue: DispatchQueue.main, filter: { request in
    guard let sourceImage = request.anySourceImage else {
        return MTIImage(color: backgroundColor, sRGB: false, size: videoSize)
    }
    return FilterGraph.makeImage(builder: { output in
        sourceImage => chromaKeyBlendFilter.inputPorts.inputImage
        chromaKeyBlendFilter => output
    })!
})

videoPlayerItem = AVPlayerItem(asset: asset)
videoPlayerItem.videoComposition = composition.makeAVVideoComposition()

let player = AVPlayer(playerItem: videoPlayerItem)
player.volume = 0.5
// player.play()
```

We can use video textures in RealityKit 2.0 (Xcode 12 and iOS 14). See this answer by Andy Jazz about how to set it up.
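A minimal sketch of that last step, assuming a ModelEntity named `modelEntity` (a hypothetical name) and the chroma-keyed `player` created above:

```swift
import RealityKit

// Sketch: attach the chroma-keyed AVPlayer as a video material (iOS 14+).
// The filtered composition renders the keyed color as transparent black,
// so the entity shows only the foreground of the video.
let videoMaterial = VideoMaterial(avPlayer: player)
modelEntity.model?.materials = [videoMaterial]
player.play()
```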

Score: 0
Original page content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/49960262
