Looking for some help porting this Objective-C class method to JS/NativeScript. Every variant I have tried throws a TypeError: undefined is not a function...
https://developer.apple.com/documentation/avfoundation/avvideocomposition/1389556-init
I tried writing it in JS as:
const videoComp = AVVideoComposition.alloc().initWithAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });
//OR
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });
//OR
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandlerApplier(asset, (request) => { ... });
//OR
const videoComp = new AVVideoComposition(asset, (request) => { ... });
to give a few examples. Specifically, I am trying to port this code to NativeScript/JS:
let blurRadius = 6.0
let asset = AVAsset(url: streamURL)
let item = AVPlayerItem(asset: asset)
item.videoComposition = AVVideoComposition(asset: asset) { request in
    let blurred = request.sourceImage.clampedToExtent().applyingGaussianBlur(sigma: blurRadius)
    let output = blurred.clamped(to: request.sourceImage.extent)
    request.finish(with: output, context: nil)
}
which can be found in this blog post: https://willowtreeapps.com/ideas/how-to-apply-a-filter-to-a-video-stream-in-ios
Posted on 2019-09-24 17:42:58
The JavaScript/TypeScript should look like this:
let blurRadius = 6.0;
let asset = AVAsset.assetWithURL(streamURL);
let item = AVPlayerItem.alloc().initWithAsset(asset);
item.videoComposition = AVVideoComposition.videoCompositionWithAssetApplyingCIFiltersWithHandler(asset, request => {
    let blurred = request.sourceImage.imageByClampingToExtent().imageByApplyingGaussianBlurWithSigma(blurRadius);
    let output = blurred.imageByClampingToRect(request.sourceImage.extent);
    request.finishWithImageContext(output, null);
});
The Swift initializer AVVideoComposition(asset:applyingCIFiltersWithHandler:) corresponds to the Objective-C class method videoCompositionWithAsset:applyingCIFiltersWithHandler:, and NativeScript exposes Objective-C selectors by concatenating their segments into a single camelCased name. That is why it is videoCompositionWithAssetApplyingCIFiltersWithHandler here rather than one of the alloc().init... variants you tried.
Note: this code is untested; it is simply a translation of the given native code. Use tns-platform-declarations for IntelliSense support.
https://stackoverflow.com/questions/58082795