I'm trying to perform a deep copy of a CMSampleBufferRef for audio and video connections. I need to hold onto this buffer for delayed processing. Can anyone help by pointing me to sample code?
Thanks
Posted on 2016-10-15 11:13:46
I solved this problem.
I needed to access the sample data for a long period of time.
I tried several approaches:
CVPixelBufferRetain -> broke the capture (samples stopped arriving)
CVPixelBufferPool -> broke the capture as well
CVPixelBufferCreateWithBytes -> works, but it degrades performance; Apple discourages this approach
CMSampleBufferCreateCopy -> works, and is what Apple recommends
Quoting the documentation: "To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped."
Reference: https://developer.apple.com/reference/avfoundation/avcapturefileoutputdelegate/1390096-captureoutput
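For completeness, here is what the CVPixelBufferCreateWithBytes route mentioned above might look like (the one that works but copies pixel data, which is why Apple discourages it for performance). This is a sketch of my own, not code from the answer: the helper name deepCopyPixelBuffer and its release callback are hypothetical, and it only handles non-planar pixel formats such as BGRA.

```objc
#import <CoreVideo/CoreVideo.h>
#import <stdlib.h>
#import <string.h>

// Frees the malloc'd pixel data when the copied buffer is released.
static void releaseBytesCallback(void *releaseRefCon, const void *baseAddress) {
    free((void *)baseAddress);
}

// Hypothetical helper: deep-copies a non-planar CVPixelBuffer by copying its
// bytes into freshly allocated memory. Planar formats (e.g. 420v/420f) would
// need CVPixelBufferCreateWithPlanarBytes instead.
static CVPixelBufferRef deepCopyPixelBuffer(CVPixelBufferRef src) {
    CVPixelBufferLockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(src);
    size_t height     = CVPixelBufferGetHeight(src);
    void *data = malloc(bytesPerRow * height);
    memcpy(data, CVPixelBufferGetBaseAddress(src), bytesPerRow * height);
    CVPixelBufferUnlockBaseAddress(src, kCVPixelBufferLock_ReadOnly);

    CVPixelBufferRef copy = NULL;
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                 CVPixelBufferGetWidth(src),
                                 height,
                                 CVPixelBufferGetPixelFormatType(src),
                                 data,
                                 bytesPerRow,
                                 releaseBytesCallback,
                                 NULL,   // releaseRefCon
                                 NULL,   // pixelBufferAttributes
                                 &copy);
    return copy; // caller releases with CVPixelBufferRelease
}
```

Because the copy no longer references the capture pool, it can be held indefinitely; the cost is a full memcpy per frame, which is exactly why CMSampleBufferCreateCopy is preferable when it suffices.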
This may be what you need:
#pragma mark - captureOutput

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (connection == m_videoConnection) {
        /* If m_sampleBuffer was never consumed, it must be CFReleased here;
           holding on to the previous copy keeps the capture pool occupied
           and causes samples to be dropped. */
        if (m_sampleBuffer) {
            CFRelease(m_sampleBuffer);
            m_sampleBuffer = NULL;
        }
        OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &m_sampleBuffer);
        if (noErr != status) {
            m_sampleBuffer = NULL;
        }
        NSLog(@"m_sampleBuffer = %p sampleBuffer = %p", m_sampleBuffer, sampleBuffer);
    }
}

#pragma mark - get a CVPixelBufferRef for long-term use
- (ACResult)readVideoFrame:(CVPixelBufferRef *)pixelBuffer {
    while (1) {
        dispatch_sync(m_readVideoData, ^{
            if (!m_sampleBuffer) {
                _readDataSuccess = NO;
                return;
            }
            CMSampleBufferRef sampleBufferCopy = NULL;
            OSStatus status = CMSampleBufferCreateCopy(kCFAllocatorDefault, m_sampleBuffer, &sampleBufferCopy);
            if (noErr == status) {
                CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBufferCopy);
                /* Retain the pixel buffer so it outlives sampleBufferCopy;
                   the caller balances this with CVPixelBufferRelease. */
                CVPixelBufferRetain(buffer);
                *pixelBuffer = buffer;
                CFRelease(sampleBufferCopy);
                _readDataSuccess = YES;
                NSLog(@"m_sampleBuffer = %p", m_sampleBuffer);
            } else {
                _readDataSuccess = NO;
            }
            CFRelease(m_sampleBuffer);
            m_sampleBuffer = NULL;
        });
        if (_readDataSuccess) {
            _readDataSuccess = NO;
            return ACResultNoErr;
        } else {
            usleep(15 * 1000);
            continue;
        }
    }
}

Then you can use it like this:
- (void)getCaptureVideoDataToEncode {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        while (1) {
            CVPixelBufferRef buffer = NULL;
            ACResult result = [videoCapture readVideoFrame:&buffer];
            if (ACResultNoErr == result) {
                ACResult error = [videoEncode encoder:buffer outputPacket:&streamPacket];
                if (buffer) {
                    CVPixelBufferRelease(buffer);
                    buffer = NULL;
                }
                if (ACResultNoErr == error) {
                    NSLog(@"encode success");
                }
            }
        }
    });
}

Posted on 2016-10-14 11:21:47
I did this. CMSampleBufferCreateCopy does perform a deep copy, but a new problem appeared: the captureOutput delegate stopped being called.
https://stackoverflow.com/questions/34972377