
Is the audioSettings property missing from AVCaptureAudioDataOutput in the Swift header?

Stack Overflow user
Asked on 2016-12-14 21:25:42
1 answer · 2K views · 0 following · 5 votes

I am working on an iOS app where I want to record segmented video. I have read Introduction.html, and I have a working solution using AVCaptureVideoDataOutput, where I capture the frames and write them to a file with an AVAssetWriter. I add the AVCaptureVideoDataOutput to the AVCaptureSession like this:

// Setup videoDataOutput in order to capture samplebuffers
let videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable : Int(kCVPixelFormatType_32BGRA)]
videoDataOutput.alwaysDiscardsLateVideoFrames = true
videoDataOutput.setSampleBufferDelegate(self, queue: CaptureManager.CAPTURE_QUEUE)

guard captureSession.canAddOutput(videoDataOutput) else {
    return
}

captureSession.addOutput(videoDataOutput)
self.videoDataOutput = videoDataOutput

This works fine; I can run the capture session successfully and get a playable movie file.

Now I want to hook up audio as well, so I tried to do the same thing:

// Setup audioDataOutput in order to capture audio
let audioDataOutput = AVCaptureAudioDataOutput()
audioDataOutput.audioSettings = ...
audioDataOutput.setSampleBufferDelegate(self, queue: CaptureManager.CAPTURE_QUEUE)

guard captureSession.canAddOutput(audioDataOutput) else {
    return
}

captureSession.addOutput(audioDataOutput)
self.audioDataOutput = audioDataOutput

The crazy thing is that there is no audioSettings property on AVCaptureAudioDataOutput! The documentation says there should be one: https://developer.apple.com/reference/avfoundation/avcaptureaudiodataoutput/1388527-audiosettings, but the Swift header has no such member (see below).

What is going on here? I am using Xcode 8.1. The Swift header for the AVCaptureAudioDataOutput class looks like this:

import AVFoundation
import CoreMedia
import Foundation


/*!
 @class AVCaptureAudioDataOutput
 @abstract
 AVCaptureAudioDataOutput is a concrete subclass of AVCaptureOutput that can be used to process uncompressed or compressed samples from the audio being captured.

 @discussion
 Instances of AVCaptureAudioDataOutput produce audio sample buffers suitable for processing using other media APIs. Applications can access the sample buffers with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
 */
@available(iOS 4.0, *)
open class AVCaptureAudioDataOutput : AVCaptureOutput {


    /*!
     @method setSampleBufferDelegate:queue:
     @abstract
     Sets the receiver's delegate that will accept captured buffers and dispatch queue on which the delegate will be called.

     @param sampleBufferDelegate
     An object conforming to the AVCaptureAudioDataOutputSampleBufferDelegate protocol that will receive sample buffers after they are captured.
     @param sampleBufferCallbackQueue
     A dispatch queue on which all sample buffer delegate methods will be called.

     @discussion
     When a new audio sample buffer is captured it will be vended to the sample buffer delegate using the captureOutput:didOutputSampleBuffer:fromConnection: delegate method. All delegate methods will be called on the specified dispatch queue. If the queue is blocked when new samples are captured, those samples will be automatically dropped when they become sufficiently late. This allows clients to process existing samples on the same queue without having to manage the potential memory usage increases that would otherwise occur when that processing is unable to keep up with the rate of incoming samples.

     Clients that need to minimize the chances of samples being dropped should specify a queue on which a sufficiently small amount of processing is being done outside of receiving sample buffers. However, if such clients migrate extra processing to another queue, they are responsible for ensuring that memory usage does not grow without bound from samples that have not been processed.

     A serial dispatch queue must be used to guarantee that audio samples will be delivered in order. The sampleBufferCallbackQueue parameter may not be NULL, except when setting sampleBufferDelegate to nil.
     */
    open func setSampleBufferDelegate(_ sampleBufferDelegate: AVCaptureAudioDataOutputSampleBufferDelegate!, queue sampleBufferCallbackQueue: DispatchQueue!)


    /*!
     @property sampleBufferDelegate
     @abstract
     The receiver's delegate.

     @discussion
     The value of this property is an object conforming to the AVCaptureAudioDataOutputSampleBufferDelegate protocol that will receive sample buffers after they are captured. The delegate is set using the setSampleBufferDelegate:queue: method.
     */
    open var sampleBufferDelegate: AVCaptureAudioDataOutputSampleBufferDelegate! { get }


    /*!
     @property sampleBufferCallbackQueue
     @abstract
     The dispatch queue on which all sample buffer delegate methods will be called.

     @discussion
     The value of this property is a dispatch_queue_t. The queue is set using the setSampleBufferDelegate:queue: method.
     */
    open var sampleBufferCallbackQueue: DispatchQueue! { get }


    /*!
     @property audioSettings
     @abstract
     Specifies the settings used to decode or re-encode audio before it is output by the receiver.

     @discussion
     The value of this property is an NSDictionary containing values for audio settings keys defined  in AVAudioSettings.h. When audioSettings is set to nil, the AVCaptureAudioDataOutput vends samples in their device native format.
     */

    // (TARGET_OS_MAC && !(TARGET_OS_EMBEDDED || TARGET_OS_IPHONE))

    /*!
     @method recommendedAudioSettingsForAssetWriterWithOutputFileType:
     @abstract
     Specifies the recommended settings for use with an AVAssetWriterInput.

     @param outputFileType
     Specifies the UTI of the file type to be written (see AVMediaFormat.h for a list of file format UTIs).

     @return
     A fully populated dictionary of keys and values that are compatible with AVAssetWriter.

     @discussion
     The value of this property is an NSDictionary containing values for compression settings keys defined in AVAudioSettings.h. This dictionary is suitable for use as the "outputSettings" parameter when creating an AVAssetWriterInput, such as,

     [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:outputSettings sourceFormatHint:hint];

     The dictionary returned contains all necessary keys and values needed by AVAssetWriter (see AVAssetWriterInput.h, -initWithMediaType:outputSettings: for a more in depth discussion). For QuickTime movie and ISO files, the recommended audio settings will always produce output comparable to that of AVCaptureMovieFileOutput.

     Note that the dictionary of settings is dependent on the current configuration of the receiver's AVCaptureSession and its inputs. The settings dictionary may change if the session's configuration changes. As such, you should configure your session first, then query the recommended audio settings.
     */
    @available(iOS 7.0, *)
    open func recommendedAudioSettingsForAssetWriter(withOutputFileType outputFileType: String!) -> [AnyHashable : Any]!
}


/*!
 @protocol AVCaptureAudioDataOutputSampleBufferDelegate
 @abstract
 Defines an interface for delegates of AVCaptureAudioDataOutput to receive captured audio sample buffers.
 */
public protocol AVCaptureAudioDataOutputSampleBufferDelegate : NSObjectProtocol {


    /*!
     @method captureOutput:didOutputSampleBuffer:fromConnection:
     @abstract
     Called whenever an AVCaptureAudioDataOutput instance outputs a new audio sample buffer.

     @param captureOutput
     The AVCaptureAudioDataOutput instance that output the samples.
     @param sampleBuffer
     A CMSampleBuffer object containing the audio samples and additional information about them, such as their format and presentation time.
     @param connection
     The AVCaptureConnection from which the audio was received.

     @discussion
     Delegates receive this message whenever the output captures and outputs new audio samples, decoding or re-encoding as specified by the audioSettings property. Delegates can use the provided sample buffer in conjunction with other APIs for further processing. This method will be called on the dispatch queue specified by the output's sampleBufferCallbackQueue property. This method is called periodically, so it must be efficient to prevent capture performance problems, including dropped audio samples.

     Clients that need to reference the CMSampleBuffer object outside of the scope of this method must CFRetain it and then CFRelease it when they are finished with it.
     */
    @available(iOS 4.0, *)
    optional public func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)
}

1 Answer

Stack Overflow user

Accepted answer

Answered on 2016-12-15 00:01:06

AVCaptureAudioDataOutput.audioSettings is only available on macOS. You may be able to change the sample rate with AVAudioSession, but otherwise you will have to arrange any conversion you want to do yourself.
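As a rough sketch of the AVAudioSession route mentioned above (this only *requests* a hardware rate; the system may pick a different one, so the actual rate should be read back after activation):

```swift
import AVFoundation

// Sketch: requesting a preferred hardware sample rate via AVAudioSession
// (Swift 3 / iOS-era API, matching the Xcode 8.1 context of the question).
func configurePreferredSampleRate() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
        // A request, not a guarantee; 44.1 kHz here is an arbitrary example.
        try session.setPreferredSampleRate(44_100)
        try session.setActive(true)
        // Check what the hardware actually granted.
        print("Actual hardware sample rate: \(session.sampleRate)")
    } catch {
        print("Audio session configuration failed: \(error)")
    }
}
```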

There are many ways to do that, but the outputSettings parameter of AVAssetWriterInput.init(mediaType:outputSettings:) looks like a good place to start.
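Putting that together with the recommendedAudioSettingsForAssetWriter(withOutputFileType:) method from the header above, a minimal sketch (Swift 3; the function name is illustrative, and it assumes the capture session has already been fully configured, since the recommended settings depend on the session's current inputs):

```swift
import AVFoundation

// Sketch: instead of setting audioSettings on the output (iOS has no such
// property), ask the output for recommended compression settings and pass
// them to an AVAssetWriterInput.
func makeAudioWriterInput(for audioDataOutput: AVCaptureAudioDataOutput) -> AVAssetWriterInput {
    // Query only after the session is configured; the returned dictionary
    // may change if the session's configuration changes.
    let settings = audioDataOutput.recommendedAudioSettingsForAssetWriter(
        withOutputFileType: AVFileTypeQuickTimeMovie) as? [String: Any]

    let input = AVAssetWriterInput(mediaType: AVMediaTypeAudio,
                                   outputSettings: settings)
    input.expectsMediaDataInRealTime = true
    return input
}
```

The writer input would then receive the CMSampleBuffers delivered to the captureOutput(_:didOutputSampleBuffer:from:) delegate method via append(_:).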

Votes: 5
Original content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/41152448
