I'm trying to take an AVCaptureSession and encode it to mp4. It seems like this should be straightforward: I'm encoding a single 960x540 video stream, and I'm not worried about audio for the purposes of this problem.
When I run the following code and grab out2.mp4 out of the documents container with Xcode, I get a black screen almost immediately, with a duration of 46 hours. At least the resolution looks right. Here is the output from ffmpeg -i out2.mp4:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'out2.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: mp41mp42isom
    creation_time   : 2015-11-18 01:25:55
  Duration: 46:43:04.21, start: 168178.671667, bitrate: 0 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt709/bt709), 960x540, 1860 kb/s, 27.65 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      creation_time   : 2015-11-18 01:25:55
      handler_name    : Core Media Video

Why can't I append sample buffers to my AVAssetWriterInput in this scenario?
var videoInput: AVAssetWriterInput?
var assetWriter: AVAssetWriter?

override func viewDidLoad() {
    super.viewDidLoad()
    self.startStream()
    NSTimer.scheduledTimerWithTimeInterval(5, target: self, selector: "swapSegment", userInfo: nil, repeats: false)
}

func swapSegment() {
    assetWriter?.finishWritingWithCompletionHandler() {
        print("File written")
    }
    videoInput = nil
}

func pathForOutput() -> String {
    let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    if let documentDirectory: NSURL = urls.first {
        let fileUrl = documentDirectory.URLByAppendingPathComponent("out1.mp4")
        return fileUrl.path!
    }
    return ""
}

func startStream() {
    assetWriter = try! AVAssetWriter(URL: NSURL(fileURLWithPath: self.pathForOutput()), fileType: AVFileTypeMPEG4)
    let videoSettings: [String: AnyObject] = [AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: 960, AVVideoHeightKey: 540]
    videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
    videoInput!.expectsMediaDataInRealTime = true
    assetWriter?.addInput(videoInput!)
    assetWriter!.startWriting()
    assetWriter!.startSessionAtSourceTime(kCMTimeZero)
    let videoHelper = VideoHelper()
    videoHelper.delegate = self
    videoHelper.startSession()
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    if let videoOutput = captureOutput as? AVCaptureVideoDataOutput {
        videoInput?.appendSampleBuffer(sampleBuffer)
    }
}

Posted on 2015-11-18 02:12:21
Perhaps your buffers' presentation timestamps are not relative to your sourceTime (kCMTimeZero). You could use the presentation timestamp of the first buffer as the source time instead.
P.S. 46 hours or so is probably your device's uptime.
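A minimal sketch of that suggestion, in the question's Swift 2 style: defer startSessionAtSourceTime until the first buffer arrives and use that buffer's presentation timestamp. The hasStartedSession flag is a hypothetical addition, not part of the original code; assetWriter and videoInput are the question's properties.

```swift
import AVFoundation

// Hypothetical flag (not in the original code) tracking whether the
// writer session has been started yet.
var hasStartedSession = false

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    // Capture buffers are timestamped on the host clock (roughly device
    // uptime), not from zero. Starting the session at kCMTimeZero makes
    // the file's timeline span from boot until now -- the start value of
    // 168178 s in the ffmpeg output is about 46.7 hours, matching the
    // reported duration. Starting at the first buffer's timestamp avoids
    // that gap.
    if !hasStartedSession {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        assetWriter?.startSessionAtSourceTime(pts)
        hasStartedSession = true
    }
    if videoInput?.readyForMoreMediaData == true {
        videoInput?.appendSampleBuffer(sampleBuffer)
    }
}
```

With this change, the startSessionAtSourceTime(kCMTimeZero) call would be removed from startStream(); startWriting() still happens up front.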
https://stackoverflow.com/questions/33769972