
Drawing a waveform with AVAssetReader

Stack Overflow user
Asked on 2011-02-18 02:06:43
3 answers · 44.8K views · 85 votes

I'm reading a song from the iPod library using an asset URL (named audioUrl in the code). I can play it in several ways, I can cut it, I can do some processing with it, but... I really don't understand what I'm supposed to do with this CMSampleBufferRef to get the data for drawing a waveform! I need information about the peak values; how can I get it this way (or maybe another)?

    NSError *error = nil;
    // audioUrl here is the AVURLAsset loaded from the iPod library
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:audioUrl error:&error];

    AVAssetTrack *songTrack = [audioUrl.tracks objectAtIndex:0];
    AVAssetReaderTrackOutput *output = [[AVAssetReaderTrackOutput alloc] initWithTrack:songTrack outputSettings:nil];
    [reader addOutput:output];
    [output release];

    NSMutableData *fullSongData = [[NSMutableData alloc] init];
    [reader startReading];

    while (reader.status == AVAssetReaderStatusReading) {

        AVAssetReaderTrackOutput *trackOutput =
            (AVAssetReaderTrackOutput *)[reader.outputs objectAtIndex:0];

        CMSampleBufferRef sampleBufferRef = [trackOutput copyNextSampleBuffer];

        if (sampleBufferRef) { /* what am I supposed to do with this? */ }
    }

Please help me!
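One detail worth flagging in the snippet above: with `outputSettings:nil`, the track output vends samples in the track's stored (often compressed) format. To get raw samples that can actually be iterated for a waveform, a linear-PCM output settings dictionary is normally passed instead. A sketch of commonly used values (these particular settings are illustrative, not from the question):

```swift
import AVFoundation

// Linear PCM output settings for AVAssetReaderTrackOutput, so the reader
// decodes to raw interleaved 16-bit integer samples instead of handing
// back compressed packets.
let outputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMIsNonInterleaved: false
]
```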

3 Answers

Stack Overflow user

Answered on 2011-05-23 04:03:28

You should be able to get an audio buffer from sampleBufferRef and then iterate over its values to build the waveform:

CMItemCount numSamplesInBuffer = CMSampleBufferGetNumSamples(sampleBufferRef);

AudioBufferList audioBufferList;
CMBlockBufferRef buffer = NULL; // filled in (retained) by the call below
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBufferRef,
                                                        NULL,
                                                        &audioBufferList,
                                                        sizeof(audioBufferList),
                                                        NULL,
                                                        NULL,
                                                        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                                                        &buffer);

// this copies your audio out to a temp buffer, but you should be able to
// iterate through this buffer instead
SInt32 *readBuffer = (SInt32 *)malloc(numSamplesInBuffer * sizeof(SInt32));
memcpy(readBuffer, audioBufferList.mBuffers[0].mData, numSamplesInBuffer * sizeof(SInt32));

// ... iterate over readBuffer here ...

free(readBuffer);
CFRelease(buffer); // balance the retain from ...WithRetainedBlockBuffer
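Following on from "iterate through this buffer": the usual way to turn raw samples into waveform data is to take the peak (maximum absolute value) per bin, one bin per drawn column. A minimal, self-contained sketch of that step; the helper name is mine, and it uses Float samples for clarity, though the same logic applies to an integer buffer like the one above:

```swift
import Foundation

/// Downsamples raw audio samples to one peak (max |value|) per bin,
/// which is the typical input for drawing a waveform column by column.
func downsampleToPeaks(_ samples: [Float], bins: Int) -> [Float] {
    guard bins > 0, !samples.isEmpty else { return [] }
    let binSize = max(1, samples.count / bins)
    var peaks: [Float] = []
    var index = 0
    while index < samples.count && peaks.count < bins {
        let end = min(index + binSize, samples.count)
        // Peak = largest absolute amplitude inside this bin.
        let peak = samples[index..<end].map { abs($0) }.max() ?? 0
        peaks.append(peak)
        index = end
    }
    return peaks
}
```

Each returned value can then be scaled to the view height and drawn as a vertical line around the centerline.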
Score: 5

Stack Overflow user

Answered on 2019-03-27 22:04:33

Another approach, using Swift 5 and AVAudioFile:

/// Gets the audio file from a URL, downsamples it and draws it into the sound layer.
func drawSoundWave(fromURL url:URL, fromPosition:Int64, totalSeconds:UInt32, samplesSecond:CGFloat) throws{

    print("\(logClassName) Drawing sound from \(url)")

    do{
        waveViewInfo.samplesSeconds = samplesSecond

        //Get audio file and format from URL
        let audioFile = try AVAudioFile(forReading: url)

        waveViewInfo.format = audioFile.processingFormat
        audioFile.framePosition = fromPosition * Int64(waveViewInfo.format.sampleRate)

        //Getting the buffer
        let frameCapacity:UInt32 = totalSeconds * UInt32(waveViewInfo.format.sampleRate)

        guard let audioPCMBuffer = AVAudioPCMBuffer(pcmFormat: waveViewInfo.format, frameCapacity: frameCapacity) else{ throw AppError("Unable to get the AVAudioPCMBuffer") }
        try audioFile.read(into: audioPCMBuffer, frameCount: frameCapacity)
        let audioPCMBufferFloatValues:[Float] = Array(UnsafeBufferPointer(start: audioPCMBuffer.floatChannelData?.pointee,
                                                                          count: Int(audioPCMBuffer.frameLength)))

        waveViewInfo.points = []
        waveViewInfo.maxValue = 0
        for index in stride(from: 0, to: audioPCMBufferFloatValues.count, by: Int(audioFile.fileFormat.sampleRate) / Int(waveViewInfo.samplesSeconds)){

            let aSample = CGFloat(audioPCMBufferFloatValues[index])
            waveViewInfo.points.append(aSample)
            let fix = abs(aSample)
            if fix > waveViewInfo.maxValue{
                waveViewInfo.maxValue = fix
            }

        }

        print("\(logClassName) Finished the points - Count = \(waveViewInfo.points.count) / Max = \(waveViewInfo.maxValue)")

        populateSoundImageView(with: waveViewInfo)

    }
    catch{

        throw error

    }

}

/// Converts the sound wave into a UIImage
func populateSoundImageView(with waveViewInfo:WaveViewInfo){

    let imageSize:CGSize = CGSize(width: CGFloat(waveViewInfo.points.count),//CGFloat(waveViewInfo.points.count) * waveViewInfo.sampleSpace,
                                  height: frame.height)
    let drawingRect = CGRect(origin: .zero, size: imageSize)

    UIGraphicsBeginImageContextWithOptions(imageSize, false, 0)
    defer {
        UIGraphicsEndImageContext()
    }
    print("\(logClassName) Converting sound view in rect \(drawingRect)")

    guard let context:CGContext = UIGraphicsGetCurrentContext() else{ return }

    context.setFillColor(waveViewInfo.backgroundColor.cgColor)
    context.setAlpha(1.0)
    context.fill(drawingRect)
    context.setLineWidth(1.0)
    //        context.setLineWidth(waveViewInfo.lineWidth)

    let sampleAdjustFactor = imageSize.height / waveViewInfo.maxValue
    for pointIndex in waveViewInfo.points.indices{

        let pixel = waveViewInfo.points[pointIndex] * sampleAdjustFactor

        context.move(to: CGPoint(x: CGFloat(pointIndex), y: middleY - pixel))
        context.addLine(to: CGPoint(x: CGFloat(pointIndex), y: middleY + pixel))

        context.setStrokeColor(waveViewInfo.strokeColor.cgColor)
        context.strokePath()

    }

     //        for pointIndex in waveViewInfo.points.indices{
    //
    //            let pixel = waveViewInfo.points[pointIndex] * sampleAdjustFactor
    //
    //            context.move(to: CGPoint(x: CGFloat(pointIndex) * waveViewInfo.sampleSpace, y: middleY - pixel))
    //            context.addLine(to: CGPoint(x: CGFloat(pointIndex) * waveViewInfo.sampleSpace, y: middleY + pixel))
    //
    //            context.setStrokeColor(waveViewInfo.strokeColor.cgColor)
    //            context.strokePath()
    //
    //        }

    //        var xIncrement:CGFloat = 0
    //        for point in waveViewInfo.points{
    //
    //            let normalizedPoint = point * sampleAdjustFactor
    //
    //            context.move(to: CGPoint(x: xIncrement, y: middleY - normalizedPoint))
    //            context.addLine(to: CGPoint(x: xIncrement, y: middleX + normalizedPoint))
    //            context.setStrokeColor(waveViewInfo.strokeColor.cgColor)
    //            context.strokePath()
    //
    //            xIncrement += waveViewInfo.sampleSpace
    //
    //        }

    guard let soundWaveImage = UIGraphicsGetImageFromCurrentImageContext() else{ return }

    soundWaveImageView.image = soundWaveImage
    //        //In case of handling sample space in for
    //        updateWidthConstraintValue(soundWaveImage.size.width)
    updateWidthConstraintValue(soundWaveImage.size.width * waveViewInfo.sampleSpace)

}

where

class WaveViewInfo {

    var format:AVAudioFormat!
    var samplesSeconds:CGFloat = 50
    var lineWidth:CGFloat = 0.20
    var sampleSpace:CGFloat = 0.20

    var strokeColor:UIColor = .red
    var backgroundColor:UIColor = .clear

    var maxValue:CGFloat = 0
    var points:[CGFloat] = [CGFloat]()

}

At the moment it only draws one sound wave, but it can be extended. The nice part is that you can draw an audio track in sections.
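The "in sections" idea comes down to converting a time window into a frame range before seeking (`audioFile.framePosition`) and reading (`frameCount`), which is the same arithmetic drawSoundWave does with fromPosition and totalSeconds. Factored out as a standalone sketch (the function name is mine):

```swift
import Foundation

/// Converts a time window in seconds into the frame range to read from an
/// audio file, mirroring the framePosition / frameCapacity math above.
func frameRange(startSeconds: Double, durationSeconds: Double, sampleRate: Double) -> (start: Int64, count: UInt32) {
    let start = Int64(startSeconds * sampleRate)      // where to seek (framePosition)
    let count = UInt32(durationSeconds * sampleRate)  // how many frames to read
    return (start, count)
}
```

For a 44.1 kHz file, the window from 2 s to 3 s maps to frames 88200 through 132299, so each section can be read and drawn independently.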

Score: 4

Stack Overflow user

Answered on 2021-07-01 12:21:11

A bit of refactoring of the answer above (using AVAudioFile):

import AVFoundation
import CoreGraphics
import Foundation
import UIKit

class WaveGenerator {
    private func readBuffer(_ audioUrl: URL) -> [Float] {
        let file = try! AVAudioFile(forReading: audioUrl)

        let audioFormat = file.processingFormat
        let audioFrameCount = UInt32(file.length)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)
        else { return [] }
        do {
            try file.read(into: buffer)
        } catch {
            print(error)
        }

        // Copy the samples into an Array so they remain valid after the
        // AVAudioPCMBuffer goes out of scope; returning a raw
        // UnsafeBufferPointer from here would leave it dangling.
        let floatArray = Array(UnsafeBufferPointer(start: buffer.floatChannelData![0], count: Int(buffer.frameLength)))

        return floatArray
    }

    private func generateWaveImage(
        _ samples: [Float],
        _ imageSize: CGSize,
        _ strokeColor: UIColor,
        _ backgroundColor: UIColor
    ) -> UIImage? {
        let drawingRect = CGRect(origin: .zero, size: imageSize)

        UIGraphicsBeginImageContextWithOptions(imageSize, false, 0)

        let middleY = imageSize.height / 2

        guard let context: CGContext = UIGraphicsGetCurrentContext() else { return nil }

        context.setFillColor(backgroundColor.cgColor)
        context.setAlpha(1.0)
        context.fill(drawingRect)
        context.setLineWidth(0.25)

        let max: CGFloat = CGFloat(samples.max() ?? 0)
        let heightNormalizationFactor = imageSize.height / max / 2
        let widthNormalizationFactor = imageSize.width / CGFloat(samples.count)
        for index in 0 ..< samples.count {
            let pixel = CGFloat(samples[index]) * heightNormalizationFactor

            let x = CGFloat(index) * widthNormalizationFactor

            context.move(to: CGPoint(x: x, y: middleY - pixel))
            context.addLine(to: CGPoint(x: x, y: middleY + pixel))

            context.setStrokeColor(strokeColor.cgColor)
            context.strokePath()
        }
        guard let soundWaveImage = UIGraphicsGetImageFromCurrentImageContext() else { return nil }

        UIGraphicsEndImageContext()
        return soundWaveImage
    }

    func generateWaveImage(from audioUrl: URL, in imageSize: CGSize) -> UIImage? {
        let samples = readBuffer(audioUrl)
        let img = generateWaveImage(samples, imageSize, UIColor.blue, UIColor.white)
        return img
    }
}

Usage:

let waveGenerator = WaveGenerator()
let url = Bundle.main.url(forResource: "TEST1", withExtension: "mp3")!
let img = waveGenerator.generateWaveImage(from: url, in: CGSize(width: 600, height: 200))
Score: 0
The original page content is provided by Stack Overflow, with translation support from Tencent Cloud.
Original link: https://stackoverflow.com/questions/5032775
