
AVAsset rotation

Asked by a Stack Overflow user on 2016-05-02 17:58:19 · 1 answer · 858 views · 0 votes

This is a well-documented issue on SO: AVAssets come out rotated after writing them to file with AVAssetWriter or AVComposition. There are also solutions, such as inspecting the video track's transform to see how the asset is rotated, so that it can be rotated back to the desired orientation for a particular use case.

What I want to know, however, is why this happens and whether it can be prevented. I have run into this not only when writing custom video files, but also when converting videos to gifs with CGImageDestination, where the output gif looks great but is rotated.

For a quick reference point, here is my code for writing the asset to file:

let destinationURL = url ?? NSURL(fileURLWithPath: "\(NSTemporaryDirectory())\(String.random()).mp4")
if let writer = try? AVAssetWriter(URL: destinationURL, fileType: AVFileTypeMPEG4),
    videoTrack = self.asset.tracksWithMediaType(AVMediaTypeVideo).last,
    firstBuffer = buffers.first {
    let videoCompressionProps = [AVVideoAverageBitRateKey: videoTrack.estimatedDataRate]
    let outputSettings: [String: AnyObject] = [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height,
        AVVideoCompressionPropertiesKey: videoCompressionProps
    ]
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings, sourceFormatHint: (videoTrack.formatDescriptions.last as! CMFormatDescription))
    writerInput.expectsMediaDataInRealTime = false

    // Rotate the output back to portrait and mirror it
    let rotateTransform = CGAffineTransformMakeRotation(Utils.degreesToRadians(-90))
    writerInput.transform = CGAffineTransformScale(rotateTransform, -1, 1)

    let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput, sourcePixelBufferAttributes: nil)
    writer.addInput(writerInput)
    writer.startWriting()
    writer.startSessionAtSourceTime(CMSampleBufferGetPresentationTimeStamp(firstBuffer))

    for (sample, newTimestamp) in Array(Zip2Sequence(buffers, timestamps)) {
        if let imageBuffer = CMSampleBufferGetImageBuffer(sample) {
            while !writerInput.readyForMoreMediaData {
                NSThread.sleepForTimeInterval(0.1)
            }
            pixelBufferAdaptor.appendPixelBuffer(imageBuffer, withPresentationTime: newTimestamp)
        }
    }
    writer.finishWritingWithCompletionHandler {
        // completion code
    }
}

As you can see above, a simple transform rotates the output video back to portrait. However, if I have a landscape video, that transform no longer works. And as I mentioned earlier, converting the video to a gif performs the exact same 90-degree rotation on my asset.
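The underlying cause: iOS cameras capture frames in the sensor's native (landscape) orientation, and the device orientation is stored only as metadata in the track's preferredTransform, which a fresh AVAssetWriterInput does not inherit. One way to handle any source orientation, sketched against the Swift 2 code above (untested), is to copy the source track's transform instead of hardcoding -90°:

```
// Sketch (untested): carry over the source track's orientation metadata instead
// of a hardcoded rotation. The pixel data itself is never rotated; this transform
// only tells players how to display it.
writerInput.transform = videoTrack.preferredTransform
```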

My feelings can be summed up in these two gifs:

http://giphy.com/gifs/jon-stewart-why-lYKvaJ8EQTzCU

http://giphy.com/gifs/the-office-no-steve-carell-12XMGIWtrHBl5e


1 Answer

Answered by a Stack Overflow user on 2016-05-02 19:30:50

I also ran into the same problem; after rotating my video by 90° it works fine.

Here is the solution.

In videoorientation.h:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface videoorientationViewController : UIViewController
@property AVMutableComposition *mutableComposition;
@property AVMutableVideoComposition *mutableVideoComposition;
@property AVMutableAudioMix *mutableAudioMix;
@property AVAssetExportSession *exportSession;
- (void)performWithAsset : (NSURL *)moviename;
@end

In the corresponding .m file:

// degreesToRadians/radiansToDegrees are used below but were not defined in the
// original answer; a typical definition (assumed):
#define degreesToRadians(x) (M_PI * (x) / 180.0)
#define radiansToDegrees(x) ((x) * 180.0 / M_PI)

- (void)performWithAsset:(NSURL *)moviename
{
    self.mutableComposition = nil;
    self.mutableVideoComposition = nil;
    self.mutableAudioMix = nil;

    AVAsset *asset = [[AVURLAsset alloc] initWithURL:moviename options:nil];

    AVMutableVideoCompositionInstruction *instruction = nil;
    AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
    CGAffineTransform t1;
    CGAffineTransform t2;

    AVAssetTrack *assetVideoTrack = nil;
    AVAssetTrack *assetAudioTrack = nil;
    // Check if the asset contains video and audio tracks
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
        assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    }

    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;


    // Step 1
    // Create a composition with the given asset and insert audio and video tracks into it from the asset
    if (!self.mutableComposition) {

        // Check whether a composition has already been created, i.e, some other tool has already been applied
        // Create a new composition
        self.mutableComposition = [AVMutableComposition composition];

        // Insert the video and audio tracks from AVAsset
        if (assetVideoTrack != nil) {
            AVMutableCompositionTrack *compositionVideoTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];
        }
        if (assetAudioTrack != nil) {
            AVMutableCompositionTrack *compositionAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
        }

    }


    // Step 2
    // Translate the composition to compensate the movement caused by rotation (since rotation would cause it to move out of frame)
    t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height, 0.0);
    float width = assetVideoTrack.naturalSize.width;
    float height = assetVideoTrack.naturalSize.height;
    float toDiagonal = sqrt(width * width + height * height);
    float toDiagonalAngle = radiansToDegrees(acosf(width / toDiagonal));
    float toDiagonalAngle2 = 90 - radiansToDegrees(acosf(width / toDiagonal));

    float toDiagonalAngleComple;
    float toDiagonalAngleComple2;
    float finalHeight = 0.0;
    float finalWidth = 0.0;

    float degrees=90;

    if(degrees>=0&&degrees<=90){

        toDiagonalAngleComple=toDiagonalAngle+degrees;
        toDiagonalAngleComple2=toDiagonalAngle2+degrees;

        finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple)));
        finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2)));

        t1 = CGAffineTransformMakeTranslation(height*sinf(degreesToRadians(degrees)), 0.0);
    }
    else if(degrees>90&&degrees<=180){


        float degrees2 = degrees-90;

        toDiagonalAngleComple=toDiagonalAngle+degrees2;
        toDiagonalAngleComple2=toDiagonalAngle2+degrees2;

        finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2)));
        finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple)));

        t1 = CGAffineTransformMakeTranslation(width*sinf(degreesToRadians(degrees2))+height*cosf(degreesToRadians(degrees2)), height*sinf(degreesToRadians(degrees2)));
    }
    else if(degrees>=-90&&degrees<0){

        float degrees2 = degrees-90;
        float degreesabs = ABS(degrees);

        toDiagonalAngleComple=toDiagonalAngle+degrees2;
        toDiagonalAngleComple2=toDiagonalAngle2+degrees2;

        finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2)));
        finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple)));

        t1 = CGAffineTransformMakeTranslation(0, width*sinf(degreesToRadians(degreesabs)));

    }
    else if(degrees>=-180&&degrees<-90){

        float degreesabs = ABS(degrees);
        float degreesplus = degreesabs-90;

        toDiagonalAngleComple=toDiagonalAngle+degrees;
        toDiagonalAngleComple2=toDiagonalAngle2+degrees;

        finalHeight=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple)));
        finalWidth=ABS(toDiagonal*sinf(degreesToRadians(toDiagonalAngleComple2)));

        t1 = CGAffineTransformMakeTranslation(width*sinf(degreesToRadians(degreesplus)), height*sinf(degreesToRadians(degreesplus))+width*cosf(degreesToRadians(degreesplus)));

    }


    // Rotate transformation
    t2 = CGAffineTransformRotate(t1, degreesToRadians(degrees));
    //t2 = CGAffineTransformRotate(t1, -90);
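    // Note: CGAffineTransformRotate(t1, angle) *pre*-concatenates the rotation:
    // each point is rotated first and the translation t1 is applied afterwards,
    // which is what shifts the rotated frame back into the visible area.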


    // Step 3
    // Set the appropriate render sizes and rotational transforms
    if (!self.mutableVideoComposition) {

        // Create a new video composition
        self.mutableVideoComposition = [AVMutableVideoComposition videoComposition];
        // self.mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height,assetVideoTrack.naturalSize.width);
        self.mutableVideoComposition.renderSize = CGSizeMake(finalWidth,finalHeight);

        self.mutableVideoComposition.frameDuration = CMTimeMake(1,30);

        // The rotate transform is set on a layer instruction
        instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [self.mutableComposition duration]);
        layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(self.mutableComposition.tracks)[0]];
        [layerInstruction setTransform:t2 atTime:kCMTimeZero];

    } else {

        self.mutableVideoComposition.renderSize = CGSizeMake(self.mutableVideoComposition.renderSize.height, self.mutableVideoComposition.renderSize.width);

        // Extract the existing layer instruction on the mutableVideoComposition
        instruction = (self.mutableVideoComposition.instructions)[0];
        layerInstruction = (instruction.layerInstructions)[0];

        // Check if a transform already exists on this layer instruction, this is done to add the current transform on top of previous edits
        CGAffineTransform existingTransform;

        if (![layerInstruction getTransformRampForTime:[self.mutableComposition duration] startTransform:&existingTransform endTransform:NULL timeRange:NULL]) {
            [layerInstruction setTransform:t2 atTime:kCMTimeZero];
        } else {
            // Note: the point of origin for rotation is the upper left corner of the composition, t3 is to compensate for origin
            CGAffineTransform t3 = CGAffineTransformMakeTranslation(-1*assetVideoTrack.naturalSize.height/2, 0.0);
            CGAffineTransform newTransform = CGAffineTransformConcat(existingTransform, CGAffineTransformConcat(t2, t3));
            [layerInstruction setTransform:newTransform atTime:kCMTimeZero];
        }

    }


    // Step 4
    // Add the transform instructions to the video composition
    instruction.layerInstructions = @[layerInstruction];
    self.mutableVideoComposition.instructions = @[instruction];


    // Step 5
    // Notify AVSEViewController about rotation operation completion
    // [[NSNotificationCenter defaultCenter] postNotificationName:AVSEEditCommandCompletionNotification object:self];

    [self performWithAssetExport];
}

- (void)performWithAssetExport
{
    // Step 1
    // Create an outputURL to which the exported movie will be saved

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mov"];
    // Remove Existing File
    [manager removeItemAtPath:outputURL error:nil];


    // Step 2
    // Create an export session with the composition and write the exported movie to the photo library
    self.exportSession = [[AVAssetExportSession alloc] initWithAsset:[self.mutableComposition copy] presetName:AVAssetExportPreset1280x720];

    self.exportSession.videoComposition = self.mutableVideoComposition;
    self.exportSession.audioMix = self.mutableAudioMix;
    self.exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
    self.exportSession.outputFileType=AVFileTypeQuickTimeMovie;


    [self.exportSession exportAsynchronouslyWithCompletionHandler:^(void){
        switch (self.exportSession.status) {
            case AVAssetExportSessionStatusCompleted:

                //[self playfunction];

                [[NSNotificationCenter defaultCenter]postNotificationName:@"Backhome" object:nil];



                // Step 3
                // Notify AVSEViewController about export completion
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed:%@",self.exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled:%@",self.exportSession.error);
                break;
            default:
                break;
        }
    }];



}
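The trigonometry in Step 2 above computes the bounding box of the rotated frame plus a translation that moves it back on screen. For the common 90° case, the same result can be derived more directly by applying the rotation to the track's natural-size rect; a sketch in the question's Swift 2 style (`videoTrack` is assumed; untested):

```
// Sketch (untested): render size and compensating translation for a 90° rotation,
// derived by transforming the natural-size rect instead of manual trigonometry.
let naturalSize = videoTrack.naturalSize
let rotation = CGAffineTransformMakeRotation(CGFloat(M_PI_2))   // 90 degrees
let rotated = CGRectApplyingAffineTransform(
    CGRect(x: 0, y: 0, width: naturalSize.width, height: naturalSize.height), rotation)
// Bounding box of the rotated frame: height x width for a 90° turn
let renderSize = CGSizeMake(rotated.width, rotated.height)
// Concatenate a translation that moves the rotated rect back to the origin
let transform = CGAffineTransformConcat(rotation,
    CGAffineTransformMakeTranslation(-rotated.origin.x, -rotated.origin.y))
```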
Score: -3
The original page content is provided by Stack Overflow.

Original link: https://stackoverflow.com/questions/36979692
