
iOS - AVAssetExportSession exports at most 8 tracks after playing with AVPlayer

Asked by a Stack Overflow user on 2016-02-18 15:25:23 · 2 answers · 812 views · 3 votes

I'm trying to loop some segments of a recorded video and merge them into a single video. I've successfully merged and exported a composition with up to 16 tracks. But as soon as I play the composition with AVPlayer before merging, I can only export up to 8 tracks.

First, I create the AVComposition and AVVideoComposition:

Code (Objective-C):
    +(void)previewUserClipDanceWithAudio:(NSURL*)videoURL audioURL:(NSURL*)audioFile loop:(NSArray*)loopTime slowMotion:(NSArray*)slowFactor showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, AVVideoComposition* videoComposition, AVComposition* composition))completion{

AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
NSMutableArray *arrayInstruction = [[NSMutableArray alloc] init];
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

AVURLAsset  *audioAsset = [[AVURLAsset alloc]initWithURL:audioFile options:nil];
//NSLog(@"audio File %@",audioFile);

CMTime duration = kCMTimeZero;

AVAsset *currentAsset = [AVAsset assetWithURL:videoURL];
BOOL  isCurrentAssetPortrait  = YES;

for(NSInteger i=0;i< [loopTime count]; i++) {

    //handle looptime array
    NSInteger loopDur = [[loopTime objectAtIndex:i] intValue];
    NSInteger value = labs(loopDur);
    //NSLog(@"loopInfo %d value %d",loopInfo,value);
    //handle slowmotion array
    double slowInfo = [[slowFactor objectAtIndex:i] doubleValue];
    double videoScaleFactor = fabs(slowInfo);

    AVMutableCompositionTrack *currentTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack;
    audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                             preferredTrackID:kCMPersistentTrackID_Invalid];
    if (i==0) {
        [currentTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];

        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];

    } else {

        [currentTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];

        if (videoScaleFactor==1) {

            [audioTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];
        }
        //slow motion here
        if (videoScaleFactor!=1) {

            [currentTrack scaleTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10))
                              toDuration:CMTimeMake(value*videoScaleFactor, 10)];
            NSLog(@"slowmo %f",value*videoScaleFactor);
        }
    }

    AVMutableVideoCompositionLayerInstruction *currentAssetLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:currentTrack];
    AVAssetTrack *currentAssetTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    BOOL  isCurrentAssetPortrait  = YES;
    //CGFloat assetScaleToFitRatio;
    //assetScaleToFitRatio = [self getScaleToFitRatioCurrentTrack:currentTrack];

    if(isCurrentAssetPortrait){
        //NSLog(@"portrait");
        if (slowInfo<0) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            CGFloat ratio = screenRect.size.height / screenRect.size.width;

            // we have to adjust the ratio for 16:9 screens
            if (ratio == 1.775) ratio = 1.77777777777778;

            CGFloat complimentSize = (currentAssetTrack.naturalSize.height*ratio);
            CGFloat tx = (currentAssetTrack.naturalSize.width-complimentSize)/2;

            // invert translation because of portrait
            tx *= -1;
            // t1: rotate and position video since it may have been cropped to screen ratio
            CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
            // t2/t3: mirror video vertically

            CGAffineTransform t2 = CGAffineTransformTranslate(t1, currentAssetTrack.naturalSize.width, 0);
            CGAffineTransform t3 = CGAffineTransformScale(t2, -1, 1);

            [currentAssetLayerInstruction setTransform:t3 atTime:duration];

        } else if (loopDur<0) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            CGFloat ratio = screenRect.size.height / screenRect.size.width;

            // we have to adjust the ratio for 16:9 screens
            if (ratio == 1.775) ratio = 1.77777777777778;

            CGFloat complimentSize = (currentAssetTrack.naturalSize.height*ratio);
            CGFloat tx = (currentAssetTrack.naturalSize.width-complimentSize)/2;

            // invert translation because of portrait
            tx *= -1;
            // t1: rotate and position video since it may have been cropped to screen ratio
            CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
            // t2/t3: mirror video horizontally
            CGAffineTransform t2 = CGAffineTransformTranslate(t1, 0, currentAssetTrack.naturalSize.height);
            CGAffineTransform t3 = CGAffineTransformScale(t2, 1, -1);

            [currentAssetLayerInstruction setTransform:t3 atTime:duration];

        } else {

            [currentAssetLayerInstruction setTransform:currentAssetTrack.preferredTransform atTime:duration];

        }
    }else{
        //            CGFloat translateAxisX = (currentTrack.naturalSize.width > MAX_WIDTH )?(0.0):0.0;// if use <, 640 video will be moved left by 10px. (float)(MAX_WIDTH - currentTrack.naturalSize.width)/(float)4.0
        //            CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio,assetScaleToFitRatio);
        //            [currentAssetLayerInstruction setTransform:
        //             CGAffineTransformConcat(CGAffineTransformConcat(currentAssetTrack.preferredTransform, FirstAssetScaleFactor),CGAffineTransformMakeTranslation(translateAxisX, 0)) atTime:duration];
    }
    if (i==0) {
        duration=CMTimeAdd(duration, currentAsset.duration);
    } else  {
        if (videoScaleFactor!=1) {
            duration=CMTimeAdd(duration, CMTimeMake(value*videoScaleFactor, 10));
        } else {
            duration=CMTimeAdd(duration, CMTimeMake(value, 10));
        }
    }

    [currentAssetLayerInstruction setOpacity:0.0 atTime:duration];
    [arrayInstruction addObject:currentAssetLayerInstruction];
}

AVMutableCompositionTrack *AudioBGTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[AudioBGTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:CMTimeSubtract(duration, audioAsset.duration) error:nil];

videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, duration);
videoCompositionInstruction.layerInstructions = arrayInstruction;

CGSize naturalSize;
if(isCurrentAssetPortrait){
    naturalSize = CGSizeMake(MAX_HEIGHT,MAX_WIDTH);//currentAssetTrack.naturalSize.height,currentAssetTrack.naturalSize.width);
} else {
    naturalSize = CGSizeMake(MAX_WIDTH,MAX_HEIGHT);//currentAssetTrack.naturalSize;
}

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = [NSArray arrayWithObject:videoCompositionInstruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(naturalSize.width,naturalSize.height);
NSLog(@"prepared");

AVVideoComposition *composition = [videoComposition copy];
AVComposition *mixedComposition = [mixComposition copy];
completion(YES, composition, mixedComposition);
}

Then I set up the AVPlayer:

Code (Objective-C):
    -(void)playVideoWithComposition:(AVVideoComposition*)videoComposition inMutableComposition:(AVComposition*)composition{

MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
hud.label.text = myLanguage(@"kMergeClip");

savedComposition = [composition copy];
savedVideoComposition = [videoComposition copy];
playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(repeatVideo:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];

if (!player) {
    player = [AVPlayer playerWithPlayerItem:playerItem];
    layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = [UIScreen mainScreen].bounds;
    [self.ibPlayerView.layer insertSublayer:layer atIndex:0];
    NSLog(@"create new player");
}

if (player.currentItem != playerItem ) {
    [player replaceCurrentItemWithPlayerItem:playerItem];
}
player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
//[player seekToTime:kCMTimeZero];

[playerItem addObserver:self
             forKeyPath:@"status"
                options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                context:@"AVPlayerStatus"];
}

When the user has previewed the video as many times as they like and taps Save, I export with this method:

Code (Objective-C):
    +(void)mergeUserCLip:(AVVideoComposition*)videoComposition withAsset:(AVComposition*)mixComposition showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, NSURL *fileURL))completion{

MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:viewController.view animated:YES];
hud.mode = MBProgressHUDModeDeterminateHorizontalBar;
hud.label.text = myLanguage(@"kMergeClip");

//Name merge clip using beat name
//NSString* beatName = [[[NSString stringWithFormat:@"%@",audioFile] lastPathComponent] stringByDeletingPathExtension];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *tmpDir = [[documentsDirectory stringByDeletingLastPathComponent] stringByAppendingPathComponent:@"tmp"];
NSString *myPathDocs =  [tmpDir stringByAppendingPathComponent:[NSString stringWithFormat:@"merge-beat.mp4"]];
//Not remove here, will remove when call previewPlayVC
[[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:nil];

// 1 - set up the overlay
CALayer *overlayLayer = [CALayer layer];
UIImage *overlayImage = [UIImage imageNamed:@"watermark.png"];

[overlayLayer setContents:(id)[overlayImage CGImage]];
overlayLayer.frame = CGRectMake(720-221, 1280-109, 181, 69);
[overlayLayer setMasksToBounds:YES];

//    aLayer  = [CALayer layer];
//    [aLayer addSublayer:labelLogo.layer];
//    aLayer.frame = CGRectMake(MAX_WIDTH- labelLogo.width - 10.0, MAX_HEIGHT-50.0, 20.0, 20.0);
//    aLayer.opacity = 1;

// 2 - set up the parent layer
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, MAX_HEIGHT,MAX_WIDTH);
videoLayer.frame = CGRectMake(0, 0, MAX_HEIGHT,MAX_WIDTH);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];

// 3 - apply magic
AVMutableVideoComposition *mutableVideoComposition = [videoComposition mutableCopy]; // mutableCopy, not copy: a plain copy returns an immutable AVVideoComposition
mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
                                  videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

NSURL *url = [NSURL fileURLWithPath:myPathDocs];
myLog(@"Path: %@", myPathDocs);
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeMPEG4;
exporter.videoComposition = mutableVideoComposition;
exporter.shouldOptimizeForNetworkUse = NO;

[exporter exportAsynchronouslyWithCompletionHandler:^ {
    //NSLog(@"exporting");
    switch (exporter.status) {
        case AVAssetExportSessionStatusCompleted: {
            NSURL *url = [NSURL fileURLWithPath:myPathDocs];
            hud.progress = 1.0f;
            dispatch_async(dispatch_get_main_queue(), ^{
                [MBProgressHUD hideHUDForView:viewController.view animated:YES];
            });
            [self checkTmpSize];
            if (completion) {
                completion(YES, url);
            }
        }
            break;
        case AVAssetExportSessionStatusExporting:
            myLog(@"Exporting!");
            break;
        case AVAssetExportSessionStatusWaiting:
            myLog(@"Waiting");
            break;
        default:
            break;
    }
}];
}

The code above works fine when the user selects fewer than 8 loops. With more than 8, the export session freezes with exporter.progress = 0.0000000. If I remove this line:

Code (Objective-C):
    playerItem.videoComposition = videoComposition;

then I can no longer preview the composed video, but it exports normally (up to 16 tracks).

Alternatively, if I remove this line from the export code:

Code (Objective-C):
    exporter.videoComposition = mutableVideoComposition;

then I can preview the composed video and it exports normally, just without the video composition applied.

So I suspect something is wrong with the AVVideoComposition and/or the way I'm using it.

Any suggestions would be greatly appreciated. Thank you.

I strongly suspect that previewing the video with AVPlayer is somehow blocking AVAssetExportSession, as described in these posts:

iOS 5: Error merging 3 videos with AVAssetExportSession

AVPlayerItem fails with AVStatusFailed and error code “Cannot Decode”


2 Answers

Answered by a Stack Overflow user on 2017-07-23 02:30:46

I ran into this issue while trying to concatenate N videos in a UICollectionView while playing up to 3 of them at a time in AVPlayer instances. As discussed in the Stack Overflow question you linked, iOS can only handle so many AVPlayer instances; each one uses up a "render pipeline". I found that each instance of AVMutableCompositionTrack also consumes one of these render pipelines.

So if you use too many AVPlayer instances, or try to create an AVMutableComposition with too many AVMutableCompositionTrack tracks, you can run out of resources for decoding H.264 and you'll get the "Cannot Decode" error. I worked around this by using only two AVMutableCompositionTrack instances in total; that still lets me "overlap" video segments while applying a transition (which requires two video tracks "playing" simultaneously).

In short: minimize your use of AVMutableCompositionTrack and AVPlayer. For an example, take a look at Apple's AVCustomEdit sample code, specifically the buildTransitionComposition method in the APLSimpleEditor class.
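A minimal sketch of that two-track idea (illustrative only; the `segments` array and the alternation scheme are assumptions, not code from the answer): instead of adding one new video track per clip, alternate the clips between two reusable tracks so the total track count, and hence decoder use, stays constant.

```objectivec
// Sketch: reuse two composition tracks instead of one per segment.
// Assumes `segments` is an NSArray<AVAsset *> of clips to concatenate;
// error handling is omitted for brevity.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTracks[2];
videoTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                          preferredTrackID:kCMPersistentTrackID_Invalid];
videoTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                          preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime cursor = kCMTimeZero;
for (NSUInteger i = 0; i < segments.count; i++) {
    AVAsset *asset = segments[i];
    AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    // Alternate between the two tracks; adjacent clips land on different
    // tracks, which also allows overlapping time ranges for transitions.
    AVMutableCompositionTrack *target = videoTracks[i % 2];
    [target insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                    ofTrack:sourceTrack
                     atTime:cursor
                      error:nil];
    cursor = CMTimeAdd(cursor, asset.duration);
}
```

This mirrors what AVCustomEdit does: a fixed pair of tracks, with layer instructions controlling which one is visible at any moment.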

3 votes

Answered by a Stack Overflow user on 2017-07-25 15:11:15

Try clearing the player item before exporting:

Code (Objective-C):
[self.player replaceCurrentItemWithPlayerItem:nil];
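A fuller teardown sketch (an assumption on my part, using the `player`/`playerItem` properties from the question's code; the answer itself only shows the one line): the question's setup method also registers a notification observer and a KVO observer, so those should be removed before the item is released.

```objectivec
// Detach the item from the player, and remove the observers the setup
// method registered, so the item's decode resources are released
// before AVAssetExportSession starts.
[playerItem removeObserver:self forKeyPath:@"status"];
[[NSNotificationCenter defaultCenter] removeObserver:self
                                                name:AVPlayerItemDidPlayToEndTimeNotification
                                              object:playerItem];
[self.player replaceCurrentItemWithPlayerItem:nil];
playerItem = nil;
// Now create and run the AVAssetExportSession.
```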
0 votes
Original page content provided by Stack Overflow; translation originally supported by Tencent Cloud's IT-domain engine.
Original link:

https://stackoverflow.com/questions/35475253
