Is it possible to create an AVAsset from two URLs — one for the audio track and one for the video track?
I have tried this with AVMutableComposition, but it seems to load the entire content first and buffer it before video+audio playback starts. The AVComposition documentation says that file-based assets can be combined, but I need a way to combine URL-based assets.
Alternatively, is there an option I can set on AVComposition so that playback starts before the entire content has been loaded?
Edit
This is how I tried it:
NSDictionary *urlAssetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: [NSNumber numberWithBool:NO]};
AVMutableComposition *composition = [AVMutableComposition composition];
NSURL *audioUrl = [NSURL URLWithString:@"http://..."];
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioUrl options:urlAssetOptions];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
NSURL *videoUrl = [NSURL URLWithString:@"http://..."];
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoUrl options:urlAssetOptions];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];

Answered 2016-10-20 19:41:10
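One thing worth noting about the attempt above: reading audioAsset.duration or tracksWithMediaType: synchronously on a remote AVURLAsset blocks until that metadata has been downloaded, which can look like "loading the whole content first". A minimal sketch (an assumption on my part, not part of the original question, reusing the audioAsset from above) of loading those keys asynchronously before building the composition:

[audioAsset loadValuesAsynchronouslyForKeys:@[@"duration", @"tracks"] completionHandler:^{
    NSError *error = nil;
    // Only touch duration/tracks once both keys report AVKeyValueStatusLoaded.
    if ([audioAsset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded &&
        [audioAsset statusOfValueForKey:@"tracks" error:&error] == AVKeyValueStatusLoaded) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // Safe to read audioAsset.duration and its tracks here,
            // then insert them into the composition exactly as above.
        });
    }
}];

The same pattern would apply to videoAsset before its insertTimeRange call.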
The approach you are using does not need to load the entire content before creating the mutable composition, nor before starting playback of the composition you created. It does, however, need to load part of each media file in order to determine its duration and tracks.
Below is working code that uses URLs to an mp3 and an mp4 file found via Google to create a mutable composition and hand it to an AVPlayerViewController. If you run it, you will notice that playback starts fairly quickly, but if you seek in the video timeline it takes a long time to load the data for the requested time.
NSURL *audioURL = [NSURL URLWithString:@"http://www.mfiles.co.uk/mp3-downloads/Toccata-and-Fugue-Dm.mp3"];
AVAsset *audioAsset = [AVAsset assetWithURL:audioURL];
NSURL *videoURL = [NSURL URLWithString:@"http://thv1.uloz.to/6/c/4/6c4b50308843dd29c9176cc2c4961155.360.mp4?fileId=20389770"];
AVAsset *videoAsset = [AVAsset assetWithURL:videoURL];
CMTime duration;
if (CMTimeGetSeconds(audioAsset.duration) < CMTimeGetSeconds(videoAsset.duration)) {
    duration = audioAsset.duration;
} else {
    duration = videoAsset.duration;
}
NSError *error;
AVMutableComposition* mixAsset = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack* audioTrack = [mixAsset addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error: &error];
AVMutableCompositionTrack* videoTrack = [mixAsset addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error: &error];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:mixAsset];
AVPlayerViewController* playerController = [AVPlayerViewController new];
playerController.player = [AVPlayer playerWithPlayerItem:playerItem];
[self presentViewController:playerController animated:YES completion:nil];

https://stackoverflow.com/questions/40113274
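If startup latency is still a concern, two AVFoundation properties may help. This is a sketch, not part of the original answer, and both require iOS 10+: preferredForwardBufferDuration limits how far ahead the item buffers, and automaticallyWaitsToMinimizeStalling controls whether the player starts before it considers the buffer safe:

playerItem.preferredForwardBufferDuration = 5.0; // buffer only a few seconds ahead
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
player.automaticallyWaitsToMinimizeStalling = NO; // begin playback as soon as media is ready
playerController.player = player;

Disabling automaticallyWaitsToMinimizeStalling trades a faster start for a higher risk of mid-playback stalls, so it is worth testing on a slow connection.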