Merging Audio and Video on iOS

Related links:
http://www.jianshu.com/p/8e1c7815af0e
http://www.devzhang.cn/2016/09/09/%E7%BC%96%E8%BE%91Assets/

While researching audio/video merging today I found a demo that I think is quite good.

Reposted from: http://www.jianshu.com/p/9f83af9dbbef

#### Audio/video merging is done with AVMutableComposition from the AVFoundation framework.

#### You feed AVMutableComposition two media sources, one audio and one video, then run the export.

# The code

## Drag a button and an image view into the storyboard

## For a better look, set the image view's background color to black

## Then add the following code to ViewController

```objc
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#import "MBProgressHUD+MJ.h"

@interface ViewController ()
/** View the merged video is played on */
@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (IBAction)mergeAction:(UIButton *)sender {
    [self merge];
}

// Merge the music into the video
- (void)merge {
    // MBProgressHUD spinner
    [MBProgressHUD showMessage:@"正在处理中"];
    // Documents directory
    NSString *documents = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    // Audio source
    NSURL *audioInputUrl = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"五环之歌" ofType:@"mp3"]];
    // Video source
    NSURL *videoInputUrl = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"myPlayer" ofType:@"mp4"]];
    // Final output path
    NSString *outPutFilePath = [documents stringByAppendingPathComponent:@"merge.mp4"];
    // Output URL
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outPutFilePath];
    // Insertion start time
    CMTime nextClipStartTime = kCMTimeZero;
    // Mutable audio/video composition
    AVMutableComposition *composition = [AVMutableComposition composition];
    // Video asset
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoInputUrl options:nil];
    // Time range of the whole video
    CMTimeRange videoTimeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    // Video composition track (kCMPersistentTrackID_Invalid = 0 lets the framework assign an ID)
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    // Source video track
    AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    // Insert the source video track into the composition track
    [videoTrack insertTimeRange:videoTimeRange ofTrack:videoAssetTrack atTime:nextClipStartTime error:nil];
    // Audio asset
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioInputUrl options:nil];
    // The video is the shorter one here, so its duration is reused; to automate this, compare the two durations yourself
    CMTimeRange audioTimeRange = videoTimeRange;
    // Audio composition track
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    // Source audio track
    AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    // Insert it into the composition
    [audioTrack insertTimeRange:audioTimeRange ofTrack:audioAssetTrack atTime:nextClipStartTime error:nil];
    // Create an export session
    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
    // Output container type
    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    // Output URL
    assetExport.outputURL = outputFileUrl;
    // Optimize for network playback
    assetExport.shouldOptimizeForNetworkUse = YES;
    // Export
    [assetExport exportAsynchronouslyWithCompletionHandler:^{
        // Check the status before touching the file; only play when the export completed
        dispatch_async(dispatch_get_main_queue(), ^{
            if (assetExport.status == AVAssetExportSessionStatusCompleted) {
                [self playWithUrl:outputFileUrl];
            } else {
                [MBProgressHUD hideHUD];
            }
        });
    }];
    // assetExport.progress reports progress; poll it with a timer if needed
}

/** Playback */
- (void)playWithUrl:(NSURL *)url {
    // Player item for the merged file
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url];
    // Player
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    // Player layer
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.imageView.frame;
    // Video fill mode
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    // Add the layer on top of the image view
    [self.imageView.layer addSublayer:playerLayer];
    // Hide the HUD and start playback
    [MBProgressHUD hideHUD];
    [MBProgressHUD showSuccess:@"合成完成"];
    // Play
    [player play];
}

@end
```
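
The trailing comment in `merge` mentions progress reporting: `assetExport.progress` runs from 0.0 to 1.0 and is not key-value observable, so it is usually polled with a timer. A minimal sketch of that polling, placed inside `merge` right after the export starts (the block-based NSTimer API requires iOS 10+; this is illustrative, not part of the demo):

```objc
// Poll the export progress every 0.1 s; stop once the session is no longer exporting
NSTimer *progressTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                         repeats:YES
                                                           block:^(NSTimer * _Nonnull timer) {
    NSLog(@"export progress: %.0f%%", assetExport.progress * 100);
    if (assetExport.status != AVAssetExportSessionStatusExporting &&
        assetExport.status != AVAssetExportSessionStatusWaiting) {
        [timer invalidate]; // the export has finished, failed, or been cancelled
    }
}];
```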

## MBProgressHUD is a third-party HUD class; if you don't care about these prompts you can delete the following three calls and the header import

```objc
// MBProgressHUD spinner
[MBProgressHUD showMessage:@"正在处理中"];
// Hide the HUD and start playback
[MBProgressHUD hideHUD];
[MBProgressHUD showSuccess:@"合成完成"];
```

## GitHub: https://github.com/Lafree317/MergeVideoAndMusic

Addendum: multiple audio clips on a single audio track

```objc
// Second audio asset
AVURLAsset *audioAsset2 = [[AVURLAsset alloc] initWithURL:audioInputUrl2 options:nil];
// The video's length is used as the overall bound, as above; a fixed 10-second range
// is used for the second clip here. Note: when two clips play back to back you must
// account for both durations; the composition ends at the longer one, so compute the
// base time and each clip's range yourself.
CMTime audioTime2 = CMTimeMakeWithSeconds(10, audioAsset2.duration.timescale);
CMTimeRange audioTimeRange2 = CMTimeRangeMake(kCMTimeZero, audioTime2);
// No second composition track is needed; both clips go into the existing audioTrack
AVAssetTrack *audioAssetTrack2 = [[audioAsset2 tracksWithMediaType:AVMediaTypeAudio] firstObject];
// Insert both source ranges into the single audio track, back to back
[audioTrack insertTimeRanges:@[[NSValue valueWithCMTimeRange:audioTimeRange], [NSValue valueWithCMTimeRange:audioTimeRange2]]
                    ofTracks:@[audioAssetTrack, audioAssetTrack2]
                      atTime:nextClipStartTime
                       error:nil];
```
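
If, as the comment suggests, the second clip should instead fill exactly the time left after the first one, the range can be derived from the durations already in scope. A sketch, assuming the video is longer than the first audio clip:

```objc
// Time left in the video after the first audio clip ends
CMTime remaining = CMTimeSubtract(videoAsset.duration, audioAsset.duration);
// Cap the second clip so the combined audio never outlasts the video
CMTime audioTime2 = CMTimeMinimum(audioAsset2.duration, remaining);
CMTimeRange audioTimeRange2 = CMTimeRangeMake(kCMTimeZero, audioTime2);
```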

Addendum: merging across multiple audio tracks
You can also pull the audio track out of the video itself and merge it in, so the video keeps its own soundtrack (untested; see the sketch after the snippet below).

```objc
// Second audio asset
AVURLAsset *audioAsset2 = [[AVURLAsset alloc] initWithURL:audioInputUrl2 options:nil];
// Here the second clip keeps its full duration; the same back-to-back caveat as above applies
CMTime audioTime2 = audioAsset2.duration;
CMTimeRange audioTimeRange2 = CMTimeRangeMake(kCMTimeZero, audioTime2);
// A second audio composition track, so the two clips sit on separate tracks
AVMutableCompositionTrack *audioTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
// Source track of the second asset
AVAssetTrack *audioAssetTrack2 = [[audioAsset2 tracksWithMediaType:AVMediaTypeAudio] firstObject];
// Insert each clip into its own composition track
[audioTrack insertTimeRange:audioTimeRange ofTrack:audioAssetTrack atTime:nextClipStartTime error:nil];
[audioTrack2 insertTimeRange:audioTimeRange2 ofTrack:audioAssetTrack2 atTime:nextClipStartTime error:nil];
```
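
To keep the video's own soundtrack, as the untested note above suggests, the second source track can come from the video asset itself instead of a second audio file. A sketch, assuming the video actually contains an audio track:

```objc
// Pull the audio track embedded in the video asset itself
AVAssetTrack *originalAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
if (originalAudioTrack) {
    // Give it its own composition track so it plays in parallel with the music track
    AVMutableCompositionTrack *originalTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [originalTrack insertTimeRange:videoTimeRange ofTrack:originalAudioTrack atTime:kCMTimeZero error:nil];
}
```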

A Chinese translation of the AVFoundation Programming Guide, quite good as a reference: https://changjianfeishui.gitbooks.io/avfoundation-programming-guide/content/chapter1.html


A note on composing with two video tracks (prepending an intro clip to a video); when possible, a single track is still the safer choice:

```objc
/**
 Prepend an intro clip ("video hive") to a video and export the result

 @param exportVideoFilePath path the merged file is exported to
 @param inputeVideoURL      file URL of the main video
 @param videoHiveURL        file URL of the intro clip
 */
- (void)builderVideoHiveToMp4Pth:(NSString *)exportVideoFilePath
              withInputeVideoURL:(NSURL *)inputeVideoURL
                withVideoHiveURL:(NSURL *)videoHiveURL {
    _effectType = 1;
    unlink([exportVideoFilePath UTF8String]);
    if (!inputeVideoURL || ![inputeVideoURL isFileURL] || !exportVideoFilePath || [exportVideoFilePath isEqualToString:@""]) {
        // Missing input or output, nothing to do
        NSLog(@"inputeVideoURL 或者 exportVideoFilePath 地址有错");
        return;
    }
    // Create assets from the input URLs
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:inputeVideoURL options:nil];
    AVURLAsset *videoAsset2 = [[AVURLAsset alloc] initWithURL:videoHiveURL options:nil];
    if (videoAsset == nil || [[videoAsset tracksWithMediaType:AVMediaTypeVideo] count] < 1 || [[videoAsset2 tracksWithMediaType:AVMediaTypeVideo] count] < 1) {
        // Asset failed to load or has no video track (audio should be checked the same way)
        return;
    }
    NSParameterAssert(videoAsset);
    // Composition holding both video tracks
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // MARK: video tracks; the track added first renders at the bottom
    AVMutableCompositionTrack *videoTrack2 = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    // Source tracks
    AVAssetTrack *assetVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *assetVideoTrack2 = [[videoAsset2 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    // The intro starts at zero; the main video starts where the intro ends
    [videoTrack2 insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset2.duration) ofTrack:assetVideoTrack2 atTime:kCMTimeZero error:nil];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:assetVideoTrack atTime:videoAsset2.duration error:nil];
    // Per-track layer instructions
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:assetVideoTrack];
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction2 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:assetVideoTrack2];
    // Hide the intro track once it has finished playing
    [videolayerInstruction2 setOpacity:0.0 atTime:videoAsset2.duration];
    // A "pass-through" video composition instruction covering the full duration
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(videoAsset.duration, videoAsset2.duration));
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, videolayerInstruction2, nil];
    // The video composition
    AVMutableVideoComposition *mainVideoComposition = [[AVMutableVideoComposition alloc] init];
    mainVideoComposition.instructions = [NSArray arrayWithObjects:mainInstruction, nil];
    mainVideoComposition.frameDuration = CMTimeMake(1, 30);
    mainVideoComposition.renderSize = assetVideoTrack.naturalSize;
    // Parent layer for all Core Animation effects
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.contentsScale = [UIScreen mainScreen].scale;
    videoLayer.contentsScale = [UIScreen mainScreen].scale;
    parentLayer.frame = CGRectMake(0, 0, assetVideoTrack.naturalSize.width, assetVideoTrack.naturalSize.height);
    videoLayer.frame = parentLayer.frame;
    [parentLayer addSublayer:videoLayer];
    mainVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    if (_animationlayer) {
        [_animationlayer setAllFrame:parentLayer.bounds];
        [parentLayer addSublayer:_animationlayer];
    }
    // Export to mp4
    NSString *mp4Quality = AVAssetExportPresetHighestQuality;
    NSString *exportPath = exportVideoFilePath;
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    _videotoTalTime = CMTimeGetSeconds(videoAsset.duration);
    _exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:mp4Quality];
    _exportSession.outputURL = exportUrl;
    _exportSession.outputFileType = AVFileTypeQuickTimeMovie; // output container type
    _exportSession.shouldOptimizeForNetworkUse = YES;
    _exportSession.videoComposition = mainVideoComposition;
    __weak typeof(self) weakSelf = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        // Poll the export progress on a timer
        _timerEffect = [NSTimer CZ_scheduledTimerWithTimeInterval:0.1f repeats:YES callback:^{
            [weakSelf retrievingProgressMP4];
        }];
    });
    [_exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (!weakSelf) { return; }
        NSLog(@"转码状态 %ld", (long)[weakSelf.exportSession status]);
        switch ([weakSelf.exportSession status]) {
            case AVAssetExportSessionStatusCompleted: {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (weakSelf.delegate && [weakSelf.delegate respondsToSelector:@selector(AVAssetExportVideoHiveMP4SessionStatus:)]) {
                        [weakSelf.delegate AVAssetExportVideoHiveMP4SessionStatus:[weakSelf.exportSession status]];
                    }
                    [weakSelf clearAll];
                });
                NSLog(@"视频转码成功");
                break;
            }
            case AVAssetExportSessionStatusFailed: {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (weakSelf.delegate && [weakSelf.delegate respondsToSelector:@selector(AVAssetExportVideoHiveMP4SessionStatus:)]) {
                        [weakSelf.delegate AVAssetExportVideoHiveMP4SessionStatus:[weakSelf.exportSession status]];
                    }
                    [weakSelf clearAll];
                });
                NSLog(@"视频转码失败%@", [weakSelf.exportSession error]);
                break;
            }
            case AVAssetExportSessionStatusCancelled:
            case AVAssetExportSessionStatusWaiting:
            case AVAssetExportSessionStatusExporting:
            case AVAssetExportSessionStatusUnknown:
            default:
                break;
        }
    }];
}
```
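
The timer above drives retrievingProgressMP4, which is not shown in the source. A plausible sketch that simply forwards _exportSession.progress and stops the timer when the export leaves the exporting state (the cleanup details are assumptions, not the original implementation):

```objc
// Hypothetical progress callback; not part of the original source
- (void)retrievingProgressMP4 {
    if (!_exportSession) { return; }
    // progress runs from 0.0 to 1.0 while the session is exporting
    NSLog(@"export progress: %.0f%%", _exportSession.progress * 100);
    if (_exportSession.status != AVAssetExportSessionStatusExporting &&
        _exportSession.status != AVAssetExportSessionStatusWaiting) {
        [_timerEffect invalidate]; // assumed cleanup once the export ends
        _timerEffect = nil;
    }
}
```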

Seen later (source):

The AVURLAssetPreferPreciseDurationAndTimingKey option, passed with a value of @YES, ensures that accurate duration and timing information can be computed when the asset's properties are loaded through the AVAsynchronousKeyValueLoading protocol. The option adds some overhead to loading, but it guarantees the asset is in a state fit for editing. The asset is initialized like this:

```objc
NSDictionary *opts = [NSDictionary dictionaryWithObject:@(YES) forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
// Initialize the video asset with precise duration and timing enabled
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoPathURL options:opts];
```
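
Since the option only pays off when properties are loaded through AVAsynchronousKeyValueLoading, here is a short sketch of that loading pattern, using the videoAsset from above:

```objc
// Load the duration asynchronously; with the option above the value is frame-accurate
[videoAsset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    NSError *error = nil;
    if ([videoAsset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        NSLog(@"precise duration: %f s", CMTimeGetSeconds(videoAsset.duration));
    }
}];
```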

Author: 陈昭
Posted on: 2016-10-19
Updated on: 2021-12-27
