Original article: https://www.jianshu.com/p/7b367d5493a2
Demo: https://github.com/sundayios/KJAlbum-master
Update 2017.10.30: changed the way GPUImage is imported and fixed the 90° rotation that showed up after editing a video.
There are plenty of popular beauty filters around; my demo ships with two of them (GPUImageBeautifyFilter and FSKGPUImageBeautyFilter). I originally set out to write an album manager, but project requirements turned it into a short-video processing demo, so the project name never changed. The demo link is given above; it is only meant to illustrate an approach, so please go easy on it. Here are the most important parts of the demo:
The short-video feature is built on GPUImageVideoCamera. Because the project needs a 1:1 aspect ratio, a GPUImageCropFilter is added to the chain; change its settings (or drop it) if your layout differs. The camera setup code:
// Camera setup
- (void)customSystemSession {
    WS(weakSelf)
    self.imgView = [UIImageView new];
    self.imgView.clipsToBounds = YES;
    [self.view addSubview:self.imgView];
    [self.imgView mas_makeConstraints:^(MASConstraintMaker *make) {
        make.top.equalTo(weakSelf.topView.mas_bottom).offset(0);
        make.left.equalTo(weakSelf.view.mas_left).offset(0);
        make.right.equalTo(weakSelf.view.mas_right).offset(0);
        make.height.mas_equalTo(SCREEN_WIDTH);
    }];
    // Beauty camera
    self.kj_videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
    self.kj_videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    self.kj_videoCamera.horizontallyMirrorFrontFacingCamera = YES;
    self.kj_filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, SCREEN_WIDTH, SCREEN_WIDTH)];
    self.kj_filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
//    self.filterView.center = self.view.center;
    [self.imgView addSubview:self.kj_filterView];
    // Crop filter (1:1). For full-screen output, change the crop region or remove this filter entirely.
    self.kj_cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0, 44/SCREEN_HEIGHT, 1, SCREEN_WIDTH/SCREEN_HEIGHT)];
    // Beauty filter (enabled by default)
    self.kj_beautifyFilter = [[FSKGPUImageBeautyFilter alloc] init];
    self.kj_beautifyFilter.beautyLevel = 0.9f;  // smoothing level
    self.kj_beautifyFilter.brightLevel = 0.7f;  // whitening level
    self.kj_beautifyFilter.toneLevel = 0.9f;    // tone level
//    self.kj_filter = [[GPUImageSaturationFilter alloc] init];
//    // Filter group
//    self.kj_filterGroup = [[GPUImageFilterGroup alloc] init];
//    [self.kj_filterGroup addFilter:self.kj_cropFilter];
//    [self.kj_filterGroup addFilter:self.kj_beautifyFilter];
//    [self openBeautify];
    [self.kj_videoCamera addAudioInputsAndOutputs];
    [self.kj_videoCamera addTarget:self.kj_cropFilter];
    [self.kj_cropFilter addTarget:self.kj_beautifyFilter];
    [self.kj_beautifyFilter addTarget:self.kj_filterView];
    [self.kj_videoCamera startCameraCapture];
}
Switching between the front and back cameras, the flash, and so on won't be covered here. My approach to short video is to record multiple segments with GPUImageVideoCamera (with the beauty filter applied) and merge them afterwards. Real-time filters could be used directly here; since the project only needed beauty, no other filters are handled, but the principle is exactly the same as switching beauty filters: pick a filter before each segment and addTarget it, and if you mix filters, removeTarget the old one before adding the new one. A minimal sketch of that switch is shown right below.
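This is only an illustration of the idea, not the demo's actual code: switchToFilter: and the kj_currentFilter property are hypothetical names, while the crop filter and preview view are the ones created in the setup code above.

// Swap the active filter in the chain camera -> crop -> filter -> preview
// (sketch; kj_currentFilter / newFilter are assumed names)
- (void)switchToFilter:(GPUImageOutput<GPUImageInput> *)newFilter {
    // Detach the old filter from the chain
    [self.kj_cropFilter removeTarget:self.kj_currentFilter];
    [self.kj_currentFilter removeTarget:self.kj_filterView];
    // Attach the new one
    [self.kj_cropFilter addTarget:newFilter];
    [newFilter addTarget:self.kj_filterView];
    self.kj_currentFilter = newFilter;
}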
Later on I also wrote a simple video editor with filter support for local videos and images, which is covered further down. Next is the most important part of the short-video flow, merging the recorded segments:
// Confirm (done)
- (void)onCompleteAction:(UIButton *)sender {
    if (self.kj_videoArray.count > 0) {
        if (self.kj_videoArray.count > 1) {
            // Several segments need to be merged
            if (!self.kj_outPath) {
                self.kj_outPath = [self getVideoOutPath]; // output path of the merged video
            }
            // If a previously merged file already exists, delete it and merge again
            if ([[NSFileManager defaultManager] fileExistsAtPath:self.kj_outPath]) {
                [[NSFileManager defaultManager] removeItemAtPath:self.kj_outPath error:nil];
            }
            // Composition that will hold the merged audio and video
            AVMutableComposition *kj_composition = [AVMutableComposition composition];
            // Audio track
            AVMutableCompositionTrack *kj_audioTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            // Video track
            AVMutableCompositionTrack *kj_videoTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            // Start merging
            [KJUtility showProgressDialogText:@"Processing video..."];
            CMTime kj_totalDuration = kCMTimeZero;
            for (int i = 0; i < self.kj_videoArray.count; i++) {
                NSDictionary *localDict = self.kj_videoArray[i];
                NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
                AVAsset *kj_asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:localDict[@"path"]] options:options];
                // Audio track of kj_asset
                NSArray *audioArray = [kj_asset tracksWithMediaType:AVMediaTypeAudio];
                AVAssetTrack *kj_assetAudio = audioArray.firstObject;
                // Append the audio to kj_audioTrack
                NSError *kj_audioError = nil;
                BOOL isComplete_audio = [kj_audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, kj_asset.duration)
                                                               ofTrack:kj_assetAudio
                                                                atTime:kj_totalDuration
                                                                 error:&kj_audioError];
                NSLog(@"Appended audio %d isComplete_audio:%d error:%@", i, isComplete_audio, kj_audioError);
                // Video track of kj_asset
                NSArray *videoArray = [kj_asset tracksWithMediaType:AVMediaTypeVideo];
                AVAssetTrack *kj_assetVideo = videoArray.firstObject;
                // Append the video to kj_videoTrack
                NSError *kj_videoError = nil;
                BOOL isComplete_video = [kj_videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, kj_asset.duration)
                                                               ofTrack:kj_assetVideo
                                                                atTime:kj_totalDuration
                                                                 error:&kj_videoError];
                NSLog(@"Appended video %d isComplete_video:%d error:%@", i, isComplete_video, kj_videoError);
                kj_totalDuration = CMTimeAdd(kj_totalDuration, kj_asset.duration);
            }
            // A watermark could be added here, but it is not handled in this demo
            // Export the merged video
            AVAssetExportSession *kj_export = [AVAssetExportSession exportSessionWithAsset:kj_composition
                                                                                presetName:AVAssetExportPreset1280x720];
            kj_export.outputURL = [NSURL fileURLWithPath:self.kj_outPath];
            kj_export.outputFileType = AVFileTypeMPEG4;
            kj_export.shouldOptimizeForNetworkUse = YES;
            WS(weakSelf)
            [kj_export exportAsynchronouslyWithCompletionHandler:^{
                dispatch_async(dispatch_get_main_queue(), ^{
                    [KJUtility hideProgressDialog];
                    if (weakSelf.kjFileDelegate && [weakSelf.kjFileDelegate respondsToSelector:@selector(kj_videoFileCompleteLocalPath:)]) {
                        // Merge succeeded, so delete the individual segments
                        [weakSelf clearAllVideo];
                        NSLog(@"%@", weakSelf.kj_outPath);
                        [weakSelf.kjFileDelegate kj_videoFileCompleteLocalPath:weakSelf.kj_outPath];
                    } else {
                        [weakSelf saveVideoToLibrary];
                    }
                });
            }];
        } else {
            // Only one segment, nothing to merge
            [KJUtility hideProgressDialog];
            NSDictionary *dict = self.kj_videoArray.firstObject;
            self.kj_outPath = dict[@"path"];
            if (self.kjFileDelegate && [self.kjFileDelegate respondsToSelector:@selector(kj_videoFileCompleteLocalPath:)]) {
                [self.kjFileDelegate kj_videoFileCompleteLocalPath:self.kj_outPath];
            } else {
                [self saveVideoToLibrary];
            }
        }
    }
}
The comments say it all, so I won't walk through the code line by line. At its core this is just simple use of AVMutableComposition, AVMutableCompositionTrack, and AVAssetExportSession; their properties are well documented and easy to look up. Everything else is no different from handling an ordinary camera.
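One detail worth noting, related to the 90° rotation mentioned in the update note at the top: a composition track does not inherit the source track's preferredTransform, so a rotated recording can come out sideways after merging. A minimal sketch of carrying the transform over, assuming every segment shares the orientation of the first one, could be placed inside the merge loop right after kj_assetVideo is obtained:

// Sketch: copy the orientation of the first segment onto the composition's video track
// (assumes all segments were recorded with the same orientation)
if (i == 0) {
    kj_videoTrack.preferredTransform = kj_assetVideo.preferredTransform;
}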
The other piece is trimming. Videos uploaded to a server are often too long or too large, so they need to be cut and compressed. The UI for picking the time range isn't covered here (see the demo's UI code); the trimming code itself is below:
- (void)onCompleteButtonAction:(UIButton *)sender {
    // Start trimming
    [self.kj_player pause];
    // Start time
    CMTime startTime = CMTimeMakeWithSeconds((self.collectionView.contentOffset.x + self.btnStart.frame.origin.x) * self.pixel_time, self.kj_player.currentItem.duration.timescale);
    // Selected length in seconds
    CGFloat length = (self.btnEnd.frame.origin.x + self.btnEnd.frame.size.width - self.btnStart.frame.origin.x) * self.pixel_time;
    CMTime time_total = [self.kj_player.currentItem duration];
    if (length == 1.0 * time_total.value / time_total.timescale) {
        // The whole video is selected, so nothing needs trimming: hand back the original file
        if (self.kj_videoCapturedelegate && [self.kj_videoCapturedelegate respondsToSelector:@selector(kj_didCaptureCompleteForPath:)]) {
            AVURLAsset *urlAsset = (AVURLAsset *)self.kj_player.currentItem.asset;
            [self.kj_videoCapturedelegate kj_didCaptureCompleteForPath:urlAsset.URL.path];
        } else {
            [self onCancelButtonAction:nil];
        }
        return;
    }
    [KJUtility showProgressDialogText:@"Processing..."];
    if (length > self.kj_maxTime) {
        length = self.kj_maxTime;
    }
    CMTime videoLenth = CMTimeMakeWithSeconds(length, self.kj_player.currentItem.duration.timescale);
    CMTimeRange videoTimeRange = CMTimeRangeMake(startTime, videoLenth);
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.kj_player.currentItem.asset presetName:AVAssetExportPresetMediumQuality];
    exportSession.timeRange = videoTimeRange;
    NSString *path = [KJUtility kj_getKJAlbumFilePath];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setDateFormat:@"yyyyMMddHHmmss"];
    NSString *fileName = [NSString stringWithFormat:@"%@-%@", [formatter stringFromDate:[NSDate date]], @"kj_video.mp4"];
    path = [path stringByAppendingPathComponent:fileName];
    exportSession.outputURL = [NSURL fileURLWithPath:path];
    exportSession.outputFileType = AVFileTypeMPEG4;
    __block BOOL completeOK = NO;
    WS(weakSelf)
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exportSession.status) {
            case AVAssetExportSessionStatusUnknown:
                break;
            case AVAssetExportSessionStatusWaiting:
                break;
            case AVAssetExportSessionStatusExporting:
                break;
            case AVAssetExportSessionStatusCompleted:
                completeOK = YES;
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                break;
        }
        if (completeOK) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [KJUtility showAllTextDialog:weakSelf.view Text:@"Trimmed successfully"];
                if (weakSelf.kj_videoCapturedelegate && [weakSelf.kj_videoCapturedelegate respondsToSelector:@selector(kj_didCaptureCompleteForPath:)]) {
                    [KJUtility hideProgressDialog];
                    [weakSelf.kj_videoCapturedelegate kj_didCaptureCompleteForPath:path];
                } else {
                    // Save to the photo library
                    [KJUtility kj_saveVideoToLibraryForPath:path completeHandler:^(NSString *localIdentifier, BOOL isSuccess) {
                        if (isSuccess) {
                            NSFileManager *fileManger = [[NSFileManager alloc] init];
                            [fileManger removeItemAtPath:path error:nil];
                        }
                        dispatch_async(dispatch_get_main_queue(), ^{
                            [KJUtility hideProgressDialog];
                            [weakSelf onCancelButtonAction:nil];
                        });
                    }];
                }
            });
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                [KJUtility showAllTextDialog:weakSelf.view Text:@"Trimming failed"];
            });
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            [KJUtility hideProgressDialog];
        });
    }];
}
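AVAssetExportSession never calls back with progress, but it exposes a read-only progress property (0.0 to 1.0), so if you want a real progress indicator instead of the indeterminate dialog you can poll it. A minimal sketch, placed right after kicking off the export (this is not in the demo; it only logs the value):

// Optional: poll export progress on a GCD timer (sketch)
dispatch_source_t progressTimer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, dispatch_get_main_queue());
dispatch_source_set_timer(progressTimer, DISPATCH_TIME_NOW, (uint64_t)(0.2 * NSEC_PER_SEC), (uint64_t)(0.05 * NSEC_PER_SEC));
dispatch_source_set_event_handler(progressTimer, ^{
    NSLog(@"Export progress: %.0f%%", exportSession.progress * 100);
    // Stop polling once the session is no longer waiting or exporting
    if (exportSession.status != AVAssetExportSessionStatusExporting &&
        exportSession.status != AVAssetExportSessionStatusWaiting) {
        dispatch_source_cancel(progressTimer);
    }
});
dispatch_resume(progressTimer);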
Next up: applying filters to a local video with GPUImage. To see a filter in real time once it's picked, you need GPUImageMovie for the preview; to bake the filter into the file, you also need GPUImageMovieWriter. Because GPUImageMovie plays back without sound, a separate player is needed for the audio during preview (see the demo for how that's wired up). The code that composites a filter into a local video:
// Composite a filter into the video
- (void)filterCompositionForFilter:(GPUImageOutput<GPUImageInput> *)filter withVideoUrl:(NSURL *)videoUrl {
    if (videoUrl) {
        WS(weakSelf)
        GPUImageOutput<GPUImageInput> *tmpFilter = filter;
        kj_movieComposition = [[GPUImageMovie alloc] initWithURL:videoUrl];
        kj_movieComposition.runBenchmark = YES;
        kj_movieComposition.playAtActualSpeed = NO;
        [kj_movieComposition addTarget:tmpFilter];
        // Output path of the filtered video
        NSString *newPath = [KJUtility kj_getKJAlbumFilePath];
        newPath = [newPath stringByAppendingPathComponent:[KJUtility kj_getNewFileName]];
        unlink([newPath UTF8String]);
        NSLog(@"%f,%f", self.kj_player.currentItem.presentationSize.height, self.kj_player.currentItem.presentationSize.width);
        CGSize videoSize = self.kj_player.currentItem.presentationSize;
        NSURL *tmpUrl = [NSURL fileURLWithPath:newPath];
        [self.kj_newVideoPathArray addObject:tmpUrl];
        kj_movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:tmpUrl size:videoSize];
        kj_movieWriter.shouldPassthroughAudio = YES;
        // allowWriteAudio does not exist in stock GPUImage; it is only there after my modification
        kj_movieWriter.allowWriteAudio = YES;
        kj_movieComposition.audioEncodingTarget = kj_movieWriter;
        [tmpFilter addTarget:kj_movieWriter];
        [kj_movieComposition enableSynchronizedEncodingUsingMovieWriter:kj_movieWriter];
        [kj_movieWriter startRecording];
        [kj_movieComposition startProcessing];
        __weak GPUImageMovieWriter *weakmovieWriter = kj_movieWriter;
        [kj_movieWriter setCompletionBlock:^{
            NSLog(@"Filter applied successfully");
            [tmpFilter removeTarget:weakmovieWriter];
            [weakmovieWriter finishRecording];
            dispatch_async(dispatch_get_main_queue(), ^{
                if (weakSelf.kj_selectedMusic) {
                    // Composite the selected music
                    [weakSelf musicCompositionForMusicInfo:weakSelf.kj_selectedMusic withVideoPath:weakSelf.kj_newVideoPathArray.lastObject];
                } else {
                    [weakSelf saveVideoToLib];
                }
            });
        }];
        [kj_movieWriter setFailureBlock:^(NSError *error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                NSLog(@"Applying the filter failed: %@", error);
                if ([[NSFileManager defaultManager] fileExistsAtPath:newPath]) {
                    NSError *delError = nil;
                    [[NSFileManager defaultManager] removeItemAtPath:newPath error:&delError];
                    if (delError) {
                        NSLog(@"Failed to delete the sandbox file: %@", delError);
                    }
                }
                [weakSelf.kj_newVideoPathArray removeLastObject];
                [KJUtility hideProgressDialog];
            });
        }];
    }
}
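For the real-time preview mentioned above, the same kind of chain can point at a GPUImageView instead of a movie writer, with a plain AVPlayer alongside it to supply the missing sound. This is a sketch of the idea rather than the demo's exact code; kj_previewMovie, kj_previewView and kj_player are assumed properties:

// Preview a filter on a local video (sketch)
- (void)previewFilter:(GPUImageOutput<GPUImageInput> *)filter withVideoUrl:(NSURL *)videoUrl {
    // GPUImageMovie renders the filtered frames, but produces no audio
    self.kj_previewMovie = [[GPUImageMovie alloc] initWithURL:videoUrl];
    self.kj_previewMovie.playAtActualSpeed = YES;
    [self.kj_previewMovie addTarget:filter];
    [filter addTarget:self.kj_previewView]; // a GPUImageView placed in the UI
    // An AVPlayer without a visible layer supplies the sound
    self.kj_player = [AVPlayer playerWithURL:videoUrl];
    [self.kj_previewMovie startProcessing];
    [self.kj_player play];
}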
Now for adding background music to a local video. The comments make this one clear too, so straight to the code (it uses the same classes mentioned earlier: the video and audio are pulled out separately and then merged in an AVMutableComposition):
// Composite the music
- (void)musicCompositionForMusicInfo:(NSDictionary *)musicInfo withVideoPath:(NSURL *)videoUrl {
    if (musicInfo && videoUrl) {
        // Music file
        NSString *audioPath = [[NSBundle mainBundle] pathForResource:musicInfo[@"music"] ofType:@"mp3"];
        NSURL *audioUrl = [NSURL fileURLWithPath:audioPath];
        // Output path of the composited video
        NSString *newPath = [KJUtility kj_getKJAlbumFilePath];
        newPath = [newPath stringByAppendingPathComponent:[KJUtility kj_getNewFileName]];
        unlink([newPath UTF8String]);
        NSURL *newVideoPath = [NSURL fileURLWithPath:newPath];
        // Composition
        AVMutableComposition *kj_composition = [AVMutableComposition composition];
        // Audio track
        AVMutableCompositionTrack *kj_audioTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        // Video track
        AVMutableCompositionTrack *kj_videoTrack = [kj_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        NSDictionary *kj_options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
        // Video asset
        AVURLAsset *kj_videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:kj_options];
        // Time range of the video (the music must not run past it)
        CMTimeRange kj_videoTimeRange = CMTimeRangeMake(kCMTimeZero, kj_videoAsset.duration);
        // Video track of kj_videoAsset
        NSArray *videoArray = [kj_videoAsset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *kj_assetVideo = videoArray.firstObject;
        // Insert the video into the video track kj_videoTrack
        NSError *kj_videoError = nil;
        BOOL isComplete_video = [kj_videoTrack insertTimeRange:kj_videoTimeRange
                                                       ofTrack:kj_assetVideo
                                                        atTime:kCMTimeZero
                                                         error:&kj_videoError];
        NSLog(@"Inserted video isComplete_video:%d error:%@", isComplete_video, kj_videoError);
        // Audio asset
        AVURLAsset *kj_audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:kj_options];
        // Audio track of kj_audioAsset
        NSArray *audioArray = [kj_audioAsset tracksWithMediaType:AVMediaTypeAudio];
        AVAssetTrack *kj_assetAudio = audioArray.firstObject;
        // Time range of the audio
        CMTimeRange kj_audioTimeRange = CMTimeRangeMake(kCMTimeZero, kj_audioAsset.duration);
        if (CMTimeCompare(kj_audioAsset.duration, kj_videoAsset.duration) > 0) {
            // The music is longer than the video, so clamp it to the video's duration
            kj_audioTimeRange = CMTimeRangeMake(kCMTimeZero, kj_videoAsset.duration);
        }
        // Insert the audio into the audio track kj_audioTrack
        NSError *kj_audioError = nil;
        BOOL isComplete_audio = [kj_audioTrack insertTimeRange:kj_audioTimeRange
                                                       ofTrack:kj_assetAudio
                                                        atTime:kCMTimeZero
                                                         error:&kj_audioError];
        NSLog(@"Inserted audio isComplete_audio:%d error:%@", isComplete_audio, kj_audioError);
        // Exported at high quality because it gets saved to the photo library; adjust to your needs
        WS(weakSelf)
        [KJUtility kj_compressedVideoAsset:kj_composition withPresetName:AVAssetExportPresetHighestQuality withNewSavePath:newVideoPath withCompleteBlock:^(NSError *error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (error) {
                    NSLog(@"Export failed: %@", error);
                    [KJUtility hideProgressDialog];
                } else {
                    [weakSelf.kj_newVideoPathArray addObject:newVideoPath];
                    [weakSelf saveVideoToLib];
                }
            });
        }];
    }
}
Compression itself needs little explanation; there is plenty of material online, and iOS handles it out of the box in just a few lines:
/**
 Transcode/compress a video

 @param asset AVAsset
 @param presetName export quality (AVAssetExportPresetMediumQuality is a good choice for compression, AVAssetExportPreset1920x1080 for saving to the photo library; pick what fits your needs)
 @param savePath output path
 @param completeBlock completion callback
 */
+ (void)kj_compressedVideoAsset:(AVAsset *)asset
                 withPresetName:(NSString *)presetName
                withNewSavePath:(NSURL *)savePath
              withCompleteBlock:(void (^)(NSError *error))completeBlock {
    AVAssetExportSession *kj_export = [AVAssetExportSession exportSessionWithAsset:asset
                                                                        presetName:presetName];
    kj_export.outputURL = savePath;
    kj_export.outputFileType = AVFileTypeMPEG4;
    kj_export.shouldOptimizeForNetworkUse = YES;
    [kj_export exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (kj_export.status == AVAssetExportSessionStatusCompleted) {
                if (completeBlock) {
                    completeBlock(nil);
                }
            } else if (kj_export.status == AVAssetExportSessionStatusFailed) {
                if (completeBlock) {
                    completeBlock(kj_export.error);
                }
            } else {
                NSLog(@"Current export progress: %f", kj_export.progress);
            }
        });
    }];
}
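A minimal usage sketch of this helper; the source path and the bare NSLog in the completion block are made up for illustration:

// Compress a local recording before uploading (sketch; the source path is hypothetical)
NSURL *sourceURL = [NSURL fileURLWithPath:@"/path/to/source.mp4"];
AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
NSURL *outputURL = [NSURL fileURLWithPath:[[KJUtility kj_getKJAlbumFilePath] stringByAppendingPathComponent:[KJUtility kj_getNewFileName]]];
[KJUtility kj_compressedVideoAsset:sourceAsset
                    withPresetName:AVAssetExportPresetMediumQuality
                   withNewSavePath:outputURL
                 withCompleteBlock:^(NSError *error) {
    NSLog(@"Compressed to %@, error: %@", outputURL, error);
}];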
Everything above is about short-video handling. The demo also includes album browsing, but it doesn't observe library changes, so if you need live updates you'll have to add your own change listener. My UI certainly won't suit everyone either; this is meant as one possible approach, not a drop-in tool. The demo supports iOS 8 and above. The GitHub link for the whole demo: https://github.com/sundayios/KJAlbum-master. The demo is rough, but it is commented throughout, so please don't use it as-is. GPUImage is very powerful and there is a lot of it I haven't touched; it is well worth digging into. Note that the demo saves two copies of each video to the photo library, one compressed and one not, which I did for testing; if you don't need that, just remove the extra save call. My calling code:
- (IBAction)onVideoButtonAction:(UIButton *)sender {
    KJVideoAlbumController *ctrl = [[KJVideoAlbumController alloc] init];
    ctrl.kj_minTime = 2.0;
    ctrl.kj_maxTime = 15.0f;
    WS(weakSelf)
    ctrl.kj_complete = ^(NSURL *outPath) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [weakSelf editViewPath:outPath];
        });
    };
    UINavigationController *navc = [[UINavigationController alloc] initWithRootViewController:ctrl];
    navc.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
    [self presentViewController:navc animated:YES completion:nil];
}

- (void)editVideo:(NSString *)localIdentifier {
    WS(weakSelf)
    [KJUtility kj_getAssetForLocalIdentifier:localIdentifier completionHandler:^(PHAsset *kj_object) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [KJUtility kj_requestVideoForAsset:kj_object completion:^(AVURLAsset *asset) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    KJEditVideoViewController *ctrl = [[KJEditVideoViewController alloc] init];
                    ctrl.kj_localVideo = asset;
                    ctrl.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
                    UINavigationController *navc = [[UINavigationController alloc] initWithRootViewController:ctrl];
                    [weakSelf presentViewController:navc animated:YES completion:nil];
                });
            }];
        });
    }];
}

- (void)editViewPath:(NSURL *)path {
    KJEditVideoViewController *ctrl = [[KJEditVideoViewController alloc] init];
    ctrl.kj_localVideo = path;
    ctrl.kj_isSelectCover = YES;
    WS(weakSelf)
    /**
     * videoPath        the compressed video
     * localIdentifier  the high-quality video saved to the photo library (nil if no filter or music was added)
     * kj_cover         the cover image
     */
    ctrl.editCompleteBlock = ^(NSURL *videoPath, NSString *localidentifier, UIImage *kj_cover) {
        [weakSelf saveVideoToLibVideoUrl:videoPath];
    };
    ctrl.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
    UINavigationController *navc = [[UINavigationController alloc] initWithRootViewController:ctrl];
    [self presentViewController:navc animated:YES completion:nil];
}

// Save to the photo library
- (void)saveVideoToLibVideoUrl:(NSURL *)url {
    [KJUtility kj_saveVideoToLibraryForPath:url.path completeHandler:^(NSString *localIdentifier, BOOL isSuccess) {
        if (isSuccess) {
            NSLog(@"Saved to the photo library");
        } else {
            NSLog(@"Failed to save to the photo library");
        }
    }];
}
You'll have noticed that a lot of this code calls into KJUtility; the header below shows everything it provides:
//
//  KJUtility.h
//  KJAlbumDemo
//
//  Created by JOIN iOS on 2017/9/5.
//  Copyright © 2017 Kegem. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <Photos/Photos.h>
#import "UIKit+BaseExtension.h"
#import <YYWebImage.h>
#import <Masonry.h>
#import <YYAnimatedImageView.h>
#import <GPUImage/GPUImage.h>

@protocol KJCustomCameraDelegate <NSObject>
- (void)kj_didStartTakeAction;
- (void)kj_didResetTakeAction;
- (void)kj_didCompleteAction;
- (void)kj_didCancelAction;
@end

@protocol KJVideoFileDelegate <NSObject>
// Called instead of saving to the photo library; hands back the recorded file's path
- (void)kj_videoFileCompleteLocalPath:(NSString *)kj_outPath;
@end

// Status bar height
#define StatusBarHeight [[UIApplication sharedApplication] statusBarFrame].size.height
// Screen bounds
#define SCREEN_SIZE [UIScreen mainScreen].bounds
// Screen height
#define SCREEN_HEIGHT [UIScreen mainScreen].bounds.size.height
// Screen width
#define SCREEN_WIDTH [UIScreen mainScreen].bounds.size.width
// Weak self
#define WS(weakSelf) __weak typeof (self) weakSelf = self;
// Color
#define sYellowColor 0xffd700

@interface KJUtility : NSObject

+ (void)showAllTextDialog:(UIView *)view Text:(NSString *)text;
+ (void)showProgressDialogText:(NSString *)text;
+ (void)hideProgressDialog;

// Convert a number of seconds into an "xx minutes xx seconds" string
+ (NSString *)getMMSSFromSS:(NSInteger)seconds;

/**
 Directory where captured images/videos are stored

 @return file path
 */
+ (NSString *)kj_getKJAlbumFilePath;

/**
 Generate a video file name

 @return the new file name
 */
+ (NSString *)kj_getNewFileName;

/**
 Fetch an image for a PHAsset

 @param asset PHAsset
 @param isSynchronous YES for synchronous, NO for asynchronous
 @param completion returns the image
 */
+ (void)kj_requestImageForAsset:(PHAsset *)asset
                withSynchronous:(BOOL)isSynchronous
                     completion:(void (^)(UIImage *image))completion;

/**
 Fetch a video for a PHAsset

 @param kj_asset PHAsset
 @param completion AVURLAsset
 */
+ (void)kj_requestVideoForAsset:(PHAsset *)kj_asset
                     completion:(void (^)(AVURLAsset *asset))completion;

/**
 Grab a thumbnail from a video

 @param urlAsset the local video asset
 @param start start time
 @param timescale scale
 @return the video snapshot
 */
+ (UIImage *)kj_getScreenShotImageFromVideoPath:(AVURLAsset *)urlAsset
                                      withStart:(CGFloat)start
                                  withTimescale:(CGFloat)timescale;

/**
 Save an image to the system photo library

 @param image the image
 @param completionHandler returns the result
 */
+ (void)kj_savePhotoToLibraryForImage:(UIImage *)image
                      completeHandler:(void (^)(NSString *localIdentifier, BOOL isSuccess))completionHandler;

/**
 Save a video to the system photo library

 @param path video path
 @param completionHandler returns the result
 */
+ (void)kj_saveVideoToLibraryForPath:(NSString *)path
                     completeHandler:(void (^)(NSString *localIdentifier, BOOL isSuccess))completionHandler;

/**
 Get a PHAsset for a photo-library localIdentifier

 @param localIdentifier the library identifier
 @param completionHandler returns the PHAsset
 */
+ (void)kj_getAssetForLocalIdentifier:(NSString *)localIdentifier
                    completionHandler:(void (^)(PHAsset *kj_object))completionHandler;

/**
 Transcode/compress a video

 @param asset AVAsset
 @param presetName export quality (AVAssetExportPresetMediumQuality is a good choice for compressed uploads; pick what fits your needs)
 @param savePath output path
 @param completeBlock returns the status
 */
+ (void)kj_compressedVideoAsset:(AVAsset *)asset
                 withPresetName:(NSString *)presetName
                withNewSavePath:(NSURL *)savePath
              withCompleteBlock:(void (^)(NSError *error))completeBlock;

/**
 Photo library authorization

 @param ctrl the presenting view controller
 @param completeBlock returns whether access is allowed
 */
+ (void)kj_photoLibraryAuthorizationStatus:(UIViewController *)ctrl
                             completeBlock:(void (^)(BOOL allowAccess))completeBlock;

/**
 Camera authorization

 @param ctrl the presenting view controller
 @param completeBlock returns whether access is allowed
 */
+ (void)kj_cameraAuthorizationStatus:(UIViewController *)ctrl
                       completeBlock:(void (^)(BOOL allowAccess))completeBlock;

/**
 Microphone authorization

 @param ctrl the presenting view controller
 @param completeBlock returns whether access is allowed
 */
+ (void)kj_requestRecordPermission:(UIViewController *)ctrl
                     completeBlock:(void (^)(BOOL allowAccess))completeBlock;

/**
 Authorization alert (jumps to the app's permission page in Settings)

 @param ctrl the presenting view controller
 @param title the message to show
 */
+ (void)kj_authorizationAlert:(UIViewController *)ctrl
                   tipMessage:(NSString *)title;

/**
 Apply a filter to an image

 @param image the source image
 @param filterName the filter name
 @return the filtered image
 */
+ (UIImage *)kj_imageProcessedUsingGPUImage:(UIImage *)image
                             withFilterName:(NSString *)filterName;

@end
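As a quick usage example of two of these helpers (a sketch: someAsset is a hypothetical PHAsset from a picker, and the one-second offset and 600 timescale are arbitrary values for illustration):

// Fetch a video for a picked PHAsset and grab a thumbnail from it (sketch)
[KJUtility kj_requestVideoForAsset:someAsset completion:^(AVURLAsset *asset) {
    UIImage *thumb = [KJUtility kj_getScreenShotImageFromVideoPath:asset
                                                         withStart:1.0
                                                     withTimescale:600];
    NSLog(@"Thumbnail size: %@", NSStringFromCGSize(thumb.size));
}];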
The intention here is that the kjalbum folder in the sandbox gets deleted once you're done with the short-video features. That's enough rambling; just download the demo and take a look. A GIF of the result follows; it was too large, so it has been compressed and may look a bit blurry: