iOS: Retrieving Asset Properties from the Photo Library and the Sandbox
I'd say this is a fairly complete write-up on video handling, so let's get into it.
The exploration went step by step: how each problem surfaced, how it was handled, how big a detour it took, and how much was learned along the way. Chunked (multipart) upload will be covered in a later post.
Our company recently needed to add video recording and upload.
While feeling my way through the tech, I assumed that anything shot in the app would be saved to the photo library, so my first idea was to fetch the album assets and read the data out of them.
【Photo Library Assets】
One framework worth noting is AssetsLibrary.framework, which is dedicated to working with photo library data:
- ALAssetsLibrary: the photo library itself
- ALAssetsGroup: a group of assets (an album)
- ALAsset: an individual asset
The three form a containment hierarchy: the library contains groups, and each group contains assets. Now for the code.
First, fetch all of the ALAssetsGroup objects:
#import <AssetsLibrary/AssetsLibrary.h>

@property (nonatomic, strong) ALAssetsLibrary *assetsLibrary;
@property (nonatomic, strong) NSMutableArray *groups;

// Lazily create the shared library object.
- (ALAssetsLibrary *)assetsLibrary {
    if (_assetsLibrary == nil) {
        _assetsLibrary = [[ALAssetsLibrary alloc] init];
    }
    return _assetsLibrary;
}

// Lazily enumerate every group (album) in the library.
// The enumeration is asynchronous, so the table is reloaded as groups
// arrive rather than after the getter returns.
- (NSMutableArray *)groups {
    if (_groups == nil) {
        _groups = [NSMutableArray array];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupAll usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
                if (group) {
                    [_groups addObject:group];
                    [self.tableView reloadData];
                }
            } failureBlock:^(NSError *error) {
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Notice"
                                                                    message:@"Could not access the photo library"
                                                                   delegate:self
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }];
        });
    }
    return _groups;
}
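Before enumerating, it can also be worth checking whether the app is allowed to touch the photo library at all, since a denied permission is exactly what lands you in the failureBlock above. A minimal sketch using the AssetsLibrary authorization status (how you react to each case is up to you; this handling is only illustrative):

// Returns NO if photo library access is known to be blocked.
- (BOOL)canAccessAssetsLibrary {
    ALAuthorizationStatus status = [ALAssetsLibrary authorizationStatus];
    // Denied / Restricted means enumeration will fail.
    if (status == ALAuthorizationStatusDenied || status == ALAuthorizationStatusRestricted) {
        return NO;
    }
    // NotDetermined or Authorized: enumerating will trigger the system prompt if needed.
    return YES;
}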
Next, display the contents of each ALAssetsGroup. A custom GCMAssetModel class is used to hold each asset's info.
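The model itself isn't shown in the original post; the following is a minimal sketch of what it might look like, with the property names inferred purely from how it is used in setGroup: below:

// GCMAssetModel.h (sketch; properties inferred from usage)
#import <UIKit/UIKit.h>

@interface GCMAssetModel : NSObject
@property (nonatomic, strong) UIImage *thumbnail;  // square thumbnail from the ALAsset
@property (nonatomic, strong) NSURL   *imageURL;   // defaultRepresentation.url, used to re-fetch the asset later
@property (nonatomic, assign) BOOL     isImage;    // YES for photos, NO for videos
@end

@implementation GCMAssetModel
@end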
- (void)setGroup:(ALAssetsGroup *)group {
    _group = group;
    [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop) {
        if (asset == nil) return;
        GCMAssetModel *model = [[GCMAssetModel alloc] init];
        model.thumbnail = [UIImage imageWithCGImage:asset.thumbnail];
        model.imageURL  = asset.defaultRepresentation.url;
        if (![[asset valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypePhoto]) {
            // Not a photo, i.e. a video.
            model.isImage = NO;
        } else {
            NSLog(@"%@", asset.defaultRepresentation.url);
            model.isImage = YES;
        }
        [self.assetModels addObject:model];
    }];
}
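Since the end goal here is video, it's worth knowing that the group can also be filtered up front so the enumeration never even visits photos. A small sketch, assuming you only care about video assets:

// Restrict the group to videos before enumerating.
[group setAssetsFilter:[ALAssetsFilter allVideos]];
[group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop) {
    if (asset == nil) return;
    NSLog(@"video asset: %@", asset.defaultRepresentation.url);
}];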
Finally, use ALAssetsLibrary's assetForURL: with the asset's URL to get at the underlying resource:
- (void)collectionView:(UICollectionView *)collectionView didSelectItemAtIndexPath:(NSIndexPath *)indexPath {
    GCMAssetModel *model = self.assetModels[indexPath.item];
    if (model.isImage == NO) {
        // Video: play it directly from its asset URL.
        MPMoviePlayerViewController *playerView = [[MPMoviePlayerViewController alloc] initWithContentURL:model.imageURL];
        [self presentViewController:playerView animated:YES completion:nil];
    } else {
        // Photo: show a blurred cover and a full-screen button that holds the image.
        cover = [[UIView alloc] initWithFrame:CGRectMake(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT)];
        UIBlurEffect *blur = [UIBlurEffect effectWithStyle:UIBlurEffectStyleLight];
        UIVisualEffectView *effectview = [[UIVisualEffectView alloc] initWithEffect:blur];
        effectview.frame = cover.frame;
        [cover addSubview:effectview];
        [self.view addSubview:cover];
        bigImg = [[UIButton alloc] initWithFrame:CGRectMake(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT)];
        [bigImg addTarget:self action:@selector(removeBtn) forControlEvents:UIControlEventTouchUpInside];
        [self.view addSubview:bigImg];
        // Re-fetch the full-resolution image from the library via the asset URL.
        ALAssetsLibrary *lib = [[ALAssetsLibrary alloc] init];
        [lib assetForURL:model.imageURL resultBlock:^(ALAsset *asset) {
            ALAssetRepresentation *assetRep = [asset defaultRepresentation];
            CGImageRef imgRef = [assetRep fullResolutionImage];
            UIImage *img = [UIImage imageWithCGImage:imgRef
                                               scale:assetRep.scale
                                         orientation:(UIImageOrientation)assetRep.orientation];
            [bigImg setImage:img forState:UIControlStateNormal];
        } failureBlock:^(NSError *error) {
            NSLog(@"Failed to access the photo in the library");
        }];
    }
}
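Since everything here is ultimately headed for an upload, note that ALAssetRepresentation can also hand you the raw bytes of an asset without writing an intermediate file. A hedged sketch (reading everything into memory at once; for large videos you would read in chunks, which is the subject of the later post on chunked upload):

// Read the full byte contents of an asset (photo or video) into memory.
- (NSData *)dataForAsset:(ALAsset *)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    long long size = [rep size];  // total byte size of the representation
    NSMutableData *data = [NSMutableData dataWithLength:(NSUInteger)size];
    NSError *error = nil;
    NSUInteger read = [rep getBytes:(uint8_t *)data.mutableBytes fromOffset:0 length:(NSUInteger)size error:&error];
    if (error || read == 0) {
        NSLog(@"failed to read asset bytes: %@", error);
        return nil;
    }
    return data;
}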
【Sandbox Resources】
Next, the boss asked that video shot through the app be saved locally, that is, in the app's sandbox. I was stumped at first and went looking for a function that would let me specify the storage location, but found none, so I started poking at other approaches. Every time I recorded a video with my app, I noticed a temporary .mov file appeared in the tmp directory. After downloading the app container and playing that file, it turned out to be exactly what we wanted.
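You can confirm that behavior in code by simply listing the tmp directory and looking for .mov files. A minimal sketch, assuming the capture component really does drop its temporary files there as observed above:

// List any .mov files left behind in tmp/.
NSString *tmpDir = NSTemporaryDirectory();
NSArray *contents = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:tmpDir error:nil];
for (NSString *name in contents) {
    if ([[name.pathExtension lowercaseString] isEqualToString:@"mov"]) {
        NSLog(@"temporary video: %@", [tmpDir stringByAppendingPathComponent:name]);
    }
}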
(Screenshot in the original post: 视频临时存储位置.png, the temporary storage location of the video.) Knowing this mechanism, the next question was whether the video's internal info could be read from that URL, which led to the following method:
// Requires AVFoundation.framework.
#import <AVFoundation/AVFoundation.h>

- (void)getMovInfoWithAVURLAsset
{
    // Here the sample .mov ships in the app bundle; the URL could just as well
    // point at the temporary file in tmp/.
    NSURL *fileUrl = [[NSBundle mainBundle] URLForResource:@"1481872850_wm" withExtension:@"MOV"];
    AVURLAsset *movAsset = [AVURLAsset URLAssetWithURL:fileUrl options:nil];

    // Walk every metadata format in the file and dump its items.
    for (NSString *format in [movAsset availableMetadataFormats]) {
        NSLog(@"formatString: %@", format);
        for (AVMetadataItem *metadataitem in [movAsset metadataForFormat:format]) {
            NSLog(@"commonKey = %@ value = %@", metadataitem.commonKey, metadataitem.value);
            if ([metadataitem.commonKey isEqualToString:@"make"]) {
                NSString *make = (NSString *)metadataitem.value;
                NSLog(@"make: %@", make);
            } else if ([metadataitem.commonKey isEqualToString:@"software"]) {
                NSString *software = (NSString *)metadataitem.value;
                NSLog(@"software: %@", software);
            } else if ([metadataitem.commonKey isEqualToString:@"model"]) {
                NSString *model = (NSString *)metadataitem.value;
                NSLog(@"model: %@", model);
            } else if ([metadataitem.commonKey isEqualToString:@"creationDate"]) {
                NSString *creationDate = (NSString *)metadataitem.value;
                NSLog(@"creationDate: %@", creationDate);
            }
        }
    }

    // Duration of the video in seconds.
    CMTime durationTime = movAsset.duration;
    CGFloat duration = CMTimeGetSeconds(durationTime);
    NSLog(@"total duration: %f", duration);

    // Grab a thumbnail frame from the start of the video.
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:movAsset];
    generator.appliesPreferredTrackTransform = YES;
    CMTime time = CMTimeMakeWithSeconds(0, 30);
    NSValue *timeValue = [NSValue valueWithCMTime:time];
    [generator generateCGImagesAsynchronouslyForTimes:@[timeValue] completionHandler:^
     (CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
         if (result == AVAssetImageGeneratorSucceeded) {
             // The handler runs on a background queue, so hop to the main
             // queue before touching the UI.
             UIImage *frame = [UIImage imageWithCGImage:image];
             dispatch_async(dispatch_get_main_queue(), ^{
                 self.vedioImage.image = frame;
             });
         } else {
             // failed
         }
     }];
}
AVURLAsset requires AVFoundation.framework. With that, we have all the data we need.
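And since the original requirement was to keep the recording in the app's own sandbox, one option is simply to copy the temporary .mov out of tmp/ into somewhere persistent such as Documents/. A small sketch; the file name here is just a placeholder:

// Copy the captured video from tmp/ into Documents/ so it is not purged.
NSString *srcPath  = [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"];  // placeholder name
NSString *docsDir  = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *destPath = [docsDir stringByAppendingPathComponent:@"capture.mov"];
NSError *error = nil;
if ([[NSFileManager defaultManager] copyItemAtPath:srcPath toPath:destPath error:&error]) {
    NSLog(@"saved video to: %@", destPath);
} else {
    NSLog(@"save failed: %@", error);
}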
A few recommended articles:
http://blog.csdn.net/b719426297/article/details/24312339
http://blog.csdn.net/newjerryj/article/details/7637047
http://blog.csdn.net/u011397277/article/details/52574996
http://www.jianshu.com/p/931f8e85dc37