iOS extracting video frames as images
Disclaimer: this page is a translation of a popular StackOverflow question (originally presented with a Chinese parallel text), provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/8291727/
Asked by MB.
I'm using UIImagePicker to allow the user to create a video and then trim it. I need to split that video into multiple frames and let the user choose one of them.
To show the frames I probably have to convert them to UIImage. How can I do this? I have to use AVFoundation, but I couldn't find a tutorial on how to get and convert the frames.
Should I do the image capture with AVFoundation too? If so, do I have to implement trimming myself?
Accepted answer by Robin
I think the answer to this question is what you are looking for:
iPhone Read UIImage (frames) from video with AVFoundation.
There are two methods specified by the accepted answer there. You can use either one according to your requirements.
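For reference, one of those approaches uses AVAssetImageGenerator's batch API, which generates images asynchronously instead of blocking on each frame. A minimal sketch, assuming a local `videoUrl` and one thumbnail per second of video (the method name `generateThumbnailsForURL:` is illustrative, not from the original answer):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Sketch: request thumbnails for several timestamps in a single batch.
- (void)generateThumbnailsForURL:(NSURL *)videoUrl {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    // Keep frames upright for videos recorded with a rotation transform.
    generator.appliesPreferredTrackTransform = YES;

    // One request per second of video.
    NSMutableArray<NSValue *> *times = [NSMutableArray array];
    Float64 duration = CMTimeGetSeconds(asset.duration);
    for (Float64 t = 0; t < duration; t += 1.0) {
        [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(t, 600)]];
    }

    [generator generateCGImagesAsynchronouslyForTimes:times
                                    completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                        CMTime actualTime,
                                                        AVAssetImageGeneratorResult result,
                                                        NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
            UIImage *frame = [UIImage imageWithCGImage:image];
            // Hand `frame` to the UI on the main queue.
        }
    }];
}
```

The completion handler is called once per requested time, on a background queue, so any UI work with the resulting images must be dispatched back to the main queue.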
Answered by Hardik Thakkar
Here is code to extract images from a video at a given FPS:
1) Imports
#import <AVFoundation/AVFoundation.h>
#import <Photos/Photos.h>
2) In viewDidLoad
videoUrl = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"VfE_html5" ofType:@"mp4"]];
[self createImage:5]; // 5 frames per second (FPS); change as per your requirement.
3) Functions
- (void)createImage:(int)withFPS {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    // Request exact frame times instead of the nearest keyframe.
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    generator.requestedTimeToleranceBefore = kCMTimeZero;

    Float64 totalFrames = CMTimeGetSeconds(asset.duration) * withFPS;
    for (int64_t i = 0; i < totalFrames; i++) {
        @autoreleasepool {
            CMTime time = CMTimeMake(i, withFPS); // frame i, at i/withFPS seconds
            NSError *err = nil;
            CMTime actualTime;
            CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
            if (image == NULL) {
                NSLog(@"Failed to generate image: %@", err);
                continue;
            }
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
            [self savePhotoToAlbum:generatedImage]; // Saves the image to the photo library
            CGImageRelease(image);
        }
    }
}
- (void)savePhotoToAlbum:(UIImage *)imageToSave {
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [PHAssetChangeRequest creationRequestForAssetFromImage:imageToSave];
    } completionHandler:^(BOOL success, NSError *error) {
        if (success) {
            NSLog(@"Success.");
        } else {
            NSLog(@"Failed: %@", error);
        }
    }];
}
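Note that writing to the photo library requires user authorization; without it, the performChanges: block fails. A minimal sketch of the check you might run before saving, using the Photos framework's authorization API (the wrapper method name `saveFrameWhenAuthorized:` is illustrative, not from the original answer):

```objc
#import <Photos/Photos.h>

// Ask for photo-library access before attempting to save a frame.
- (void)saveFrameWhenAuthorized:(UIImage *)frame {
    [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
        if (status == PHAuthorizationStatusAuthorized) {
            [self savePhotoToAlbum:frame];
        } else {
            NSLog(@"Photo library access denied.");
        }
    }];
}
```

The app's Info.plist must also declare the appropriate photo-library usage description, or the request will terminate the app.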
Answered by CrimeZone
You can also use the VideoBufferReader library (see GitHub), which is based on AVFoundation.
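If you would rather not add a dependency, the same idea (sequentially decoding every frame rather than seeking to individual times) can be sketched directly with AVFoundation's AVAssetReader. This is an assumption-laden sketch, not the library's actual implementation; it assumes a local asset with at least one video track and requests BGRA output:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Sketch: decode every frame of the first video track with AVAssetReader.
- (void)readFramesFromURL:(NSURL *)videoUrl {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (track == nil) { return; }

    NSError *error = nil;
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    if (reader == nil) { NSLog(@"Reader error: %@", error); return; }

    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    AVAssetReaderTrackOutput *output =
        [[AVAssetReaderTrackOutput alloc] initWithTrack:track outputSettings:settings];
    [reader addOutput:output];
    [reader startReading];

    CMSampleBufferRef sample;
    while ((sample = [output copyNextSampleBuffer]) != NULL) {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sample);
        // Convert the decoded pixel buffer to a UIImage via Core Image.
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        UIImage *frame = [UIImage imageWithCIImage:ciImage];
        // Use `frame` here, then release the sample buffer.
        CFRelease(sample);
    }
}
```

Sequential reading is much faster than per-frame seeking when you need every frame, but it decodes the whole track in order, so it suits batch extraction rather than random access.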