
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must follow the same CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/4199879/

Date: 2020-08-30 18:06:11 · Source: igfitidea

iPhone Read UIimage (frames) from video with AVFoundation

Tags: ios, ios4, avfoundation

Asked by matiasfh

Sorry for my English. Looking for information about reading frames from a video on the iPhone, I found this project, http://www.codza.com/extracting-frames-from-movies-on-iphone/comment-page-1#comment-1116, but I also read somewhere that you can use AVFoundation to capture frames from a video for better performance.


But I can't find information on how to do that...


Any ideas?


Thanks for reading


Answered by h4xxr

You're talking about using the calls for generating what Apple calls thumbnail images from videos at specific times.


For an MPMoviePlayerController (what iOS uses to hold a video from a file or other source), there are two commands to do this. The first one generates a single thumbnail (image) from a movie at a specific point in time, and the second one generates a set of thumbnails for a time range.


This example gets an image at 10 seconds into a movie clip, myMovie.mp4:


NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"myMovie" withExtension:@"mp4"];
MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
        initWithContentURL:movieURL];
UIImage *singleFrameImage = [movie thumbnailImageAtTime:10.0
        timeOption:MPMovieTimeOptionExact];

Note that this performs synchronously - i.e. the user will be forced to wait while you get the screenshot.


The other option is to get a series of images from a movie at an array of times:


NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"myMovie" withExtension:@"mp4"];
MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
        initWithContentURL:movieURL];
NSNumber *time1 = @10;
NSNumber *time2 = @11;
NSNumber *time3 = @12;
NSArray *times = @[time1, time2, time3];
[movie requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionExact];

This second way will trigger a notification of type MPMoviePlayerThumbnailImageRequestDidFinishNotification each time a new image is generated. You can set up an observer to monitor this and process the image - I'll leave you to work that bit out on your own!

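As a rough sketch (not part of the original answer), an observer for that notification could look like the following - the userInfo key MPMoviePlayerThumbnailImageKey carries the generated UIImage:

```objc
// Sketch: register for the thumbnail-finished notification on `movie`
// (the MPMoviePlayerController created above).
[[NSNotificationCenter defaultCenter]
    addObserverForName:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                object:movie
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                UIImage *thumbnail = note.userInfo[MPMoviePlayerThumbnailImageKey];
                if (thumbnail) {
                    // Process the image here, e.g. store it or show it in an image view.
                }
            }];
```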


Answered by Avt

Swift 2 code to take frames with AVAssetImageGenerator:


func previewImageForLocalVideo(url:NSURL) -> UIImage?
{
    let asset = AVAsset(URL: url)
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true

    var time = asset.duration
    // If possible, skip the very first frame (it can be completely black or white in camera videos)
    time.value = min(time.value, 2)

    do {
        let imageRef = try imageGenerator.copyCGImageAtTime(time, actualTime: nil)
        return UIImage(CGImage: imageRef)
    }
    catch let error as NSError
    {
        print("Image generation failed with error \(error)")
        return nil
    }
}
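A possible usage sketch (not from the original answer), assuming a file named myMovie.mp4 is bundled with the app and imageView is an existing UIImageView:

```swift
// Swift 2 usage sketch for the function above (file and view names are assumptions).
if let path = NSBundle.mainBundle().pathForResource("myMovie", ofType: "mp4") {
    let previewImage = previewImageForLocalVideo(NSURL(fileURLWithPath: path))
    imageView.image = previewImage
}
```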

Answered by Hardik Thakkar

Here is code to extract frames from a video at a given FPS (frames per second).


1) Import


#import <Photos/Photos.h>

2) In viewDidLoad


    videoUrl = [NSURL fileURLWithPath:[[NSBundle mainBundle]pathForResource:@"VfE_html5" ofType:@"mp4"]];
    [self createImage:5]; // 5 is the frame rate (FPS); change it as per your requirement.

3) Functions


-(void)createImage:(int)withFPS {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.requestedTimeToleranceAfter =  kCMTimeZero;
    generator.requestedTimeToleranceBefore =  kCMTimeZero;

    for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) *  withFPS ; i++){
        @autoreleasepool {
            CMTime time = CMTimeMake(i, withFPS);
            NSError *err;
            CMTime actualTime;
            CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
            [self savePhotoToAlbum:generatedImage]; // Saves the image to the photo library
            CGImageRelease(image);
        }
    }
}

-(void)savePhotoToAlbum:(UIImage*)imageToSave {

    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:imageToSave];
    } completionHandler:^(BOOL success, NSError *error) {
        if (success) {
            NSLog(@"success.");
        }
        else {
            NSLog(@"fail.");
        }
    }];
}

Answered by Mike Lee

In Swift 4 this worked for me with some modifications, mainly changing the "at" parameter of imageGenerator.copyCGImage to a CMTime type:


func showFrame(from file:String) {
    let file = file.components(separatedBy: ".")
    guard let path = Bundle.main.path(forResource: file[0], ofType:file[1]) else {
        debugPrint( "\(file.joined(separator: ".")) not found")
        return
    }
    let url = URL(fileURLWithPath: path)
    let image = previewImageForLocalVideo(url: url)
    let imgView = UIImageView(image: image)
    view.addSubview(imgView)
}    

func previewImageForLocalVideo(url:URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let tVal = CMTime(value: 12, timescale: 1)
    do {
        let imageRef = try imageGenerator.copyCGImage(at: tVal, actualTime: nil)
        return UIImage(cgImage: imageRef)
    }
    catch let error as NSError
    {
        print("Image generation failed with error \(error)")
        return nil
    }
}

override func viewDidLoad() {
    super.viewDidLoad()
    showFrame(from:"video.mp4")
}
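Since copyCGImage blocks the calling thread, AVAssetImageGenerator also offers an asynchronous API. A hedged Swift sketch (imageView and url are assumed to exist; this is not part of the original answer):

```swift
// Asynchronous alternative: generateCGImagesAsynchronously delivers frames
// on a background queue, so hop back to the main queue for UI work.
let generator = AVAssetImageGenerator(asset: AVAsset(url: url))
generator.appliesPreferredTrackTransform = true
let times = [NSValue(time: CMTime(value: 12, timescale: 1))]
generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
    if result == .succeeded, let cgImage = cgImage {
        DispatchQueue.main.async {
            imageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```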

Source
