
Note: this content is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me). Original question on StackOverflow: http://stackoverflow.com/questions/12857597/

Date: 2020-09-15 01:47:56  Source: igfitidea

Video to image sequence with Handbrake CLI

Tags: xcode, cocoa, command-line, video-processing

Asked by Andre

I'm trying to create a Mac App for internal use, that grabs a movie file specified by the user, converts it to a specific format and saves every frame as an image on the hard drive. I've got the converting part done, using the awesome Handbrake CLI.

Now I'm trying to find a way to save each frame as an image, but I can't find a way of doing it.

It seems ffmpeg has a command to extract frames into images. The following pulls a single frame five seconds in:

ffmpeg -i "infile.mp4" -r 1 -ss 00:00:05 -t 00:00:01 -vframes 1 -f image2 -y "image.jpg"

However, I'd rather use QTKit or the Handbrake CLI, so I won't have to add both ffmpeg and Handbrake to the app.

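If ffmpeg does end up bundled anyway, the still-grabbing command can be assembled programmatically rather than pasted as a string. A minimal sketch in Python (the `ffmpeg_frame_args` helper and its exact argument order are illustrative, not from the original post):

```python
def ffmpeg_frame_args(infile, seconds, outfile):
    """Build the argument list for grabbing a single frame at `seconds`.

    Putting -ss before -i makes ffmpeg seek directly instead of decoding
    from the start, which is much faster on long files.
    """
    hours, rem = divmod(int(seconds), 3600)
    minutes, secs = divmod(rem, 60)
    timestamp = "%02d:%02d:%02d" % (hours, minutes, secs)
    return ["ffmpeg", "-ss", timestamp, "-i", infile,
            "-vframes", "1", "-f", "image2", "-y", outfile]

print(ffmpeg_frame_args("infile.mp4", 5, "image.jpg"))
```

The same list maps directly onto an NSTask's arguments array.
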
Any ideas would be much appreciated!

Accepted answer by Andre

So, I've found out for sure that you can't do it with Handbrake, after a discussion on their IRC channel.

However, here's a relatively painless way of doing it with ffmpeg:

NSNotificationCenter *defaultCenter = [NSNotificationCenter defaultCenter];
outputPipe = [NSPipe pipe];
taskOutput = [outputPipe fileHandleForReading];
current_task = [[NSTask alloc] init];

// ffmpeg is bundled in the app's Resources folder
NSString *resourcePath = [[NSBundle mainBundle] resourcePath];

// e.g. "movie_still3.png" -- strips the 4-character extension (".mov", ".mp4") from the dragged file
NSString *stillName = [NSString stringWithFormat:@"%@_still%d.png", [draggedFile substringToIndex:([draggedFile length]-4)], MY_STILL_NUMBER];

[current_task setLaunchPath:[NSString stringWithFormat:@"%@/ffmpeg", resourcePath]];

// Save just the frame at the current time.
// By leaving -ss before the -i, we enable direct indexing within ffmpeg, which saves
// a still in about a second, rather than a minute or so on a long video file.
NSArray *arguments = [NSArray arrayWithObjects:@"-ss", currentFrameTime, @"-i", draggedFile, @"-vframes", @"1", @"-f", @"image2", stillName, nil];

[current_task setArguments:arguments];
[current_task setStandardInput:[NSPipe pipe]];
[current_task setStandardOutput:outputPipe];
[current_task setStandardError:outputPipe];

[defaultCenter addObserver:self selector:@selector(saveTaskCompleted:) name:NSTaskDidTerminateNotification object:current_task];

[current_task launch];
[taskOutput readInBackgroundAndNotify];

So, I'm using one task for Handbrake (to convert the video) and then another task to save a still. I could use just ffmpeg, but I like Handbrake, so I'll leverage the two this way.

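At the shell level, the two-step pipeline looks roughly like this. The preset name and filenames are examples, not from the original answer, and the `echo`s are left in so the script only prints the commands rather than running the tools:

```shell
#!/bin/sh
# Example pipeline: HandBrakeCLI converts the movie, then ffmpeg grabs a still.
IN="dragged.mov"
CONVERTED="converted.mp4"
STILL_TIME="00:00:05"

# Step 1: convert the source movie (remove `echo` to actually run it).
echo HandBrakeCLI -i "$IN" -o "$CONVERTED" --preset "Normal"

# Step 2: grab one frame; -ss before -i enables fast seeking.
echo ffmpeg -ss "$STILL_TIME" -i "$CONVERTED" -vframes 1 -f image2 still.png
```
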
Hope this helps anyone out there.

Answered by Elftheitroados

Try this (replace 25 with your video's frame rate, in frames per second):

ffmpeg -i infile.mp4 -r 25 -f image2 /tmp/image-%04d.jpg

Then use an NSTask to run this command from your Xcode project.

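As a sanity check on the output pattern: ffmpeg's image2 muxer numbers frames from 1, so extracting a 2-second clip at 25 fps yields image-0001.jpg through image-0050.jpg. A quick sketch (hypothetical helper, assuming `%04d`-style zero padding):

```python
def output_names(pattern, fps, duration_seconds):
    """List the filenames the image2 muxer would write, numbered from 1."""
    count = fps * duration_seconds
    return [pattern % i for i in range(1, count + 1)]

names = output_names("/tmp/image-%04d.jpg", 25, 2)
print(names[0], names[-1], len(names))
```
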
Download FFmpeg 0.11.1 for Mac OS X

Download the FFmpeg git version for Mac OS X
