Disclaimer: This page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/31700091/
Extract/Record Audio from HLS stream (video) while playing iOS
Asked by Sajad Khan
I am playing HLS streams using AVPlayer, and I also need to record these streams when the user presses a record button. The approach I am using is to record the audio and video separately and then merge the files at the end to make the final video. This works with remote mp4 files.
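For reference, the "merge at the end" step mentioned above can be done with an AVMutableComposition and an AVAssetExportSession. A minimal sketch, assuming the recorded audio and video already exist as local files (the method name and file URLs are placeholders, not the original code):

// Hypothetical merge step: combine a recorded audio file and a recorded video
// file into one mp4 using AVFoundation. Error handling is omitted for brevity.
- (void)mergeAudioFile:(NSURL *)audioURL withVideoFile:(NSURL *)videoURL toOutput:(NSURL *)outputURL
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVURLAsset *videoAsset = [AVURLAsset assetWithURL:videoURL];
    AVURLAsset *audioAsset = [AVURLAsset assetWithURL:audioURL];

    // Add the video track from the recorded video file.
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject
                         atTime:kCMTimeZero
                          error:nil];

    // Add the audio track from the recorded audio file.
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                        ofTrack:[audioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject
                         atTime:kCMTimeZero
                          error:nil];

    // Export the combined composition as an mp4 file.
    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeMPEG4;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Merge finished with status: %ld", (long)exporter.status);
    }];
}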
But now, for HLS (.m3u8) files, I am able to record the video using AVAssetWriter, but I am having problems with the audio recording.
I am using an MTAudioProcessingTap to process the raw audio data and write it to a file. I followed this article. I am able to record remote mp4 audio, but it does not work with HLS streams. Initially I wasn't able to extract the audio track from the stream using AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
But I was able to extract the audio track using KVO and use it to initialize the MTAudioProcessingTap.
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context{
    AVPlayer *player = (AVPlayer *)object;
    if (player.status == AVPlayerStatusReadyToPlay)
    {
        NSLog(@"Ready to play");
        self.previousAudioTrackID = 0;
        __weak typeof (self) weakself = self;
        timeObserverForTrack = [player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1, 100) queue:nil usingBlock:^(CMTime time)
        {
            @try {
                for (AVPlayerItemTrack *track in [weakself.avPlayer.currentItem tracks]) {
                    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio])
                        weakself.currentAudioPlayerItemTrack = track;
                }

                AVAssetTrack *audioAssetTrack = weakself.currentAudioPlayerItemTrack.assetTrack;
                weakself.currentAudioTrackID = audioAssetTrack.trackID;

                if (weakself.previousAudioTrackID != weakself.currentAudioTrackID) {
                    NSLog(@":::::::::::::::::::::::::: Audio track changed : %d", weakself.currentAudioTrackID);
                    weakself.previousAudioTrackID = weakself.currentAudioTrackID;
                    weakself.audioTrack = audioAssetTrack;
                    /// Use this audio track to initialize MTAudioProcessingTap
                }
            }
            @catch (NSException *exception) {
                NSLog(@"Exception Trap ::::: Audio tracks not found!");
            }
        }];
    }
}
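For context, this observer is assumed to be registered on the player's status key path, roughly like the following (the registration code is not shown in the question, so the key path and options here are an assumption):

// Hypothetical registration of the observer above (not in the original post).
// Assumes self.avPlayer is the AVPlayer created for the HLS URL.
[self.avPlayer addObserver:self
                forKeyPath:@"status"
                   options:NSKeyValueObservingOptionNew
                   context:nil];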
I am also keeping track of the trackID to check whether the track has changed.
This is how I initialize the MTAudioProcessingTap.
-(void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(
        kCFAllocatorDefault,
        &callbacks,
        kMTAudioProcessingTapCreationFlag_PostEffects,
        &tap
    );
    if (err) {
        NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
        NSError *error = [NSError errorWithDomain:NSOSStatusErrorDomain
                                             code:err
                                         userInfo:nil];
        NSLog(@"Error: %@", [error description]);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
                                                     audioMixInputParametersWithTrack:audioTrack];
    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    _audioPlayer.currentItem.audioMix = audioMix;
}
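The callbacks referenced above (init, prepare, process, unprepare, finalize) are not shown in the question. A minimal sketch of what they typically look like, following the usual MTAudioProcessingTap pattern (the file-writing part is left as a placeholder and is an assumption, not the original code):

// Requires: #import <MediaToolbox/MediaToolbox.h>
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    // Pass the client object through so the other callbacks can reach it.
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap) {}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat)
{
    // A good place to set up the output audio file using the processing format.
}

void unprepare(MTAudioProcessingTapRef tap) {}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags,
             AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    // Pull the source audio; after this call bufferListInOut holds the PCM samples.
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                         flagsOut, NULL, numberFramesOut);
    if (status != noErr) {
        return;
    }
    // ... write bufferListInOut to the audio file here (e.g. with ExtAudioFileWriteAsync) ...
}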
But now, with this audio track, the MTAudioProcessingTap callbacks "prepare" and "process" are never called.
Is the problem with the audioTrack I am getting through KVO?
I would really appreciate it if someone could help me with this, or tell me whether I am using the right approach to record HLS streams.
Accepted answer by Sajad Khan
I found a solution for this and am using it in my app. I wanted to post it earlier but didn't get the time.
To work with HLS you should have some knowledge of what these streams actually are. For that, please see the documentation on the Apple website: HLS Apple
Here are the steps I am following.
1. First get the m3u8 and parse it. You can parse it using this helpful kit, M3U8Kit. Using this kit you can get an M3U8MediaPlaylist, or an M3U8MasterPlaylist if it is a master playlist; if you get the master playlist you can parse it again to get an M3U8MediaPlaylist.
- (void)parseM3u8
{
    NSString *plainString = [self.url m3u8PlanString];
    BOOL isMasterPlaylist = [plainString isMasterPlaylist];

    NSError *error;
    NSURL *baseURL;
    if (isMasterPlaylist)
    {
        M3U8MasterPlaylist *masterList = [[M3U8MasterPlaylist alloc] initWithContentOfURL:self.url error:&error];
        self.masterPlaylist = masterList;

        M3U8ExtXStreamInfList *xStreamInfList = masterList.xStreamList;
        M3U8ExtXStreamInf *StreamInfo = [xStreamInfList extXStreamInfAtIndex:0];

        NSString *URI = StreamInfo.URI;
        NSRange range = [URI rangeOfString:@"dailymotion.com"];
        NSString *baseURLString = [URI substringToIndex:(range.location + range.length)];
        baseURL = [NSURL URLWithString:baseURLString];

        plainString = [[NSURL URLWithString:URI] m3u8PlanString];
    }

    M3U8MediaPlaylist *mediaPlaylist = [[M3U8MediaPlaylist alloc] initWithContent:plainString baseURL:baseURL];
    self.mediaPlaylist = mediaPlaylist;

    M3U8SegmentInfoList *segmentInfoList = mediaPlaylist.segmentList;
    NSMutableArray *segmentUrls = [[NSMutableArray alloc] init];
    for (int i = 0; i < segmentInfoList.count; i++)
    {
        M3U8SegmentInfo *segmentInfo = [segmentInfoList segmentInfoAtIndex:i];
        NSString *segmentURI = segmentInfo.URI;
        NSURL *mediaURL = [baseURL URLByAppendingPathComponent:segmentURI];
        [segmentUrls addObject:mediaURL];

        if (!self.segmentDuration)
            self.segmentDuration = segmentInfo.duration;
    }
    self.segmentFilesURLs = segmentUrls;
}
You can see that you get the links to the .ts files by parsing the m3u8.
- Now download all the .ts files into a local folder (see the download sketch below).
- Merge these .ts files into one mp4 file and export it. You can do that using this wonderful C library, TS2MP4.
Then you can delete the .ts files, or keep them if you need them.
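A minimal sketch of the download step, assuming the segment URLs produced by parseM3u8 above are in self.segmentFilesURLs (the destination folder, file naming, and error handling are simplified assumptions):

// Hypothetical helper: download every parsed .ts segment into a local folder.
- (void)downloadSegmentsToFolder:(NSURL *)segmentsFolderURL
{
    NSURLSession *session = [NSURLSession sharedSession];
    for (NSUInteger i = 0; i < self.segmentFilesURLs.count; i++) {
        NSURL *remoteURL = self.segmentFilesURLs[i];
        NSURL *localURL = [segmentsFolderURL URLByAppendingPathComponent:
                              [NSString stringWithFormat:@"segment_%lu.ts", (unsigned long)i]];
        NSURLSessionDownloadTask *task =
            [session downloadTaskWithURL:remoteURL
                       completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
                if (error) {
                    NSLog(@"Segment download failed: %@", error);
                    return;
                }
                // Move the temporary download to its final place in the segments folder.
                [[NSFileManager defaultManager] moveItemAtURL:location toURL:localURL error:nil];
            }];
        [task resume];
    }
}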
Answered by Aamir
This is not a good approach. What you can do is parse the M3U8 link, then try to download the segment files (.ts). If you can get these files, you can merge them to generate an mp4 file.
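One simple way to combine the downloaded segments, assuming they all belong to the same stream, is to append the .ts files byte-wise into a single transport-stream file before converting it to mp4 (this helper and its names are an illustration, not part of the answer):

// Hypothetical helper: concatenate downloaded .ts segments into one .ts file.
- (void)concatenateSegments:(NSArray<NSURL *> *)segmentURLs intoFile:(NSURL *)combinedURL
{
    [[NSFileManager defaultManager] createFileAtPath:combinedURL.path contents:nil attributes:nil];
    NSFileHandle *handle = [NSFileHandle fileHandleForWritingToURL:combinedURL error:nil];
    for (NSURL *segmentURL in segmentURLs) {
        NSData *segmentData = [NSData dataWithContentsOfURL:segmentURL];
        if (segmentData) {
            [handle writeData:segmentData];  // append this segment's bytes
        }
    }
    [handle closeFile];
}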