
Notice: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/10071029/


Merge Two Video files in iPhone Application

Tags: iphone, objective-c, ios, xcode

Asked by Gaurav Thummar

In one of my applications I need to add some images into a video. So I split the video into two parts and also created a video from those images. Now I want to combine these three video files into one. I haven't found a way to do this; I've seen some code here, but it wasn't helpful. To split the video and to create a video from images I used the code below; now I need code to merge all of these videos.


Any other ideas for inserting the current view's screen into the middle of the video file are also welcome.


To split the video file:


NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Affagogato" ofType:@"mp4"]];
 AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];




for(int i = 0; i < 2; i++) {
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                           initWithAsset:anAsset presetName:AVAssetExportPresetLowQuality];
    NSString *filePath = nil;
    NSUInteger count = 0;
    do {
        filePath = NSTemporaryDirectory();

        NSString *numberString = count > 0 ? [NSString stringWithFormat:@"-%lu", (unsigned long)count] : @"";
        filePath = [filePath stringByAppendingPathComponent:[NSString stringWithFormat:@"Output%@.mov", numberString]];
        count++;
    } while([[NSFileManager defaultManager] fileExistsAtPath:filePath]);      

    exportSession.outputURL = [NSURL fileURLWithPath:filePath];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    CMTimeRange range;
    if(i == 0){
        CMTime start = CMTimeMakeWithSeconds(0.0, 600);
        CMTime duration = CMTimeMakeWithSeconds(10.0, 600);
        range = CMTimeRangeMake(start, duration);
    }else{
        // Second part: from the 10-second mark to the end of the asset.
        // The range's duration must be the remaining time, not the full asset duration.
        CMTime start = CMTimeMakeWithSeconds(10.0, 600);
        range = CMTimeRangeMake(start, CMTimeSubtract(anAsset.duration, start));
    }
    exportSession.timeRange = range;   

    [exportSession exportAsynchronouslyWithCompletionHandler:^
     {
         dispatch_async(dispatch_get_main_queue(), ^{
             [self exportDidFinish:exportSession Tag:i];
         });
     }]; 
}

To create a video from images:


CGRect rect=CGRectMake(0, 0, 320, 480);
view = [[UIView alloc]initWithFrame:rect];

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
NSString *path = [documentsDirectory stringByAppendingPathComponent:[@"video2"  stringByAppendingString:@".mov"]];

CGSize size = self.view.frame.size;


NSMutableDictionary *attributes = [[NSMutableDictionary alloc]init];
[attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:320] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:480] forKey:(NSString*)kCVPixelBufferHeightKey];


NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                              [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];


NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                               [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                               nil];

AVAssetWriterInput* writerInput = [[AVAssetWriterInput
                                    assetWriterInputWithMediaType:AVMediaTypeVideo
                                    outputSettings:videoSettings] retain];


// Pass the pixel-buffer attributes so the adaptor's pool matches the source format.
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                 sourcePixelBufferAttributes:attributes];


NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];


//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

CVPixelBufferRef buffer = NULL;

//convert uiimage to CGImage.

xPixel=0;
yPixel=250;

// Append the first frame at time zero, then release the buffer.
buffer = [self pixelBufferFromCGImage:[[UIImage imageNamed:@"1.jpeg"] CGImage]];
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
CVBufferRelease(buffer);

for (int i = 0; i < 2; i++)
{
    for (int pframetime = 1; pframetime <= 2; pframetime++)
    {
        CMTime frameTime = CMTimeMake(pframetime, 25);
        CMTime lastTime = CMTimeMake(i, 1);
        CMTime presentTime = CMTimeAdd(lastTime, frameTime);

        if (i == 0)
            buffer = [self pixelBufferFromCGImage:[[UIImage imageNamed:@"1.jpeg"] CGImage]];
        else
            buffer = [self pixelBufferFromCGImage:[[UIImage imageNamed:@"2.jpeg"] CGImage]];

        // Wait until the writer input can accept another sample.
        while (![writerInput isReadyForMoreMediaData])
        {
            [NSThread sleepForTimeInterval:0.05];
        }

        [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];

        // Release each buffer once it has been appended.
        if (buffer)
            CVBufferRelease(buffer);
    }
}
[writerInput markAsFinished];
[videoWriter finishWriting];
[videoPathArray addObject:path];

//Finish the session:

[videoWriter release];
[writerInput release];
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
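The code above relies on a `pixelBufferFromCGImage:` helper that is not shown in the question. A common implementation looks roughly like the following; this is a sketch assuming a fixed 320x480 frame matching the writer settings above, not the asker's actual method:

```objectivec
// Sketch: render a CGImage into a newly created 32ARGB pixel buffer.
// The caller owns the returned buffer and releases it with CVBufferRelease().
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], (NSString *)kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, 320, 480,
                        kCVPixelFormatType_32ARGB,
                        (CFDictionaryRef)options, &pxbuffer);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    // Draw the image directly into the buffer's backing memory.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, 320, 480, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, 320, 480), image);
    CGContextRelease(context);
    CGColorSpaceRelease(rgbColorSpace);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
```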

To merge the video files I tried the code below, but it did not work correctly: there were blank frames between the videos.


   AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];  

    NSString* video_inputFilePath1 = [videoPathArray objectAtIndex:1];
    NSURL*    video_inputFileUrl1 = [NSURL fileURLWithPath:video_inputFilePath1];

    NSString* video_inputFilePath2 = [videoPathArray objectAtIndex:0];
    NSURL*    video_inputFileUrl2 = [NSURL fileURLWithPath:video_inputFilePath2];

    NSString* video_inputFilePath3 = [videoPathArray objectAtIndex:2];
    NSURL*    video_inputFileUrl3 = [NSURL fileURLWithPath:video_inputFilePath3];

    NSString* outputFileName = @"outputFile.mov";
    NSString* outputFilePath = [NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,outputFileName];

    NSURL*    outputFileUrl = [NSURL fileURLWithPath:outputFilePath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) 
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset1 = [[AVURLAsset alloc] initWithURL:video_inputFileUrl1 options:nil];
    AVURLAsset* videoAsset2 = [[AVURLAsset alloc] initWithURL:video_inputFileUrl2 options:nil];
    AVURLAsset* videoAsset3 = [[AVURLAsset alloc] initWithURL:video_inputFileUrl3 options:nil];

    AVMutableCompositionTrack *a_compositionVideoTrack1 = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    // Insert each clip exactly where the previous one ends; gaps in the
    // composition timeline show up as blank frames on playback.
    CMTimeRange video_timeRange1 = CMTimeRangeMake(kCMTimeZero, videoAsset1.duration);
    [a_compositionVideoTrack1 insertTimeRange:video_timeRange1 ofTrack:[[videoAsset1 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    nextClipStartTime = CMTimeAdd(nextClipStartTime, videoAsset1.duration);

    CMTimeRange video_timeRange3 = CMTimeRangeMake(kCMTimeZero, videoAsset3.duration);
    [a_compositionVideoTrack1 insertTimeRange:video_timeRange3 ofTrack:[[videoAsset3 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    nextClipStartTime = CMTimeAdd(nextClipStartTime, videoAsset3.duration);

    CMTimeRange video_timeRange2 = CMTimeRangeMake(kCMTimeZero, videoAsset2.duration);
    [a_compositionVideoTrack1 insertTimeRange:video_timeRange2 ofTrack:[[videoAsset2 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality];
    _assetExport.shouldOptimizeForNetworkUse = YES;
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = outputFileUrl;
    // The export never runs unless it is explicitly started:
    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"merge export finished with status %d", (int)_assetExport.status);
    }];

Accepted answer by Sam

Try watching the session called "Working with Media in AV Foundation" on the Apple developer portal. It shows how to do what you are describing.


https://developer.apple.com/videos/wwdc/2011/

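The pattern that session demonstrates can be sketched as follows. This is a sketch, not Sam's code: it assumes the `videoPathArray` and `outputFileUrl` from the question, and uses `AVMutableComposition`'s whole-asset `insertTimeRange:ofAsset:atTime:error:` convenience so that each clip (including any audio tracks) starts exactly where the previous one ends, which avoids blank gaps in the merged timeline:

```objectivec
// Splice whole assets back-to-back into one composition, then export it.
AVMutableComposition *composition = [AVMutableComposition composition];
CMTime insertionPoint = kCMTimeZero;
NSError *error = nil;

for (NSString *path in videoPathArray) {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:path] options:nil];
    // Inserts every track of the asset at the current cursor position.
    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                         ofAsset:asset
                          atTime:insertionPoint
                           error:&error];
    // Advance the cursor so the next clip starts where this one ends.
    insertionPoint = CMTimeAdd(insertionPoint, asset.duration);
}

AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                 presetName:AVAssetExportPresetHighestQuality];
session.outputURL = outputFileUrl;
session.outputFileType = AVFileTypeQuickTimeMovie;
[session exportAsynchronouslyWithCompletionHandler:^{
    // Inspect session.status / session.error to confirm the merge succeeded.
}];
```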