
When I export the video, it does not play

Here I create the video successfully, then merge the video and audio into MOV format and export the file with AVAssetExportSession. But when the exported file is opened in a media player, it does not play; it just shows a blank screen.

Here is the code I use to merge the video and audio:

-(void)combine:(NSString *)audiopathvalue videoURL:(NSString *)videopathValue
{

   // 1. Create a AVMutableComposition

    CFAbsoluteTime currentTime = CFAbsoluteTimeGetCurrent(); //Debug purpose - used to calculate the total time taken
    NSError *error = nil;
    AVMutableComposition *saveComposition = [AVMutableComposition composition];


  //  2. Get the video and audio file path  
    NSString *tempPath = NSTemporaryDirectory();
    NSString *videoPath = videopathValue; //<Video file path>
    NSString *audioPath = audiopathvalue; //<Audio file path>


    //3. Create the video asset 
    NSURL * url1 = [[NSURL alloc] initFileURLWithPath:videoPath];
    AVURLAsset *video = [AVURLAsset URLAssetWithURL:url1 options:nil];
    [url1 release];

   // 4. Get the AVMutableCompositionTrack for video and add the video track to it.
   //    The method insertTimeRange:ofTrack:atTime: decides what portion of the video is added and where the video track appears in the final composition.
    AVMutableCompositionTrack *compositionVideoTrack = [saveComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipVideoTrack = [[video tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [video duration]) ofTrack:clipVideoTrack atTime:kCMTimeZero error:&error];
    NSLog(@"%f %@",CMTimeGetSeconds([video duration]),error);



    //5. Create the Audio asset 

    NSLog(@"audioPath:%@",audioPath);
    NSURL * url2 = [[NSURL alloc] initFileURLWithPath:audioPath];
    AVURLAsset *audio = [AVURLAsset URLAssetWithURL:url2 options:nil];
    [url2 release];

    //6. Get the AVMutableCompositionTrack for audio and add the audio track to it.
    AVMutableCompositionTrack *compositionAudioTrack = [saveComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipAudioTrack = [[audio tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [audio duration]) ofTrack:clipAudioTrack atTime:kCMTimeZero error:&error];
    NSLog(@"%f %@",CMTimeGetSeconds([audio duration]),error);

    //7. Get the file path of the final video.
    NSString *path = [tempPath stringByAppendingPathComponent:@"mergedvideo.MOV"];
    if([[NSFileManager defaultManager] fileExistsAtPath:path])
    {
        [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    }

    NSURL *url = [[NSURL alloc] initFileURLWithPath: path];


    //8. Create the AVAssetExportSession and set the preset to it.
    //The completion handler will be called upon the completion of the export.
    AVAssetExportSession *exporter = [[[AVAssetExportSession alloc] initWithAsset:saveComposition presetName:AVAssetExportPresetHighestQuality] autorelease];
    exporter.outputURL=url;
    exporter.outputFileType = @"com.apple.quicktime-movie";
    NSLog(@"file type %@",exporter.outputFileType);
    exporter.shouldOptimizeForNetworkUse = YES;

    [exporter exportAsynchronouslyWithCompletionHandler:^{

        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[exporter error] localizedDescription]);
                NSLog(@"ExportSessionError: %@", exporter.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            case AVAssetExportSessionStatusCompleted:
            {
                NSLog(@"Export Completed");
                ImageToAirPlayAppDelegate *theApp_iphone = (ImageToAirPlayAppDelegate *)[[UIApplication sharedApplication] delegate];
                [theApp_iphone call];
                break;
            }
            default:
                break;
        }

        //[exporter release];

    }];
}

The video at the video path is generated from a series of images, and the audio path contains only one audio file.


The delegate method (not in your code):

- (void) captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error

Try doing the processing there.

It gives you the outputFileURL, which is the one you have to use in your mix. There is no reason to use an NSString in the combine function.
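
As a minimal sketch, assuming combine: is reworked so that videoURL takes an NSURL (that reworked signature and the self.audioPath property are assumptions, not part of the original code), the delegate callback could hand the recorded file's URL straight to the merge step:

// AVCaptureFileOutputRecordingDelegate callback, fired once the movie file has been written to disk.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
       fromConnections:(NSArray *)connections
                 error:(NSError *)error
{
    if (error) {
        NSLog(@"Recording failed: %@", error);
        return;
    }
    // Pass the recorded file's URL straight to the merge step;
    // self.audioPath is an assumed property holding the audio file path.
    [self combine:self.audioPath videoURL:outputFileURL];
}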

I also recommend using AVFileTypeQuickTimeMovie instead of @"com.apple.quicktime-movie". It is the same type, but easier to handle in case you want to experiment with other formats.

To see the available formats, just use:

NSLog(@"%@", [exporter supportedFileTypes]);