2013-06-14

I am trying to implement the functionality described below: recording a video programmatically in the iPhone SDK.

Final recorded video = "a video captured from the front camera + the audio of a video file that I am playing in a video player".

For a better understanding, please see the attached screenshot.


My code is given below. In the end, what I get is a video, but without sound.

What I want to implement is: "the final recorded video must be the combination of a video captured by my front camera plus only the audio of the video file that I am playing."

Can anyone help or guide me on how to achieve this functionality? Any help is appreciated.

Dies ist mein Code.

The "Recording" button click method is as follows:

-(void) startRecording 
{ 
    [self initCaptureSession]; 

    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] 
             pathForResource:@"video" 
             ofType:@"mp4"]]; 
    [self playMovieAtURL:url]; 

    [self startVideoRecording]; 
} 

"initCaptureSession": with this method I record a video from the front camera using "AVCaptureSession":

-(void) initCaptureSession 
{ 
    NSLog(@"Setting up capture session"); 
    captureSession = [[AVCaptureSession alloc] init]; 

    NSLog(@"Adding video input"); 

    AVCaptureDevice *VideoDevice = [self frontFacingCameraIfAvailable ]; 

    if (VideoDevice) 
    { 
     NSError *error; 
     videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error]; 
     if (!error) 
     { 
      if ([captureSession canAddInput:videoInputDevice]) 
      { 
       [captureSession addInput:videoInputDevice]; 
      } 
      else 
      { 
       NSLog(@"Couldn't add video input"); 
      } 
     } 
     else 
     { 
      NSLog(@"Couldn't create video input"); 
     } 
    } 
    else 
    { 
     NSLog(@"Couldn't create video capture device"); 
    } 


    NSLog(@"Adding audio input"); 
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio]; //AVMediaTypeMuxed returns nil for the default device on iOS, so no audio input was ever added 
    NSError *error = nil; 
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error]; 
    if (audioInput) 
    { 
     [captureSession addInput:audioInput]; 
    } 


    NSLog(@"Adding movie file output"); 
    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init]; 

    movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME 

    if ([captureSession canAddOutput:movieFileOutput]) 
     [captureSession addOutput:movieFileOutput]; 

    [self CameraSetOutputProperties];   //(We call a method as it also has to be done after changing camera) 

    NSLog(@"Setting image quality"); 
    [captureSession setSessionPreset:AVCaptureSessionPresetMedium]; 
    if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480])  //Check size based configs are supported before setting them 
     [captureSession setSessionPreset:AVCaptureSessionPreset640x480]; 

    [captureSession startRunning]; 
} 

- (void) CameraSetOutputProperties 
{ 
    AVCaptureConnection *CaptureConnection=nil; 

    NSComparisonResult order = [[UIDevice currentDevice].systemVersion compare: @"5.0.0" options: NSNumericSearch]; 
    if (order == NSOrderedSame || order == NSOrderedDescending) { 
     // OS version >= 5.0.0 
     CaptureConnection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo]; 
    } else { 
     // OS version < 5.0.0 
     CaptureConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[movieFileOutput connections]]; 

    } 

    //Set video orientation (portrait here; use AVCaptureVideoOrientationLandscapeRight for landscape) 
    if ([CaptureConnection isVideoOrientationSupported]) 
    { 
     AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationPortrait; 
     [CaptureConnection setVideoOrientation:orientation]; 
    } 

    } 

"- (void) playMovieAtURL:(NSURL *)theURL": with this method I am playing a video:

-(void) playMovieAtURL: (NSURL*) theURL 
{ 

player = 
[[MPMoviePlayerController alloc] initWithContentURL: theURL ]; 
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil]; 

player.scalingMode = MPMovieScalingModeAspectFill; 
player.controlStyle = MPMovieControlStyleNone; 
[player prepareToPlay]; 

[[NSNotificationCenter defaultCenter] 
addObserver: self 
selector: @selector(myMovieFinishedCallback:) 
name: MPMoviePlayerPlaybackDidFinishNotification 
object: player]; 
player.view.frame=CGRectMake(10, 30, 300, 200); 
[self.view addSubview:player.view]; 

[player play]; 
} 

"startVideoRecording": with this method I start the final video recording:

- (void) startVideoRecording 
{ 
    //Create temporary URL to record to 
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"]; 
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath]; 
    NSFileManager *fileManager = [NSFileManager defaultManager]; 
    if ([fileManager fileExistsAtPath:outputPath]) 
    { 
     NSError *error; 
     if ([fileManager removeItemAtPath:outputPath error:&error] == NO) 
     { 
      //Error - handle if required 
      NSLog(@"file remove error"); 
     } 
    } 
    //Start recording 
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self]; 

} 

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput 
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL 
     fromConnections:(NSArray *)connections 
       error:(NSError *)error 
{ 

    NSLog(@"didFinishRecordingToOutputFileAtURL - enter"); 

    BOOL RecordedSuccessfully = YES; 
    if ([error code] != noErr) 
    { 
     // A problem occurred: Find out if the recording was successful. 
     id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey]; 
     if (value) 
     { 
      RecordedSuccessfully = [value boolValue]; 
     } 
    } 
    if (RecordedSuccessfully) 
    { 
     //----- RECORDED SUCCESSFULLY ----- 
     NSLog(@"didFinishRecordingToOutputFileAtURL - success"); 
     ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init]; 
     if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL]) 
     { 
      [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL 
             completionBlock:^(NSURL *assetURL, NSError *error) 
      { 
       if (error) 
       { 
        NSLog(@"File save error"); 
       } 
       else 
       { 
        recordedVideoURL=assetURL; 
       } 
      }]; 
     } 
     else 
     { 

      NSString *assetURL=[self copyFileToDocuments:outputFileURL]; 
      if(assetURL!=nil) 
      { 
       recordedVideoURL=[NSURL URLWithString:assetURL]; 
      } 
     } 
    } 
} 
Check out this link if it helps: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios –

+1 for nice answer formatting :) –

Hi.. can this also record live RTSP streaming? – Anny

Answer

// Add some additional code to the following methods. Method 1:

-(void) playMovieAtURL: (NSURL*) theURL 

    { 
     [player play]; 
     AVAudioSession *audioSession = [AVAudioSession sharedInstance]; 
     NSError *err = nil; 
     [audioSession setCategory :AVAudioSessionCategoryPlayAndRecord error:&err]; 
    if(err) 
     { 
     NSLog(@"audioSession: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]); 
     return; 
    } 
     [audioSession setActive:YES error:&err]; 
    if(err){ 
     NSLog(@"audioSession: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]); 
     return; 
    } 
     err = nil; 

     recordSetting = [[NSMutableDictionary alloc] init]; 

     [recordSetting setValue :[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey]; 
     [recordSetting setValue:[NSNumber numberWithFloat:16000.0] forKey:AVSampleRateKey]; 
     [recordSetting setValue:[NSNumber numberWithInt: 1] forKey:AVNumberOfChannelsKey]; 
     recorderFilePath = [NSString stringWithFormat:@"%@/MySound.caf", DOCUMENTS_FOLDER]; 
    NSLog(@"recorderFilePath: %@",recorderFilePath); 
    audio_url = [NSURL fileURLWithPath:recorderFilePath]; 
    err = nil; 
    NSData *audioData = [NSData dataWithContentsOfFile:[audio_url path] options: 0 error:&err]; 
    if(audioData) 
    { 
     NSFileManager *fm = [NSFileManager defaultManager]; 
     [fm removeItemAtPath:[audio_url path] error:&err]; 
    } 

    err = nil; 
    recorder = [[ AVAudioRecorder alloc] initWithURL:audio_url settings:recordSetting error:&err]; 
    if(!recorder) 
    { 
     NSLog(@"recorder: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]); 
     UIAlertView *alert = 
     [[UIAlertView alloc] initWithTitle: @"Warning" 
            message: [err localizedDescription] 
            delegate: nil 
         cancelButtonTitle:@"OK" 
         otherButtonTitles:nil]; 
     [alert show]; 
     return; 
    } 

    //prepare to record 
    [recorder setDelegate:self]; 
    [recorder prepareToRecord]; 
    recorder.meteringEnabled = YES; 

    BOOL audioHWAvailable = audioSession.inputAvailable; 
    if (! audioHWAvailable) 
    { 
     UIAlertView *cantRecordAlert = 
     [[UIAlertView alloc] initWithTitle: @"Warning" 
            message: @"Audio input hardware not available" 
            delegate: nil 
         cancelButtonTitle:@"OK" 
         otherButtonTitles:nil]; 
     [cantRecordAlert show]; 
     return; 
    } 

    //Start capturing microphone audio while the movie plays 
    //(the original snippet prepared the recorder but never started it) 
    [recorder record]; 
} 

// Method 2:

-(void) stopVideoRecording 

    { 
    [player.view removeFromSuperview]; 
    [player stop]; 
    [movieFileOutput stopRecording]; 

    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_url options:nil]; 
    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:outputURL options:nil]; 

    mixComposition = [AVMutableComposition composition]; 

    AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio 
                         preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) 
             ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] 
             atTime:kCMTimeZero error:nil]; 

    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo 
                        preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) 
            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] 
            atTime:kCMTimeZero error:nil]; 

    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
    [compositionVideoTrack setPreferredTransform:videoTrack.preferredTransform]; 

    //An AVAssetExportSession is only needed if you want to write the 
    //composition to a file; for the playback below, mixComposition is used directly. 
    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition 
                      presetName:AVAssetExportPresetPassthrough]; 
} 

// Finally, play the combined video:

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:mixComposition]; 
AVPlayer *player1 = [AVPlayer playerWithPlayerItem:playerItem]; 
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player1]; 
[playerLayer setFrame:CGRectMake(0, 0, 320, 480)]; 
[[[self view] layer] addSublayer:playerLayer]; 
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; 
[player1 play]; 
player1.actionAtItemEnd = AVPlayerActionAtItemEndNone; 
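
If the combined result should also be saved to disk (the snippet above only plays it), the composition can be exported. A minimal sketch, assuming the `mixComposition` built in `stopVideoRecording`; the output file name `final.mov` is an assumption:

    //Sketch: exporting the composition to a movie file (file name is an assumption) 
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"final.mov"]; 
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil]; //remove any stale file 

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition 
                                                                      presetName:AVAssetExportPresetPassthrough]; 
    exporter.outputURL = [NSURL fileURLWithPath:exportPath]; 
    exporter.outputFileType = AVFileTypeQuickTimeMovie; 
    [exporter exportAsynchronouslyWithCompletionHandler:^{ 
        if (exporter.status == AVAssetExportSessionStatusCompleted) { 
            NSLog(@"export done: %@", exportPath); 
        } else { 
            NSLog(@"export failed: %@", exporter.error); 
        } 
    }]; 

Passthrough keeps the original encodings, so the export is fast; use a preset such as AVAssetExportPresetMediumQuality if re-encoding is needed.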
Can you please put the whole code on GitHub? That would make it easy for us to get an exact working demo – Anny

Answer

I think this will help you:

AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audioUrl options:nil]; 
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:videoUrl options:nil]; 

AVMutableComposition* mixComposition = [AVMutableComposition composition]; 

AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio 
                        preferredTrackID:kCMPersistentTrackID_Invalid]; 
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) 
            ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] 
            atTime:kCMTimeZero error:nil]; 

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo 
                        preferredTrackID:kCMPersistentTrackID_Invalid]; 
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) 
           ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] 
           atTime:kCMTimeZero error:nil]; 

AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition 
                     presetName:AVAssetExportPresetPassthrough]; 

NSString* videoName = @"export.mov"; 

NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName]; 
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath]; 

if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) 
{ 
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil]; 
} 

_assetExport.outputFileType = AVFileTypeQuickTimeMovie; //@"com.apple.quicktime-movie" 
NSLog(@"file type %@",_assetExport.outputFileType); 
_assetExport.outputURL = exportUrl; 
_assetExport.shouldOptimizeForNetworkUse = YES; 

[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) { 
    //your completion code here (e.g. check _assetExport.status) 
}]; 

Courtesy: https://stackoverflow.com/a/3456565/1865424

And you can also check this code for recording video from the front camera:

-(IBAction)cameraLibraryButtonClick:(id)sender{ 
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {    
     UIImagePickerController *videoRecorder = [[UIImagePickerController alloc]init]; 
     videoRecorder.delegate = self; 
     NSArray *sourceTypes = [UIImagePickerController availableMediaTypesForSourceType:videoRecorder.sourceType]; 
     NSLog(@"Available types for source as camera = %@", sourceTypes); 
     if (![sourceTypes containsObject:(NSString*)kUTTypeMovie]) { 
      UIAlertView *alert = [[UIAlertView alloc] initWithTitle:nil 
                  message:@"Device Not Supported for video Recording." 
                  delegate:self 
                cancelButtonTitle:@"Yes" 
                otherButtonTitles:@"No",nil]; 
      [alert show]; 
      [alert release]; 
      return; 
     } 
     videoRecorder.cameraDevice=UIImagePickerControllerCameraDeviceFront; 
     videoRecorder.sourceType = UIImagePickerControllerSourceTypeCamera; 
     videoRecorder.mediaTypes = [NSArray arrayWithObject:(NSString*)kUTTypeMovie];   
     videoRecorder.videoQuality = UIImagePickerControllerQualityTypeLow; 
     videoRecorder.videoMaximumDuration = 120; 

     self.imagePicker = videoRecorder;     
     [videoRecorder release]; 
     [self presentModalViewController:self.imagePicker animated:YES]; 
     newMedia = YES; 
    } 
    else { 
     [self displaysorceError]; 
    } 


} 

Courtesy: https://stackoverflow.com/a/14154289/1865424

If this doesn't work for you, let me know. But I think it will help you.

I want to implement video and audio recording of the iPhone screen, thanks –

this will help you. @MacGeek – Shivaay