
CIFilter is not applied correctly to video

I have two portrait videos. One comes from the standard iPhone camera, and the second is a recording made by my application using UIImagePickerController.

When I apply the CIFilter to the first video, the filter is applied perfectly. But when I apply it to the second video, the video is zoomed in, half of the frame is blurred and stretched, and when I export it, the result comes out rotated.

My code:

AVAssetTrack *FirstAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"]; 

player.currentItem.videoComposition = [AVVideoComposition videoCompositionWithAsset: asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request){ 
    // Clamp to avoid filtering transparent pixels at the image edges

    CIImage *source = [request.sourceImage imageByClampingToExtent]; 
    source = [source imageByApplyingTransform:FirstAssetTrack.preferredTransform]; 

    [filter setValue:source forKey:kCIInputImageKey]; 

    // Crop the filtered output to the bounds of the original frame
    CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent]; 

    // Provide the filter output to the composition 
    [request finishWithImage:output context:nil]; 
}]; 
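To see why the two recordings behave differently, it can help to log each track's preferredTransform and naturalSize before building the composition; a portrait recording from the camera app typically stores landscape pixels (e.g. 1920×1080) plus a 90° transform, while other capture paths may not. A minimal diagnostic sketch (assuming asset is the loaded AVAsset):

```objectivec
// Sketch: log the track geometry to compare the two source videos.
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGAffineTransform t = track.preferredTransform;
CGSize natural = track.naturalSize;
// The displayed size is the natural size mapped through the transform.
CGSize rendered = CGSizeApplyAffineTransform(natural, t);
rendered = CGSizeMake(fabs(rendered.width), fabs(rendered.height));
NSLog(@"natural=%@ transform=[a=%.0f b=%.0f c=%.0f d=%.0f] rendered=%@",
      NSStringFromCGSize(natural), t.a, t.b, t.c, t.d,
      NSStringFromCGSize(rendered));
```

If the two videos print different transforms or natural sizes, that mismatch is the starting point for the zoom/stretch symptoms described above.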

This code does not work for the second video, so I made some changes for it. The following is not the correct code, but I wanted to check its size and orientation. After the orientation changes it works fine when playing in AVPlayer, but when I export it, the result is rotated.

AVPlayer plays video composition result incorrectly

I checked this link; we are both facing the same problem, so I changed my code accordingly, but it is still not working correctly.

AVAssetTrack *FirstAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"]; 


UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp; 
BOOL isFirstAssetPortrait_ = NO; 
CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform; 
if(firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) { 
    FirstAssetOrientation_= UIImageOrientationRight; 
    isFirstAssetPortrait_ = YES; 
} 
if(firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) { 
    FirstAssetOrientation_ = UIImageOrientationLeft; 
    isFirstAssetPortrait_ = YES; 
} 
if(firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) { 
    FirstAssetOrientation_ = UIImageOrientationUp; 
} 
if(firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) { 
    FirstAssetOrientation_ = UIImageOrientationDown; 
} 
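The four exact-equality checks above can be fragile if the transform matrix contains slight floating-point deviations. A more compact alternative (a sketch, not from the original post) derives the rotation angle with atan2:

```objectivec
// Sketch: derive the track rotation from preferredTransform with atan2,
// instead of comparing the matrix entries for exact equality.
CGAffineTransform t = FirstAssetTrack.preferredTransform;
CGFloat radians = atan2(t.b, t.a);
NSInteger degrees = (NSInteger)lround(radians * 180.0 / M_PI); // 0, 90, -90 or 180
BOOL isPortrait = (degrees == 90 || degrees == -90);
```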

player.currentItem.videoComposition = [AVVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) { 
    // Step 1: get the input frame image (screenshot 1) 
    CIImage *sourceImage = request.sourceImage; 

    // Step 2: rotate the frame 
    CIFilter *transformFilter = [CIFilter filterWithName:@"CIAffineTransform"]; 
    [transformFilter setValue:sourceImage forKey: kCIInputImageKey]; 
    [transformFilter setValue: [NSValue valueWithCGAffineTransform: firstTransform] forKey: kCIInputTransformKey]; 
    sourceImage = transformFilter.outputImage; 
    CGRect extent = sourceImage.extent; 
    CGAffineTransform translation = CGAffineTransformMakeTranslation(-extent.origin.x, -extent.origin.y); 
    [transformFilter setValue:sourceImage forKey: kCIInputImageKey]; 
    [transformFilter setValue: [NSValue valueWithCGAffineTransform: translation] forKey: kCIInputTransformKey]; 
    sourceImage = transformFilter.outputImage; 

    // Step 3: apply the custom filter chosen by the user 
    extent = sourceImage.extent; 
    sourceImage = [sourceImage imageByClampingToExtent]; 
    [filter setValue:sourceImage forKey:kCIInputImageKey]; 
    sourceImage = filter.outputImage; 
    sourceImage = [sourceImage imageByCroppingToRect:extent]; 

    // make the frame the same aspect ratio as the original input frame 
    // by adding empty spaces at the top and the bottom of the extent rectangle 
    CGFloat newHeight = 1920 * 1920/extent.size.height; 
    CGFloat inset = (extent.size.height - newHeight)/2; 
    extent = CGRectInset(extent, 0, inset); 
    sourceImage = [sourceImage imageByCroppingToRect:extent]; 

    // scale down to the original frame size 
    CGFloat scale = 1920/newHeight; 
    CGAffineTransform scaleTransform = CGAffineTransformMakeScale(scale, scale*3.2); 
    [transformFilter setValue:sourceImage forKey: kCIInputImageKey]; 
    [transformFilter setValue: [NSValue valueWithCGAffineTransform: scaleTransform] forKey: kCIInputTransformKey]; 
    sourceImage = transformFilter.outputImage; 

    // translate the frame to make its origin start at (0, 0)
    CGAffineTransform translation1 = CGAffineTransformMake(1, 0, 0, 1, 0, 0); 
    [transformFilter setValue:sourceImage forKey: kCIInputImageKey]; 
    [transformFilter setValue: [NSValue valueWithCGAffineTransform: translation1] forKey: kCIInputTransformKey]; 
    sourceImage = transformFilter.outputImage; 

    // Step 4: finish processing the frame (screenshot 2) 
    [request finishWithImage:sourceImage context:nil]; 

}]; 

Answer


Simply remove the transform line; it is what causes the problem.

Just this line:

source = [source imageByApplyingTransform:FirstAssetTrack.preferredTransform]; 

Then check again. :)

AVAssetTrack *FirstAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"]; 

player.currentItem.videoComposition = [AVVideoComposition videoCompositionWithAsset: asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request){ 
    // Clamp to avoid filtering transparent pixels at the image edges

    CIImage *source = [request.sourceImage imageByClampingToExtent]; 

    [filter setValue:source forKey:kCIInputImageKey]; 

    // Crop the filtered output to the bounds of the original frame
    CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent]; 

    // Provide the filter output to the composition 
    [request finishWithImage:output context:nil]; 
}]; 
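Since the question also mentions that the result is rotated only on export: when exporting, the same AVVideoComposition should be assigned to the export session. AVFoundation then applies the track's preferredTransform itself, which is why applying it again inside the filter handler double-rotates the output. A minimal export sketch (outputURL is an assumed destination file URL):

```objectivec
// Sketch: export with the same filter composition used for playback.
AVVideoComposition *videoComposition =
    [AVVideoComposition videoCompositionWithAsset:asset
             applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
    CIImage *source = [request.sourceImage imageByClampingToExtent];
    [filter setValue:source forKey:kCIInputImageKey];
    CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];
    [request finishWithImage:output context:nil];
}];

AVAssetExportSession *session =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetHighestQuality];
session.videoComposition = videoComposition; // filters are applied frame by frame
session.outputURL = outputURL;               // assumed destination URL
session.outputFileType = AVFileTypeQuickTimeMovie;
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished");
    } else {
        NSLog(@"Export failed: %@", session.error);
    }
}];
```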