2010-10-26

How to use CVPixelBufferPool with AVAssetWriterInputPixelBufferAdaptor on iPhone?

I have successfully created a video from images using the following code:

-(void)writeImageAsMovie:(NSArray *)array toPath:(NSString*)path size:(CGSize)size duration:(int)duration 
{ 
    NSError *error = nil; 
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL: 
            [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie 
                   error:&error]; 
    NSParameterAssert(videoWriter); 

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
            AVVideoCodecH264, AVVideoCodecKey, 
            [NSNumber numberWithInt:size.width], AVVideoWidthKey, 
            [NSNumber numberWithInt:size.height], AVVideoHeightKey, 
            nil]; 
    AVAssetWriterInput* writerInput = [[AVAssetWriterInput 
             assetWriterInputWithMediaType:AVMediaTypeVideo 
             outputSettings:videoSettings] retain]; 

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor 
                assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput 
                sourcePixelBufferAttributes:nil]; 
    NSParameterAssert(writerInput); 
    NSParameterAssert([videoWriter canAddInput:writerInput]); 
    [videoWriter addInput:writerInput]; 


    //Start a session: 
    [videoWriter startWriting]; 
    [videoWriter startSessionAtSourceTime:kCMTimeZero]; 

    CVPixelBufferRef buffer = NULL; 
    buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:0] CGImage]]; 
    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero]; 

    //Write samples: 
    for (int i = 0;i<[array count]; i++) 
    { 
     if([writerInput isReadyForMoreMediaData]) 
     { 
      NSLog(@"inside for loop %d",i); 
      CMTime frameTime = CMTimeMake(1, 20); 

      CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 24 of the loop above 

      CMTime presentTime=CMTimeAdd(lastTime, frameTime); 

      buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:i] CGImage]]; 

      [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime]; 

     } 
     else 
     { 
      NSLog(@"error"); 
      i--; 
     } 
    } 
    NSLog(@"outside for loop"); 

    //Finish the session: 
    [writerInput markAsFinished]; 
    [videoWriter finishWriting]; 
} 

Here I used a CVPixelBufferRef. Instead, I want to use a CVPixelBufferPoolRef in conjunction with AVAssetWriterInputPixelBufferAdaptor.

Can anyone provide an example that I can debug and use?

Answers


You are passing nil as 'sourcePixelBufferAttributes', because of which the pixel buffer pool will not be created:

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil]; 

Instead, pass some attributes, for example:

NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys: 
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil]; 

Then you can use the pool to create the pixel buffers, like:

CVPixelBufferPoolCreatePixelBuffer (NULL, adaptor.pixelBufferPool, &pixelBuffer); 
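The attributes dictionary can also declare the buffer dimensions, so the pool hands back buffers of the right size for the video. A minimal sketch (the 480×320 dimensions are illustrative, not from the answer above; the keys are standard Core Video constants):

```objc
NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithInt:480], kCVPixelBufferWidthKey,   // illustrative width
        [NSNumber numberWithInt:320], kCVPixelBufferHeightKey,  // illustrative height
        nil];
```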

Thanks for the answer, but I already solved this problem in the same way you described... –


Is the performance noticeably (or measurably) better using the buffer pool and the adaptor? – kevlar


@kevlar speaking from personal experience, using a pool is a night-and-day difference. Literally 1 LOC fixed the memory leak :D – jakenberg


Instead of using the "for" loop, use this code:

dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL); 
[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{ 

    CVPixelBufferRef buffer = NULL; 
    // Allocate the first frame's buffer from the adaptor's pool 
    // (the pixelBufferFromCGImage: call that stood here was immediately 
    // overwritten by the pool allocation, leaking a buffer) 
    CVPixelBufferPoolCreatePixelBuffer (NULL, adaptor.pixelBufferPool, &buffer); 

    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero]; 
    int i = 1; 
    while (writerInput.readyForMoreMediaData) { 
     NSLog(@"inside for loop %d",i); 
     CMTime frameTime = CMTimeMake(1, 20); 

     CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 19 of the loop above 

     CMTime presentTime=CMTimeAdd(lastTime, frameTime); 

     if (i >= [array count]) { 
      buffer = NULL; 
     }else { 
       buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:i] CGImage]]; 
     }   
     //CVBufferRetain(buffer); 

     if (buffer) { 
      // append buffer 
      [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime]; 
      i++; 
     } else { 
      // done! 

      //Finish the session: 
      [writerInput markAsFinished]; 
      [videoWriter finishWriting];     

      CVPixelBufferPoolRelease(adaptor.pixelBufferPool); 
      [videoWriter release]; 
      [writerInput release]; 
      NSLog (@"Done"); 
      [imageArray removeAllObjects];    
      break; 
     } 
    } 
}]; 

Thanks for the help, but I already solved the problem, and the project is also finished... –


There is also a bug in this routine, in that the writer can become "not ready" before you have iterated over your whole array. If you move your counter outside the block, and key the loop's cleanup off that index, it should work. –


@Atulkumar V. Jain: great! good luck ^^ @Brian: you are right, thanks, and I have it working now. Here is the working code (in case anyone else needs it :-)):

CVPixelBufferRef buffer = NULL; 
buffer = [self pixelBufferFromCGImage:[[imagesArray objectAtIndex:0] CGImage]]; 
CVPixelBufferPoolCreatePixelBuffer (NULL, adaptor_.pixelBufferPool, &buffer); 

[adaptor_ appendPixelBuffer:buffer withPresentationTime:kCMTimeZero]; 

__block UInt64 convertedByteCount = 0; 
dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL); 
static int i = 1; 
int frameNumber = [imagesArray count]; 

[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{ 
    while (1){ 
     if (i == frameNumber) { 
      break; 
     } 
     if ([writerInput isReadyForMoreMediaData]) { 

      CVPixelBufferRef sampleBuffer = [self pixelBufferFromCGImage:[[imagesArray objectAtIndex:i] CGImage]]; 
      NSLog(@"inside for loop %d",i); 
      CMTime frameTime = CMTimeMake(1, 20); 

      CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 19 of the loop above 

      CMTime presentTime=CMTimeAdd(lastTime, frameTime);  

     if (sampleBuffer) { 
       [adaptor_ appendPixelBuffer:sampleBuffer withPresentationTime:presentTime]; 
       i++; 
       CFRelease(sampleBuffer); 
      } else { 
       break; 
      } 
     } 
    } 
    NSLog (@"done"); 
    [writerInput markAsFinished]; 
    [videoWriter finishWriting];  

    CVPixelBufferPoolRelease(adaptor_.pixelBufferPool); 
    [videoWriter release]; 
    [writerInput release];  
    [imagesArray removeAllObjects]; 


}]; 

I got it all working!

Here is the sample code link: git@github.com:RudyAramayo/AVAssetWriterInputPixelBufferAdaptorSample.git

Here is the code you need:

- (void) testCompressionSession 
{ 
    CGSize size = CGSizeMake(480, 320); 

    NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"]; 

    NSError *error = nil; 

    unlink([betaCompressionDirectory UTF8String]); 

    //----initialize compression engine 
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory] 
                  fileType:AVFileTypeQuickTimeMovie 
                   error:&error]; 
    NSParameterAssert(videoWriter); 
    if(error) 
     NSLog(@"error = %@", [error localizedDescription]); 

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, 
             [NSNumber numberWithInt:size.width], AVVideoWidthKey, 
             [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil]; 
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings]; 

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: 
                   [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil]; 

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput 
                                 sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary]; 
    NSParameterAssert(writerInput); 
    NSParameterAssert([videoWriter canAddInput:writerInput]); 

    if ([videoWriter canAddInput:writerInput]) 
     NSLog(@"I can add this input"); 
    else 
     NSLog(@"i can't add this input"); 

    [videoWriter addInput:writerInput]; 
    [videoWriter startWriting]; 
    [videoWriter startSessionAtSourceTime:kCMTimeZero]; 

    //--- 
    // insert demo debugging code to write the same image repeated as a movie 

    CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage]; 

    dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL); 
    int __block frame = 0; 

    [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{ 
     while ([writerInput isReadyForMoreMediaData]) 
     { 
      if(++frame >= 120) 
      { 
       [writerInput markAsFinished]; 
       [videoWriter finishWriting]; 
       [videoWriter release]; 
       break; 
      } 

      CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size]; 
      if (buffer) 
      { 
       if(![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)]) 
        NSLog(@"FAIL"); 
       else 
        NSLog(@"Success:%d", frame); 
       CFRelease(buffer); 
      } 
     } 
    }]; 

    NSLog(@"outside for loop"); 

} 

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size 
{ 
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: 
          [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
          [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil]; 
    CVPixelBufferRef pxbuffer = NULL; 
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer); 
    // CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer); 

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

    CVPixelBufferLockBaseAddress(pxbuffer, 0); 
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 
    NSParameterAssert(pxdata != NULL); 

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst); 
    NSParameterAssert(context); 

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image); 

    CGColorSpaceRelease(rgbColorSpace); 
    CGContextRelease(context); 

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 

    return pxbuffer; 
} 

Did you mean to remove the commented-out 'CVPixelBufferPoolCreatePixelBuffer' line in 'pixelBufferFromCGImage'? Or did you decide not to use 'CVPixelBufferPoolCreatePixelBuffer'? – taber


I was so lost at the time I wrote this; it was a compilation of many different sets of code that I was trying to reverse-engineer... please don't mind some of the quirks, like the commented-out bits; I was still learning/testing... I should improve the sample and submit more AVFoundation code soon. – Orbitus007


Don't use this code - it's wrong. To use a pool, you must call only CVPixelBufferPoolCreatePixelBuffer. – AlexeyVMP
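Following that comment, a pool-based variant of pixelBufferFromCGImage:size: might look like the sketch below. This is an illustration, not code from the thread: it assumes the adaptor was created with a non-nil sourcePixelBufferAttributes dictionary (so adaptor.pixelBufferPool is non-NULL after startWriting), and that 'adaptor' is reachable here, e.g. as an ivar; the method name 'pooledPixelBufferFromCGImage:size:' is hypothetical.

```objc
- (CVPixelBufferRef)pooledPixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    CVPixelBufferRef pxbuffer = NULL;
    // Draw from the adaptor's pool instead of allocating a fresh buffer per frame
    CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                         adaptor.pixelBufferPool,
                                                         &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL)
        return NULL;

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // Use the buffer's own stride; pool buffers may be padded beyond 4*width
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
    CGContextRelease(context);
    CGColorSpaceRelease(rgbColorSpace);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer; // caller releases with CVPixelBufferRelease()/CFRelease()
}
```

As in the accepted answer's code, the caller appends the buffer via the adaptor and then releases it, so the pixel buffer returns to the pool for reuse.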