I'm having some problems getting a UIImage from a CVPixelBuffer. This is what I am trying:
// Grab the pixel buffer from the still-image sample buffer
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
                                                            imageDataSampleBuffer,
                                                            kCMAttachmentMode_ShouldPropagate);
CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
                                                  options:(NSDictionary *)attachments];
if (attachments)
    CFRelease(attachments);

size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
if (width && height) { // test to make sure we have valid dimensions
    UIImage *image = [[UIImage alloc] initWithCIImage:ciImage];

    UIImageView *lv = [[UIImageView alloc] initWithFrame:self.view.frame];
    lv.contentMode = UIViewContentModeScaleAspectFill;
    self.lockedView = lv;
    [lv release];

    self.lockedView.image = image;
    [image release];
}
[ciImage release];
height and width are both correctly set to the resolution of the camera. image is created, but it seems to be black (or maybe transparent?). I can't quite understand where the problem is. Any ideas would be appreciated.
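For what it's worth, here is an untested sketch of an alternative I was considering: rendering the CIImage through a CIContext into a CGImage first, rather than handing the CIImage to UIImage directly. I'm not sure whether this is necessary, or even the right approach:

// Untested sketch: render the CIImage into a CGImage via a CIContext,
// then wrap the CGImage in a UIImage before assigning it to the image view.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];
UIImage *image = [[UIImage alloc] initWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage:fromRect: follows the Create rule, so we own this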