iOS: capturing an image using the AVFoundation framework
Add the following line
output.minFrameDuration = CMTimeMake(5, 1);
below the comment
// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.
but above the line
[session startRunning];
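To make the cap above concrete: CMTimeMake(value, timescale) describes a duration of value/timescale seconds, so CMTimeMake(5, 1) is one frame every 5 seconds (0.2 fps), while the 15 fps mentioned in Apple's comment would be CMTimeMake(1, 15). A quick sketch of that arithmetic in plain Swift (hypothetical helper names, no CoreMedia import, just to show what the numbers mean):

```swift
// CMTimeMake(value, timescale) represents value/timescale seconds.
// These helpers mirror that arithmetic without CoreMedia.
func frameDurationSeconds(value: Int, timescale: Int) -> Double {
    return Double(value) / Double(timescale)
}

func framesPerSecond(value: Int, timescale: Int) -> Double {
    return 1.0 / frameDurationSeconds(value: value, timescale: timescale)
}

// CMTimeMake(5, 1): one frame every 5 seconds, i.e. 0.2 fps.
let slow = framesPerSecond(value: 5, timescale: 1)

// The 15 fps cap from Apple's comment would be CMTimeMake(1, 15),
// i.e. roughly 0.067 seconds per frame.
let capped = frameDurationSeconds(value: 1, timescale: 15)
```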
Edit
Use the following code to preview the camera output.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
UIView *aView = self.view;
CGRect videoRect = CGRectMake(0.0, 0.0, 320.0, 150.0);
previewLayer.frame = videoRect; // Assume you want the preview layer to fill the view.
[aView.layer addSublayer:previewLayer];
Edit 2: OK, fine...
Apple has provided a way to set the minimum frame duration via AVCaptureConnection.
So now, use the following code to set the frame duration:
AVCaptureConnection *conn = [output connectionWithMediaType:AVMediaTypeVideo];

if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(5, 1);
if (conn.supportsVideoMaxFrameDuration)
    conn.videoMaxFrameDuration = CMTimeMake(5, 1);
Be careful - the callback from AVCaptureOutput is delivered on the dispatch queue you specified. I saw you performing UI updates from this callback, and that is wrong: you should perform them only on the main queue. E.g.:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"captureOutput: didOutputSampleBufferFromConnection");

    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // UI work must happen on the main queue, not on the capture queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        //< Add your code here that uses the image >
        [self.imageView setImage:image];
        [self.view setNeedsDisplay];
    });
}
And here is a Swift version of the imageFromSampleBuffer function:
func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer!) -> UIImage {
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!

    // Lock the pixel buffer before touching its base address.
    CVPixelBufferLockBaseAddress(imageBuffer, 0)

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)

    // BGRA pixel data: little-endian 32-bit with premultiplied-first alpha.
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo: CGBitmapInfo = [.ByteOrder32Little,
                                    CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue)]
    let context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                        bytesPerRow, colorSpace, bitmapInfo.rawValue)
    let quartzImage = CGBitmapContextCreateImage(context)

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0)

    let image = UIImage(CGImage: quartzImage!)
    return image
}
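The bitmapInfo combination above (32-bit little-endian byte order with premultiplied-first alpha, i.e. ARGB read as a little-endian word) is what makes the bitmap context match the BGRA byte order the camera delivers. A plain-Swift sketch of that byte-order equivalence (hypothetical helper, no CoreGraphics needed):

```swift
// Pack an ARGB pixel into a 32-bit word, then view that word as
// little-endian bytes: the in-memory order comes out B, G, R, A,
// which is the kCVPixelFormatType_32BGRA layout the camera delivers.
func littleEndianBytes(a: UInt8, r: UInt8, g: UInt8, b: UInt8) -> [UInt8] {
    let argb = UInt32(a) << 24 | UInt32(r) << 16 | UInt32(g) << 8 | UInt32(b)
    var word = argb.littleEndian
    return withUnsafeBytes(of: &word) { Array($0) }
}

// For a = 0xFF, r = 0x11, g = 0x22, b = 0x33 the memory bytes are
// [0x33, 0x22, 0x11, 0xFF], i.e. B, G, R, A.
let bytes = littleEndianBytes(a: 0xFF, r: 0x11, g: 0x22, b: 0x33)
```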
The above works for me with the following video settings:
videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput?.videoSettings = [kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA)]
videoDataOutput?.setSampleBufferDelegate(self, queue: queue)
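One related note: with kCVPixelFormatType_32BGRA each pixel occupies 4 bytes, so CVPixelBufferGetBytesPerRow can return more than width * 4, because rows are often padded for alignment; that is why imageFromSampleBuffer passes the reported bytesPerRow to the bitmap context instead of computing it from the width. A small plain-Swift sketch of that row-size arithmetic (hypothetical helper; the 16-byte alignment is only an illustrative assumption):

```swift
// Bytes per row for a 4-bytes-per-pixel (BGRA) buffer, rounded up to a
// given alignment. Real pixel buffers commonly pad rows like this, so
// never assume bytesPerRow == width * 4.
func paddedBytesPerRow(width: Int, bytesPerPixel: Int = 4, alignment: Int = 16) -> Int {
    let raw = width * bytesPerPixel
    return (raw + alignment - 1) / alignment * alignment
}

// width 30: 30 * 4 = 120 bytes, padded up to 128.
let row = paddedBytesPerRow(width: 30)
```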