How can I reduce the file size of a video created with UIImagePickerController?

How can I reduce the file size of a video created with UIImagePickerController?


With AVCaptureSession and AVAssetWriter you can set the compression settings as follows:

NSDictionary *settings = @{AVVideoCodecKey:AVVideoCodecH264,
                           AVVideoWidthKey:@(video_width),
                           AVVideoHeightKey:@(video_height),
                           AVVideoCompressionPropertiesKey:
                               @{AVVideoAverageBitRateKey:@(desired_bitrate),
                                 AVVideoProfileLevelKey:AVVideoProfileLevelH264Main31, /* Or whatever profile & level you wish to use */
                                 AVVideoMaxKeyFrameIntervalKey:@(desired_keyframe_interval)}};

AVAssetWriterInput* writer_input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
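For context, here is a minimal sketch of attaching that input to a writer; the outputURL and error handling are assumptions you would fill in for your own pipeline:

// Minimal sketch: hook the configured input up to an AVAssetWriter.
// outputURL is a placeholder for wherever you want the compressed file written.
NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
if (writer && [writer canAddInput:writer_input]) {
    [writer addInput:writer_input];
}
// Sample buffers (e.g. from an AVCaptureVideoDataOutput delegate) are then
// appended to writer_input, and the writer produces the compressed file.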

Edit: I guess if you insist on using the UIImagePicker to create the movie in the first place, you'll have to use AVAssetReader's copyNextSampleBuffer and AVAssetWriter's appendSampleBuffer methods to do the transcode.


yourfriendzak is right: setting cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow; isn't the solution here. The solution is to reduce the data rate, or bit rate, which is what jgh is suggesting.

I have three methods. The first is the UIImagePickerController delegate method:

// For responding to the user accepting a newly-captured picture or movie
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Handle movie capture
    NSURL *movieURL = [info objectForKey:UIImagePickerControllerMediaURL];

    NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[self randomString]] stringByAppendingString:@".mp4"]];

    // Compress movie first
    [self convertVideoToLowQuailtyWithInputURL:movieURL outputURL:uploadURL];
}
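randomString is not a UIKit method; it is just a helper that produces a unique temporary file name. A minimal sketch of such a helper, assuming a UUID-based name is acceptable, could look like this:

// Assumed helper: any unique string works; NSUUID is one simple option.
- (NSString *)randomString
{
    return [[NSUUID UUID] UUIDString];
}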

The second method converts the video to a lower bitrate, not to lower dimensions.

- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL outputURL:(NSURL *)outputURL
{
    // setup video writer
    AVAsset *videoAsset = [[AVURLAsset alloc] initWithURL:inputURL options:nil];

    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    CGSize videoSize = videoTrack.naturalSize;

    NSDictionary *videoWriterCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:1250000], AVVideoAverageBitRateKey, nil];

    NSDictionary *videoWriterSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, videoWriterCompressionSettings, AVVideoCompressionPropertiesKey, [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey, [NSNumber numberWithFloat:videoSize.height], AVVideoHeightKey, nil];

    AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoWriterSettings];

    videoWriterInput.expectsMediaDataInRealTime = YES;

    videoWriterInput.transform = videoTrack.preferredTransform;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:nil];

    [videoWriter addInput:videoWriterInput];

    // setup video reader
    NSDictionary *videoReaderSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoReaderSettings];

    AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];

    [videoReader addOutput:videoReaderOutput];

    // setup audio writer
    AVAssetWriterInput *audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];

    audioWriterInput.expectsMediaDataInRealTime = NO;

    [videoWriter addInput:audioWriterInput];

    // setup audio reader
    AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    AVAssetReaderOutput *audioReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];

    AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:videoAsset error:nil];

    [audioReader addOutput:audioReaderOutput];

    [videoWriter startWriting];

    // start writing from video reader
    [videoReader startReading];

    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue1", NULL);

    [videoWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:^{
        while ([videoWriterInput isReadyForMoreMediaData]) {
            CMSampleBufferRef sampleBuffer;
            if ([videoReader status] == AVAssetReaderStatusReading &&
                (sampleBuffer = [videoReaderOutput copyNextSampleBuffer])) {
                [videoWriterInput appendSampleBuffer:sampleBuffer];
                CFRelease(sampleBuffer);
            }
            else {
                [videoWriterInput markAsFinished];
                if ([videoReader status] == AVAssetReaderStatusCompleted) {
                    // start writing from audio reader
                    [audioReader startReading];
                    [videoWriter startSessionAtSourceTime:kCMTimeZero];
                    dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue2", NULL);
                    [audioWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:^{
                        while (audioWriterInput.readyForMoreMediaData) {
                            CMSampleBufferRef sampleBuffer;
                            if ([audioReader status] == AVAssetReaderStatusReading &&
                                (sampleBuffer = [audioReaderOutput copyNextSampleBuffer])) {
                                [audioWriterInput appendSampleBuffer:sampleBuffer];
                                CFRelease(sampleBuffer);
                            }
                            else {
                                [audioWriterInput markAsFinished];
                                if ([audioReader status] == AVAssetReaderStatusCompleted) {
                                    [videoWriter finishWritingWithCompletionHandler:^(){
                                        [self sendMovieFileAtURL:outputURL];
                                    }];
                                }
                            }
                        }
                    }];
                }
            }
        }
    }];
}

When successful, the third method, sendMovieFileAtURL:, is called; it uploads the compressed video at outputURL to the server.
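The body of sendMovieFileAtURL: depends on your server; a rough sketch of such an upload (the endpoint URL below is just a placeholder) could use NSURLSession:

// Sketch only: replace the URL and request details with your own upload endpoint.
- (void)sendMovieFileAtURL:(NSURL *)outputURL
{
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
                                    [NSURL URLWithString:@"https://example.com/upload"]];
    request.HTTPMethod = @"POST";

    NSURLSessionUploadTask *task =
        [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                   fromFile:outputURL
                                          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            // Handle the server response / error here.
        }];
    [task resume];
}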

Note that I've enabled ARC in my project, so you will have to add some release calls if ARC is turned off in yours.
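For example, under manual reference counting the alloc/init'd objects in the conversion method would need balancing releases, roughly like this (a sketch; the blocks retain what they capture, so releasing the local references after requestMediaDataWhenReadyOnQueue: has been called is safe):

// Non-ARC only: balance the alloc/init calls in convertVideoToLowQuailtyWithInputURL:outputURL:
[videoAsset release];
[videoReaderOutput release];
[videoReader release];
[videoWriter release];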


UIImagePickerController has a videoQuality property of type UIImagePickerControllerQualityType, which is applied both to recorded movies and to the ones you pick from the library (that happens during the transcoding phase).
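For example, a minimal sketch of setting it on a camera picker (the variable name and the chosen quality are assumptions to adjust for your app):

#import <MobileCoreServices/MobileCoreServices.h>   // for kUTTypeMovie

UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraUI.mediaTypes = @[(NSString *)kUTTypeMovie];
cameraUI.videoQuality = UIImagePickerControllerQualityTypeMedium;   // or ...TypeLow, ...Type640x480, etc.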

Or if you have to deal with an existing asset (file) that is not from the library, you might want to look at these presets:

AVAssetExportPresetLowQuality
AVAssetExportPresetMediumQuality
AVAssetExportPresetHighestQuality

and

AVAssetExportPreset640x480
AVAssetExportPreset960x540
AVAssetExportPreset1280x720
AVAssetExportPreset1920x1080

and pass one of them to the initializer of the AVAssetExportSession class. I'm afraid you have to experiment with those for your particular content, as there is no precise description of what "low" and "medium" quality mean, or which quality will be used for the 640x480 or the 1280x720 preset. The only useful information in the docs is the following:

Export Preset Names for Device-Appropriate QuickTime Files

You use these export options to produce QuickTime .mov files with video size appropriate to the current device.

The export will not scale the video up from a smaller size. Video is compressed using H.264; audio is compressed using AAC.

Some devices cannot support some sizes.
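For reference, a minimal sketch of exporting with one of those presets (the input/output URLs and the preset choice are placeholders to adjust for your content):

// Sketch: inputURL points at the recorded movie, outputURL at where the compressed copy should go.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetMediumQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // Use the compressed file at outputURL.
    }
}];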

Aside from those presets, I do not remember having precise control over quality, such as frame rate or freeform size, etc., in AVFoundation.

I was wrong, there is a way to tweak all the parameters you mention, and it is AVAssetWriter indeed: How do I export UIImage array as a movie?

btw, here is a link to a similar question with a code sample: iPhone:Programmatically compressing recorded video to share?