iOS alternative to QTMovieLayer that has non-nil `contents`?
To get video/audio sample buffers from a capture session:
- Create an AVCaptureVideoDataOutput (or AVCaptureAudioDataOutput for audio) object.
- Conform one of your classes to AVCaptureVideoDataOutputSampleBufferDelegate.
- Add the AVCaptureVideoDataOutput to your AVCaptureSession.
- Implement the protocol methods. As frames are captured, you will receive a CMSampleBufferRef containing each video/audio frame in the captureOutput:didOutputSampleBuffer:fromConnection: callback of AVCaptureVideoDataOutputSampleBufferDelegate.
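A minimal sketch of the steps above in Swift (the class and queue names are my own; error handling is abbreviated):

```swift
import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "video.frames")

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        // Deliver sample buffers to this object on a background queue.
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Called once per captured frame with a CMSampleBuffer containing
    // the pixel data, timestamps, and format description.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Process sampleBuffer here.
    }
}
```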
A CMSampleBufferRef contains the media frame data, timestamp information, and the format description of the media. You can then display a frame by converting it to a CGImageRef and drawing it in any view (or assigning it to a layer's contents).
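One way to do that conversion, sketched here via Core Image (other routes, e.g. locking the pixel buffer and drawing with Core Graphics directly, work too):

```swift
import AVFoundation
import CoreImage

// Converts a video sample buffer into a CGImage suitable for
// layer.contents or drawing in a view.
func cgImage(from sampleBuffer: CMSampleBuffer) -> CGImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext() // in practice, reuse one context across frames
    return context.createCGImage(ciImage, from: ciImage.extent)
}
```

Assigning the result to a CALayer's `contents` gives you exactly the non-nil-contents behavior the question asks about.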
You can also specify the desired pixel format for the delivered frames via the AVCaptureVideoDataOutput.videoSettings property.
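For example, to request uncompressed 32-bit BGRA frames (a common choice when converting to CGImage):

```swift
import AVFoundation

let output = AVCaptureVideoDataOutput()

// Ask the output to deliver uncompressed BGRA pixel buffers.
output.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
```

BGRA avoids a pixel-format conversion step when the buffers are handed to Core Graphics or Core Image.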