
How to set decode pixel format in libavcodec?


FFmpeg's AVCodec instances (the static decoder "factory" objects) each define an array of pixel formats that they support, the pix_fmts field, terminated by the value -1 (AV_PIX_FMT_NONE).
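
For example, you can walk that array like this (a minimal sketch; it uses the current AV_PIX_FMT_* names, which older headers spell PixelFormat/PIX_FMT_*):

    #include <libavcodec/avcodec.h>
    #include <libavutil/pixdesc.h>
    #include <stdio.h>

    /* Print every pixel format a decoder advertises.  The array is
     * terminated by AV_PIX_FMT_NONE, which is defined as -1. */
    static void print_supported_formats(const AVCodec *codec)
    {
        const enum AVPixelFormat *p;
        if (!codec->pix_fmts)
            return; /* some codecs don't publish a format list */
        for (p = codec->pix_fmts; *p != AV_PIX_FMT_NONE; p++)
            printf("%s\n", av_get_pix_fmt_name(*p));
    }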

The AVCodecContext (decoder instance) objects have a callback named get_format: a function pointer stored in that structure.
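
Its declaration in avcodec.h looks like this (modern spelling of the enum; older versions use enum PixelFormat):

    enum AVPixelFormat (*get_format)(struct AVCodecContext *s,
                                     const enum AVPixelFormat *fmt);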

This callback is invoked at some point during codec initialization with the AVCodec factory object's array of supported formats, and it is supposed to choose one of the formats from that array (kind of like "pick a card, any card") and return that value. The default implementation of the get_format callback is a function called avcodec_default_get_format (installed by avcodec_get_context_defaults2). It implements the "pick a format" logic quite simply: it chooses the first element of the array that isn't a hardware-accel-only pixel format.
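
In other words, the default behaves roughly like this (a sketch of the behavior described above, not the library's actual source):

    #include <libavutil/pixdesc.h>

    static enum AVPixelFormat default_like_get_format(struct AVCodecContext *s,
                                                      const enum AVPixelFormat *fmt)
    {
        const enum AVPixelFormat *p;
        for (p = fmt; *p != AV_PIX_FMT_NONE; p++) {
            const AVPixFmtDescriptor *desc = av_pix_fmt_desc_get(*p);
            if (!(desc->flags & AV_PIX_FMT_FLAG_HWACCEL))
                return *p; /* first software format wins */
        }
        return fmt[0]; /* nothing but hwaccel formats on offer */
    }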

If you want the codec to work with a different pixel format, install your own get_format callback in the context object. The callback must still return one of the values from the array (like choosing from a menu); it cannot return an arbitrary value, because the codec only supports the formats it lists there.
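
Something along these lines (a sketch; AV_PIX_FMT_YUV420P stands in for whatever format you actually want):

    #include <libavcodec/avcodec.h>

    static enum AVPixelFormat my_get_format(struct AVCodecContext *s,
                                            const enum AVPixelFormat *fmt)
    {
        const enum AVPixelFormat *p;
        for (p = fmt; *p != AV_PIX_FMT_NONE; p++)
            if (*p == AV_PIX_FMT_YUV420P)
                return *p; /* our preferred format is on the menu */
        /* not offered: fall back to the stock selection logic */
        return avcodec_default_get_format(s, fmt);
    }

    /* install it before opening the codec */
    ctx->get_format = my_get_format;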

Walk the array of available formats and pick the best one. If you're lucky, it's the exact one you actually want, and sws_scale won't have to do a pixel format conversion. (If, additionally, you don't ask it to scale or crop the picture, sws_scale should recognize that the conversion is a no-op.)
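
libavcodec can even do the walking for you. avcodec_find_best_pix_fmt_of_list ranks a list of candidate formats by conversion loss relative to a reference format, so passing your target format as the reference is a rough proxy for "closest match in the codec's menu". A sketch, assuming your target is AV_PIX_FMT_RGB24:

    #include <libavcodec/avcodec.h>

    static enum AVPixelFormat pick_closest(struct AVCodecContext *s,
                                           const enum AVPixelFormat *fmt)
    {
        int loss = 0;
        /* score each entry in fmt against the format we'd eventually hand
         * to sws_scale, and return the one with the least conversion loss */
        return avcodec_find_best_pix_fmt_of_list(fmt, AV_PIX_FMT_RGB24, 0, &loss);
    }

The loss value it fills in tells you what the eventual conversion would sacrifice (chroma resolution, bit depth, alpha, and so on), which can be handy to log while debugging.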