Re: [PATCH v2 3/8] media: videobuf2: Add a module param to limit vb2 queue buffer storage

From: Hans Verkuil
Date: Wed May 31 2023 - 04:30:48 EST


On 5/31/23 10:03, Laurent Pinchart wrote:
> On Wed, May 31, 2023 at 08:36:59AM +0200, Hans Verkuil wrote:
>> On 21/03/2023 11:28, Benjamin Gaignard wrote:
>>> Add a module parameter, "max_vb_buffer_per_queue", to be able to limit
>>> the number of vb2 buffers stored in a queue.
>>>
>>> Signed-off-by: Benjamin Gaignard <benjamin.gaignard@xxxxxxxxxxxxx>
>>> ---
>>> drivers/media/common/videobuf2/videobuf2-core.c | 15 +++------------
>>> include/media/videobuf2-core.h                  | 11 +++++++++--
>>> 2 files changed, 12 insertions(+), 14 deletions(-)
>>>
>>> diff --git a/drivers/media/common/videobuf2/videobuf2-core.c b/drivers/media/common/videobuf2/videobuf2-core.c
>>> index ae9d72f4d181..f4da917ccf3f 100644
>>> --- a/drivers/media/common/videobuf2/videobuf2-core.c
>>> +++ b/drivers/media/common/videobuf2/videobuf2-core.c
>>> @@ -34,6 +34,8 @@
>>> static int debug;
>>> module_param(debug, int, 0644);
>>>
>>> +module_param(max_vb_buffer_per_queue, ulong, 0644);
>>
>> There is no MODULE_PARM_DESC here? Please add. I see it is not there for
>> the debug param either; it should be added for that as well.
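
Something like this is what I have in mind (the description strings are
just placeholders, and I'm assuming the parameter is declared as an
unsigned long with whatever default the patch picks):

  static int debug;
  module_param(debug, int, 0644);
  MODULE_PARM_DESC(debug, "debug level for videobuf2 messages");

  static unsigned long max_vb_buffer_per_queue = 32; /* placeholder default */
  module_param(max_vb_buffer_per_queue, ulong, 0644);
  MODULE_PARM_DESC(max_vb_buffer_per_queue,
                   "maximum number of buffers that can be allocated per vb2 queue");
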
>
> Would this be the right time to consider resource accounting in V4L2 for
> buffers? Having a module parameter doesn't sound very useful: an
> application could easily allocate more buffers by using buffer orphaning
> (allocating buffers, exporting them as dmabuf objects, and freeing them,
> which leaves the memory allocated). Repeating allocation cycles up to
> max_vb_buffer_per_queue would allow allocating an unbounded number of
> buffers, using all the available system memory. I'd rather not add a
> module parameter that only gives the impression of some kind of safety
> without actually providing any value.
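
To spell out the orphaning sequence Laurent describes, it is roughly this
from userspace (a sketch only, error handling omitted):

  #include <sys/ioctl.h>
  #include <linux/videodev2.h>

  void orphan_buffers(int fd, unsigned int count)
  {
          struct v4l2_requestbuffers reqbufs = {
                  .count = count,
                  .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
                  .memory = V4L2_MEMORY_MMAP,
          };
          unsigned int i;

          /* allocate buffers */
          ioctl(fd, VIDIOC_REQBUFS, &reqbufs);

          /* export each buffer as a dmabuf fd and keep the fd open */
          for (i = 0; i < reqbufs.count; i++) {
                  struct v4l2_exportbuffer expbuf = {
                          .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
                          .index = i,
                  };

                  ioctl(fd, VIDIOC_EXPBUF, &expbuf);
                  /* expbuf.fd now keeps the buffer memory pinned */
          }

          /* free the queue's buffers: the exported dmabufs keep the
           * memory allocated, so each pass through this function
           * consumes more memory without the queue ever holding more
           * than 'count' buffers
           */
          reqbufs.count = 0;
          ioctl(fd, VIDIOC_REQBUFS, &reqbufs);
  }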

Does dmabuf itself provide some accounting mechanism? Just wondering.

More specifically to V4L2: I'm not so sure about this module parameter either.
It makes sense to have a check somewhere against ridiculous values (e.g.
allocating MAXINT buffers), but that can be a define as well. Otherwise
I am fine with allowing applications to allocate buffers until memory
is full.

The question really is: what is this parameter supposed to do? The only
thing it does is sanitize unlikely inputs, and for that I would prefer a
define, to be honest.
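
Something along these lines (the name and value are placeholders, not a
concrete proposal):

  /* sanity cap on the number of buffers per queue */
  #define VB2_MAX_BUFFERS_PER_QUEUE 1024

and then clamp the requested count in vb2_core_reqbufs() and
vb2_core_create_bufs():

  num_buffers = min_t(unsigned int, num_buffers,
                      VB2_MAX_BUFFERS_PER_QUEUE);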

I think it is perfectly fine for users to try to request more buffers than
memory allows. It will just fail in that case, which is not a problem.

And if an application is doing silly things like buffer orphaning, then so
what? Is that any different from allocating memory and not freeing it?
Eventually it will run out of memory and crash, which is normal.

Regards,

Hans