Re: [PATCH v5 0/3] Add "link bpc" DRM property
From: Nicolas Frattaroli
Date: Wed Apr 01 2026 - 08:53:44 EST
Chiming in here to basically agree with Daniel and underline his point with
some evidence.
On Wednesday, 1 April 2026 10:40:15 Central European Summer Time Daniel Stone wrote:
> Hi Harry,
>
> On Tue, 31 Mar 2026 at 18:47, Harry Wentland <harry.wentland@xxxxxxx> wrote:
> > On 2026-03-31 08:50, Pekka Paalanen wrote:
> > > People who care about the picture quality down to these levels will
> > > likely want to know and learn about these techniques. They may also
> > > want to explicitly control them.
> > >
> > > In time, when these have been used enough in the wild, compositor
> > > developers will learn what makes a difference and what does not, so
> > > they will adjust their reporting to end users. The most important thing
> > > for the kernel is to offer an unambiguous and stable UAPI for these.
> > >
> > > Policy belongs in userspace.
> >
> > I don't like this as a blanket statement. There is a lot of policy that
> > intersects with HW nuances, whether it comes to power or otherwise.
> > Taking away driver vendor's abilities to optimize will hurt the Linux
> > ecosystem in the long run.
> >
> > IMO this needs to be evaluated on a case by case basis. There are
> > many places where it does make sense to give userspace a greater
> > say on policy, but we don't want to push driver (HW specific) logic
> > up into userspace.
>
> It's not something that's _just_ specific to a particular
> display-controller manufacturer or a particular IP generation though.
> It very much depends on the usecase.
>
> If you have a laptop and you're trying to give a presentation,
> applying dithering and/or DSC makes a lot of sense: you don't want
> your battery to die, and the projector's probably going to obliterate
> > half the colour anyway, so might as well go for the most efficient
> thing.
>
> If your laptop is plugged into your big display at home to write code,
> applying DSC to cram the highest possible resolution + refresh in
> would make sense. But if dithering only results in a marginal power
> saving, and your laptop is charging anyway - why bother degrading
> visual acuity?
This suggests to me that the meaning of "bpc" here should either be
reduced by any compression applied (of which I see dithering as a
primitive variant) or left at the uncompressed bpc regardless of
compression. I'm leaning towards the latter: 10bpc lossily compressed to
8bpc is likely a better choice for visual clarity than plain 8bpc, so
making the two look identical to userspace would lead to some odd choices.
At the same time, a separate way for userspace to learn of any compression
techniques applied to the output would disambiguate things for those
compositors that really care, and would spare us from making subjective
judgement calls on anyone's behalf.
With regard to DSC, for example, a vendor's decision to enable it by
default does not necessarily give us a good precedent for which side to
err on. amdgpu flips on DSC when it doesn't have to, and this has rubbed
some people the wrong way: https://gitlab.freedesktop.org/drm/amd/-/work_items/2043
The goal isn't so much to push driver logic into userspace, but to give
userspace a view into what the driver did, so that it can decide whether
it's happy or wants to try again differently. This means userspace isn't
ossifying on a set of parameters that made sense a decade ago; drivers can
still modify their decisions as they develop and hardware gains new
techniques.
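To illustrate what I mean, here's a rough sketch of the kind of
post-commit decision a compositor could make if it could read back both
the achieved link bpc and whatever compression got applied. The property
names and the heuristic are entirely made up for this example, not
proposed UAPI:

```c
#include <assert.h>
#include <string.h>

/*
 * Hypothetical sketch only: "link_bpc" and "compression" stand in for
 * whatever read-only feedback properties the kernel might expose, and
 * the heuristic is invented to illustrate the feedback loop, not taken
 * from any real driver or compositor.
 */
enum link_verdict {
	LINK_ACCEPT,             /* outcome is fine, keep it */
	LINK_RETRY,              /* try a different configuration */
	LINK_RETRY_UNCOMPRESSED, /* same depth, but without compression */
};

static enum link_verdict
evaluate_link(unsigned int link_bpc, const char *compression,
	      unsigned int wanted_bpc, int on_battery)
{
	int compressed = compression && strcmp(compression, "none") != 0;

	/* Full depth with no compression: nothing to complain about. */
	if (link_bpc >= wanted_bpc && !compressed)
		return LINK_ACCEPT;

	/*
	 * 10bpc lossily compressed likely still beats plain 8bpc, so
	 * accept it when power matters; otherwise see whether the link
	 * can do better without compression.
	 */
	if (link_bpc >= wanted_bpc && compressed)
		return on_battery ? LINK_ACCEPT : LINK_RETRY_UNCOMPRESSED;

	return LINK_RETRY;
}
```

The point is that the policy lives in userspace, but it can only exist at
all if the driver reports what it actually did.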
>
> If you're a media player, then you're in a good position to know what
> would be good to go over the wire, because you know (& are possibly in
> control of) the format of what comes in in the first place.
>
> But everyone's tradeoffs are different, which is why sometimes the
> best choice is to ultimately leave it up to the user. If you dig into
> any media playback device (STBs running Android TV, Apple TV, Fire TV,
> et al), you'll see that all of them ultimately allow overrides for bpc
> / colour model / subsampling / etc. Those aren't just there for fun,
> but because they are usable to real people, and it's not possible for
> Amlogic or MediaTek or Rockchip or whoever to statically decide that a
> certain configuration is going to be best everywhere.
>
> Right now we have drivers making magic per-vendor/SKU decisions,
> without even so much as a feedback mechanism to userspace (unless you
> count debugfs, maybe) so it can even figure out what's going on, let
> alone control it. To properly support some of those usecases,
> userspace needs to be able to control what goes out on the wire, but
> as a first step, it just wants to be informed of what the driver even
> did with the properties we gave it.
>
> The end game of this isn't Weston logging something to stdout, it's to
> surface things to userspace so it can guide the kernel into making a
> good decision for usecases that may not be ones the silicon vendor
> decided were 'probably the best thing' however many years ago.
>
> Cheers,
> Daniel
>
Kind regards,
Nicolas Frattaroli