Re: [PATCH v5 0/3] Add "link bpc" DRM property
From: Ville Syrjälä
Date: Wed Apr 01 2026 - 07:29:25 EST
On Wed, Apr 01, 2026 at 09:40:15AM +0100, Daniel Stone wrote:
> Hi Harry,
>
> On Tue, 31 Mar 2026 at 18:47, Harry Wentland <harry.wentland@xxxxxxx> wrote:
> > On 2026-03-31 08:50, Pekka Paalanen wrote:
> > > People who care about the picture quality down to these levels will
> > > likely want to know and learn about these techniques. They may also
> > > want to explicitly control them.
> > >
> > > In time, when these have been used enough in the wild, compositor
> > > developers will learn what makes a difference and what does not, so
> > > they will adjust their reporting to end users. The most important thing
> > > for the kernel is to offer an unambiguous and stable UAPI for these.
> > >
> > > Policy belongs in userspace.
> >
> > I don't like this as a blanket statement. There is a lot of policy that
> > intersects with HW nuances, whether it comes to power or otherwise.
> > Taking away driver vendors' ability to optimize will hurt the Linux
> > ecosystem in the long run.
> >
> > IMO this needs to be evaluated on a case by case basis. There are
> > many places where it does make sense to give userspace a greater
> > say on policy, but we don't want to push driver (HW specific) logic
> > up into userspace.
>
> It's not something that's _just_ specific to a particular
> display-controller manufacturer or a particular IP generation though.
> It very much depends on the usecase.
>
> If you have a laptop and you're trying to give a presentation,
> applying dithering and/or DSC makes a lot of sense: you don't want
> your battery to die, and the projector's probably going to obliterate
> half the colour anyway, so you might as well go for the most efficient
> thing.
>
> If your laptop is plugged into your big display at home to write code,
> applying DSC to cram the highest possible resolution + refresh in
> would make sense. But if dithering only results in a marginal power
> saving, and your laptop is charging anyway - why bother degrading
> visual quality?
>
> If you're a media player, then you're in a good position to know what
> would be good to go over the wire, because you know (& are possibly in
> control of) the format of what comes in in the first place.
>
> But everyone's tradeoffs are different, which is why sometimes the
> best choice is to ultimately leave it up to the user. If you dig into
> any media playback device (STBs running Android TV, Apple TV, Fire TV,
> et al), you'll see that all of them ultimately allow overrides for bpc
> / colour model / subsampling / etc. Those aren't just there for fun,
> but because they are useful to real people, and it's not possible for
> Amlogic or MediaTek or Rockchip or whoever to statically decide that a
> certain configuration is going to be best everywhere.
>
> Right now we have drivers making magic per-vendor/SKU decisions,
> without even so much as a feedback mechanism to userspace (unless you
> count debugfs, maybe) so it can even figure out what's going on, let
> alone control it. To properly support some of those usecases,
> userspace needs to be able to control what goes out on the wire, but
> as a first step, it just wants to be informed of what the driver even
> did with the properties we gave it.
>
> The end game of this isn't Weston logging something to stdout, it's to
> surface things to userspace so it can guide the kernel into making a
> good decision for usecases that may not be the ones the silicon vendor
> decided were 'probably the best thing' however many years ago.

I think the problem here is that no one even tried to make a
real userspace implementation. So it's very hard to judge if this
new property is actually usable in the end, or if it will just end
up as historical baggage that we have to carry around forever.
IMO just having userspace log what the kernel said does not fulfill
the "the userspace implementation must be ready before new drm uapi
is merged" requirement.

--
Ville Syrjälä
Intel