RE: [PATCHv2 6/6] ASoC: da7213: Add default clock handling
From: Adam Thomson
Date: Wed Nov 27 2019 - 06:33:02 EST
On 26 November 2019 17:51, Mark Brown wrote:
> On Tue, Nov 26, 2019 at 05:39:45PM +0000, Adam Thomson wrote:
> > On 26 November 2019 17:09, Mark Brown wrote:
>
> > > If we're special casing simple-card we're doing it wrong - there's
> > > nothing stopping any other machine driver behaving in the same way.
>
> > Ok, what's being proposed here is for the codec to automatically control the PLL
> > rather than leaving it to the machine driver as is the case right now. In the
> > possible scenario where this is done using a card level widget to enable/disable
>
> Wasn't the proposal to add a new mode where the machine driver *could*
> choose to delegate PLL selection to the CODEC driver rather than do so
> unconditionally?
Yes, but by default the assumption is that the codec will handle things
automatically until the machine driver calls the relevant PLL config code...
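To make that concrete, the rough shape I have in mind is a flag in the codec's
private data which permits the automatic handling, cleared the moment the
machine driver makes an explicit PLL call. Purely a sketch against the existing
da7213.c: 'auto_pll' would be a new bool in struct da7213_priv, the
da7213_configure_pll*() helpers are made-up names standing in for the real
register programming, and the other bias levels are trimmed:

static int da7213_set_dai_pll(struct snd_soc_dai *codec_dai, int pll_id,
			      int source, unsigned int fref, unsigned int fout)
{
	struct snd_soc_component *component = codec_dai->component;
	struct da7213_priv *da7213 = snd_soc_component_get_drvdata(component);

	/* An explicit call from the machine driver means it now owns the
	 * PLL, so disable any automatic handling from this point on. */
	da7213->auto_pll = false;

	/* existing PLL register programming, wrapped here for brevity */
	return da7213_configure_pll(component, pll_id, source, fref, fout);
}

static int da7213_set_bias_level(struct snd_soc_component *component,
				 enum snd_soc_bias_level level)
{
	struct da7213_priv *da7213 = snd_soc_component_get_drvdata(component);

	switch (level) {
	case SND_SOC_BIAS_PREPARE:
		/* Only auto-configure the PLL if nobody has claimed it */
		if (da7213->auto_pll)
			da7213_configure_pll_auto(component);
		break;
	default:
		/* other bias levels unchanged, trimmed for brevity */
		break;
	}

	return 0;
}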
>
> > the PLL, we will conflict with that using the currently suggested approach
> > for the da7213 driver, albeit with no real consequence other than
> > configuring the PLL twice the first time a stream is started. It's a case
> > of how to determine who's in control of the PLL here: machine driver or
> > codec?
>
> The patch looked like the idea was that as soon as the machine driver
> configures the PLL, the CODEC driver will stop touching it, which looked
> like a reasonable way of doing it. You could also do it with an explicit
> SYSCLK value (which would have to be 0 for generic machine drivers to
> pick it up), but this works just as well and may even be better. Perhaps
> I misread it though.
...so yes, you're right in your comment here. However, the bias_level code is
called before the DAPM paths are sequenced. If a dedicated machine driver
wanted to control the PLL via a DAPM widget at the card level (which is done
for other codecs, for example on Intel platforms), the codec's bias_level()
code to auto-configure the PLL would always run the very first time a path is
enabled, before the relevant card-level DAPM widget is triggered, so you'd get
two configurations of the PLL in succession. I don't expect that would cause
issues, but it's not ideal. The other approach would be for the machine driver
to default the PLL config from something like a dai_link init function, using
the codec's PLL function, which then rules out any subsequent auto
configuration. However, that requires the person implementing the machine
driver to know about this behaviour.
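For reference, that second option would look something like the below in a
dedicated machine driver. This is purely illustrative: the rates are example
numbers, the function name is made up, and I'm quoting the DA7213 defines
(from the codec header) from memory, so they'd need checking against da7213.h:

static int my_card_da7213_dai_init(struct snd_soc_pcm_runtime *rtd)
{
	struct snd_soc_dai *codec_dai = rtd->codec_dai;
	int ret;

	/* Example clocking only: 19.2MHz MCLK in, sysclk from the PLL.
	 * Doing this once at link init means the codec never attempts its
	 * own automatic PLL configuration afterwards. */
	ret = snd_soc_dai_set_sysclk(codec_dai, DA7213_CLKSRC_MCLK,
				     19200000, SND_SOC_CLOCK_IN);
	if (ret < 0)
		return ret;

	return snd_soc_dai_set_pll(codec_dai, 0, DA7213_SYSCLK_PLL,
				   19200000, DA7213_PLL_FREQ_OUT_98304000);
}

hooked up via .init = my_card_da7213_dai_init on the relevant dai_link. Simple
enough, but as I say the card author has to know to do it.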
As I said, it's a small thing and requires a specific use-case to occur, but
having the PLL configured twice for the very first stream in that scenario
seems messy. Regarding the SYSCLK approach you mention, I'm not clear how that
would work, so I'm obviously missing something. If we had some init stage
indication that auto PLL was required, though, then we're golden.
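Just so I'm not guessing completely blind, is the SYSCLK idea roughly this,
i.e. the codec's set_sysclk treating a frequency of 0 as "no opinion, keep the
automatic handling"? Again only a sketch, reusing the illustrative auto_pll
flag from above:

static int da7213_set_dai_sysclk(struct snd_soc_dai *codec_dai,
				 int clk_id, unsigned int freq, int dir)
{
	struct snd_soc_component *component = codec_dai->component;
	struct da7213_priv *da7213 = snd_soc_component_get_drvdata(component);

	/* freq == 0 from a generic card leaves automatic PLL handling
	 * enabled; a real rate means the machine driver takes control. */
	da7213->auto_pll = (freq == 0);
	if (freq == 0)
		return 0;

	/* existing MCLK rate / clock source handling continues as today */
	da7213->mclk_rate = freq;

	return 0;
}

If that's roughly what you meant then I can see how it could serve as the init
stage indication I was after.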