On Thu, Jun 06, 2013 at 01:08:56PM +0300, Illia Smyrnov wrote:
> On 06/05/2013 03:03 PM, Mark Brown wrote:
> > Why is this defined for slaves? Surely the size of the FIFO in the
> > controller is a property of the controller not the slave?
> According to the OMAP TRM [1], the FIFO buffer can be used by only one
> channel at a time. If several channels are selected and several FIFO
> enable bit fields are set to 1, the controller forces the buffer not
> to be used.
The controller ought to be able to figure this out for itself. As a
first pass, just grabbing the FIFO on a first-come, first-served basis
will probably work well most of the time; the device would have to be
very active for it to constantly be doing transfers on all channels.
If there's more contention than that we probably ought to be looking at
how we handle this in general; it seems like we'd have more problems
than just the FIFO to worry about.
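
To make that concrete, a minimal sketch of first-come, first-served FIFO
ownership might look something like the below. The structure, field and
function names (fifo_owner, mcspi_try_claim_fifo() and so on) are made up
for illustration rather than taken from spi-omap2-mcspi:

#include <linux/types.h>
#include <linux/spinlock.h>

/* Illustrative only: track which channel currently owns the shared FIFO. */
struct omap2_mcspi_fifo_state {
	spinlock_t	lock;
	int		owner;		/* channel using the FIFO, -1 when free */
};

static bool mcspi_try_claim_fifo(struct omap2_mcspi_fifo_state *st, int channel)
{
	bool claimed = false;

	spin_lock(&st->lock);
	if (st->owner < 0 || st->owner == channel) {
		st->owner = channel;	/* FIFO free (or already ours): take it */
		claimed = true;
	}
	spin_unlock(&st->lock);

	return claimed;	/* on false, fall back to a non-FIFO transfer */
}

static void mcspi_release_fifo(struct omap2_mcspi_fifo_state *st, int channel)
{
	spin_lock(&st->lock);
	if (st->owner == channel)
		st->owner = -1;
	spin_unlock(&st->lock);
}

A transfer that fails to claim the FIFO would simply run in the existing
non-FIFO mode, so contention only costs performance, never correctness.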
> If there are several slaves on the controller we must select which
> of the slaves will use the FIFO for SPI transfers. Also, the optimal FIFO
A single controller is only going to be able to talk to one slave at
once; everything on the bus except chip select is shared.
> size is heavily dependent on the SPI transfer lengths specific to a
> certain slave.
The transfer length doesn't seem like something that we want to be
encoding in DT, particularly not indirectly: it is obviously readily
available at runtime, can vary at runtime (e.g. a firmware download may
do large transfers on a device that only does small transfers most of
the time), and is something that updates to the drivers could change.
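
Deciding this at runtime is also straightforward since the length is right
there in the spi_transfer; a rough sketch (the threshold and helper name
are invented for the example, not taken from the driver):

#include <linux/types.h>
#include <linux/spi/spi.h>

/* Arbitrary example threshold; the real cut-off would need measuring. */
#define MCSPI_FIFO_MIN_LEN	16	/* bytes */

/* Use the FIFO only when a transfer is long enough to repay the setup cost. */
static bool mcspi_transfer_wants_fifo(const struct spi_transfer *t)
{
	return t->len >= MCSPI_FIFO_MIN_LEN;
}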