Re: [PATCH 1/2] usb: chipidea: add xilinx zynq platform data

From: Nathan Sullivan
Date: Thu Aug 27 2015 - 10:33:31 EST


On Thu, Aug 27, 2015 at 01:11:30PM +0530, punnaiah choudary kalluri wrote:
> Hi,
>
> On Thu, Aug 27, 2015 at 10:03 AM, Peter Chen <peter.chen@xxxxxxxxxxxxx> wrote:
> > On Thu, Aug 27, 2015 at 10:59:22AM +0530, sundeep subbaraya wrote:
> >> Hi,
> >>
> >>
> >> On Wed, Aug 26, 2015 at 8:57 PM, Nathan Sullivan <nathan.sullivan@xxxxxx> wrote:
> >> > The Xilinx Zynq udc does not need the CI_HDRC_DISABLE_STREAMING flag,
> >> > unlike the default platform data. Add platform data specific to the
> >> > Zynq udc.
> >> >
> >> > Based on a patch by the same name from the Xilinx vendor tree.
> >>
> >> I am that Xilinx guy who sent this patch :). It is in the Xilinx tree as a
> >> temporary fix, and I did not debug further why the UDC works only when
> >> streaming is enabled.
> >> This is probably the right time to post my question here.
> >> I was expecting:
> >> Streaming disabled - both low bandwidth and high bandwidth systems
> >> should work fine
> >> Streaming enabled - only for high bandwidth systems
> >> but this is not the case; the Zynq UDC works only when streaming is enabled.
> >> Please correct me if I am wrong.
> >
> > You are right, stream mode disabled should work at any time.
> > It is strange that the Zynq UDC only works when stream mode is enabled.
>
> I am referring to section 8.5.2 in the Synopsys USB 2.0 HS controller
> Doc 2.20a; this is what it says about SDIS (the stream disable option):
>
> Before activating this mode, the user must check that the TX latency
> buffers per endpoint can accommodate at least one entire maximum-size
> packet. The RX buffer size must be at least double the TX buffer size
> per endpoint. To optimize stream disable performance, the system bus
> burst must be set as high as possible.
> When the stream disable mode is used, the burst size (VUSB_HS_RX_BURST
> and VUSB_HS_TX_BURST) must be an integer sub-multiple of the latency
> buffer size (VUSB_HS_RX_DEPTH for the RX buffer and VUSB_HS_TX_CHAN for
> the TX buffer). If this is not respected, the controller will not work
> properly in stream disable mode.
> The stream disable mode should only be used in situations where the
> available system bandwidth is low or the system bus access latency is
> high, in order to avoid underruns and overruns in the latency buffers.
> This works for all types of endpoints except ISO endpoints.
> Such a system cannot ensure the real-time support that ISO endpoints
> require, so ISO endpoints are not supported when the SDIS bit is set.
>
> We definitely need to root-cause why stream disable mode is not working
> for Zynq, but from the controller spec's point of view it is possible
> for the controller not to work properly in stream disable mode.
>
> Regards,
> Punnaiah
>

Maybe the burst size isn't set correctly by default? It does say the controller
won't work correctly with stream disable set and an invalid burst size. Looks
like TX and RX burst both default to 16, per the Zynq manual.
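
One quick way to confirm what the boot-time defaults really are is to dump
the BURSTSIZE register from userspace. A rough, untested sketch below; the
USB0 base of 0xE0002000, the 0x160 register offset, and the RXPBURST [7:0]
/ TXPBURST [16:8] field layout are my reading of the Zynq TRM, so please
treat them as assumptions and double-check:

/* Untested sketch: dump the chipidea BURSTSIZE register on Zynq from
 * userspace to confirm the boot-time RX/TX burst defaults.
 *
 * Assumed (check against the Zynq-7000 TRM): USB0 registers at physical
 * 0xE0002000, BURSTSIZE at offset 0x160, RXPBURST in bits [7:0] and
 * TXPBURST in bits [16:8].
 *
 * Build with -D_FILE_OFFSET_BITS=64 on 32-bit so the physical offset
 * fits in off_t; needs root and /dev/mem access (CONFIG_DEVMEM).
 */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
	int fd = open("/dev/mem", O_RDONLY | O_SYNC);
	if (fd < 0) {
		perror("open /dev/mem");
		return 1;
	}

	/* Map the 4 KiB page that holds the USB0 register block. */
	volatile uint32_t *regs = mmap(NULL, 0x1000, PROT_READ, MAP_SHARED,
				       fd, (off_t)0xE0002000u);
	if (regs == MAP_FAILED) {
		perror("mmap");
		return 1;
	}

	uint32_t burstsize = regs[0x160 / 4];
	printf("BURSTSIZE=0x%08x RXPBURST=%u TXPBURST=%u\n",
	       (unsigned)burstsize,
	       (unsigned)(burstsize & 0xff),
	       (unsigned)((burstsize >> 8) & 0x1ff));

	munmap((void *)regs, 0x1000);
	close(fd);
	return 0;
}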

With the stream disable bit set, the behavior we see on our hardware is
that priming just stops, with an outstanding transfer in memory still marked
active in the status field by the controller. This happens at random, even
when doing a single transfer at a time, e.g. with g_ether set to a queue
size of 1. With SDIS clear, everything works great. Given that the Zynq is
not bandwidth constrained, it seems like SDIS clear should be the default.
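
For reference, the change is roughly the following shape (a sketch, not the
literal patch): a Zynq-specific ci_hdrc_platform_data that simply omits
CI_HDRC_DISABLE_STREAMING so SDIS stays clear, presumably selected through
the OF match data in the usb2 glue driver:

#include <linux/usb/chipidea.h>

/* Sketch only: platform data for the Zynq UDC that leaves stream mode
 * enabled (SDIS clear) by not setting CI_HDRC_DISABLE_STREAMING, unlike
 * the driver's default platform data. DEF_CAPOFFSET comes from the
 * chipidea driver's internal ci.h.
 */
static const struct ci_hdrc_platform_data ci_zynq_pdata = {
	.capoffset	= DEF_CAPOFFSET,
};

The existing default platform data would keep CI_HDRC_DISABLE_STREAMING, so
only devices matching the Zynq compatible string pick this up.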

> >
> > Peter
> >>