On Sat, Mar 11, 2017 at 07:31:18PM -0800, Steve Longerbeam wrote:
On 03/11/2017 10:59 AM, Russell King - ARM Linux wrote:
On Sat, Mar 11, 2017 at 10:54:55AM -0800, Steve Longerbeam wrote:
On 03/11/2017 10:45 AM, Russell King - ARM Linux wrote:
I really don't think expecting the user to understand and configure
the pipeline is a sane way forward. Think about it - should the
user need to know that, because they have a bayer-only CSI data
source, there is only one path possible, and that if they try to
configure a different path, things will just error out?
For the case of imx219 connected to iMX6, it really is as simple as
"there is only one possible path" and all the complexity of the media
interfaces/subdevs is completely unnecessary. Every other block in
the graph is just noise.
The fact is that these dot graphs show a complex picture, but reality
is somewhat different - there are only relatively few paths available
depending on the connected source and the rest of the paths are
completely useless.
I totally disagree there. Raw bayer requires passthrough yes, but for
all other media bus formats on a mipi csi-2 bus, and all other media
bus formats on 8-bit parallel buses, the conversion pipelines can be
used for scaling, CSC, rotation, and motion-compensated de-interlacing.
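For illustration, wiring a non-bayer source through one of those conversion
paths with media-ctl might look something like the sketch below. The entity
and pad names follow the imx-media driver under discussion but are purely
illustrative here - the actual names and pad numbers depend on the board,
the device tree, and the driver revision:

```shell
# Sketch only: route a CSI-2 YUV source through the IC pre-processor
# for scaling and colorspace conversion. Entity names and pad numbers
# are illustrative, not taken from a shipped driver.
media-ctl -l "'imx6-mipi-csi2':2 -> 'ipu1_csi0':0[1]"
media-ctl -l "'ipu1_csi0':1 -> 'ipu1_ic_prp':0[1]"
media-ctl -l "'ipu1_ic_prp':1 -> 'ipu1_ic_prpvf':0[1]"

# The IC can then downscale and convert, e.g. UYVY in, smaller RGB out.
media-ctl -V "'ipu1_ic_prpvf':0 [fmt:UYVY2X8/1920x1080]"
media-ctl -V "'ipu1_ic_prpvf':1 [fmt:RGB888_1X24/1280x720]"
```

Each such pipeline has to be linked and format-configured pad by pad before
any capture can start.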
... which only makes sense _if_ your source can produce those formats.
We don't actually disagree on that.
...and there are lots of those sources! You should try getting out of
your imx219 shell some time, and have a look around! :)
If you think that, you are insulting me. I've been thinking about this
from the "big picture" point of view. If you think I'm thinking about
this from only the bayer point of view, you're wrong.
Given what Mauro has said, I'm convinced that the media controller stuff
is a complete failure for usability, and adding further drivers using it
is a mistake.
I counter your accusation by saying that you are actually so focused on
the media controller way of doing things that you can't see the bigger
picture here.
So, tell me how the user can possibly use iMX6 video capture without
resorting to opening up a terminal and using media-ctl to manually
configure the pipeline. How is the user going to control the source
device without using media-ctl to find the subdev node, and then using
v4l2-ctl on it? How is the user supposed to know which /dev/video*
node they should be opening with their capture application?
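To make that concrete, a minimal manual session would look something like
the following. Every device path, entity name, format, and control here is
a hypothetical example - the point is the number of steps the user must get
right by hand, not the specific values:

```shell
# Step 1: find the media device and dump its topology to work out
# which entities exist and how they can be linked.
media-ctl -d /dev/media0 -p

# Step 2: manually create every link and set every pad format along
# the chosen path (names and formats are illustrative).
media-ctl -l "'imx6-mipi-csi2':2 -> 'ipu1_csi0':0[1]"
media-ctl -V "'ipu1_csi0':0 [fmt:UYVY2X8/1920x1080]"

# Step 3: locate the sensor's subdev node by hand, then set source
# controls on it with v4l2-ctl (control name is an example).
v4l2-ctl -d /dev/v4l-subdev5 --set-ctrl=test_pattern=0

# Step 4: only after all of the above can the user guess which
# /dev/video node their capture application should open.
v4l2-ctl -d /dev/video4 --stream-mmap --stream-count=10
```

None of this is discoverable from a plain video capture application; it all
has to happen in a terminal before the application is even started.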
If you can actually respond to the points that I've been raising about
end user usability, then we can have a discussion.