xf86-video-armada via UDL [was: Re: UDL's fbdev doesn't work for user-space apps]
From: Alexey Brodkin
Date: Mon Dec 04 2017 - 08:17:01 EST
Hi Jose,
On Mon, 2017-12-04 at 11:50 +0000, Jose Abreu wrote:
> Hi Alexey,
>
> On 04-12-2017 11:32, Alexey Brodkin wrote:
> >
> > My first [probably incorrect] assumption is Xserver requires fbdev (/dev/fbX)
> > and it cannot use DRI video card natively. Is that correct?
> >
> >
>
> Xserver can use DRI directly; you need to enable the modesetting
> driver in the Xorg config or use the designated driver for your card
> (if there is any).
OK, that makes sense. I didn't think of the generic modesetting driver for Xserver.
And that indeed works. This is my xorg.conf:
----------------------->8----------------------
# cat /etc/X11/xorg.conf
Section "Device"
        Identifier      "Driver0"
        Screen          0
        Driver          "modesetting"
        Option          "kmsdev" "/dev/dri/card1"
EndSection
----------------------->8----------------------
I do see xclock is rendered fine.
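For reference, this is how I double-check that /dev/dri/card1 really is the UDL
node (a quick standalone libdrm check, nothing UDL-specific about it):
----------------------->8----------------------
/* Print the DRM driver name behind a given card node, e.g. to confirm
 * that /dev/dri/card1 is "udl". Build against libdrm:
 *   gcc drm_name.c -o drm_name $(pkg-config --cflags --libs libdrm) */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>

int main(int argc, char **argv)
{
        const char *path = argc > 1 ? argv[1] : "/dev/dri/card1";
        int fd = open(path, O_RDWR);
        drmVersionPtr v;

        if (fd < 0) {
                perror(path);
                return 1;
        }

        v = drmGetVersion(fd);
        if (v) {
                printf("%s: %s (%s)\n", path, v->name, v->desc);
                drmFreeVersion(v);
        }

        close(fd);
        return 0;
}
----------------------->8----------------------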
Now I guess this is getting closer to what I really need :)
In the end I wanted to get 3D rendered by the Vivante GPU
displayed on UDL. My assumption was very simple: if the IMX-DRM + Etnaviv
combo works fine, it should be straightforward to swap the IMX-DRM bitstreamer
for UDL and we're golden.
That might be more a question to Lucas now.
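To be explicit about what I mean by swapping the bitstreamer: Etnaviv would keep
doing the rendering, and the resulting buffers would have to reach the UDL device
somehow, e.g. via PRIME/dma-buf sharing. At the libdrm level I picture roughly the
following (just a sketch, not armada code; render_fd, udl_fd and gpu_handle are
placeholders, and I'm assuming udl accepts imported dma-bufs):
----------------------->8----------------------
/* Sketch only: export a GPU-rendered GEM buffer from the render device
 * (e.g. etnaviv) as a dma-buf and import it on the UDL side, where it
 * could then be wrapped into a KMS framebuffer with drmModeAddFB2(). */
#include <fcntl.h>
#include <unistd.h>
#include <stdint.h>
#include <xf86drm.h>

static int share_gpu_buffer(int render_fd, int udl_fd,
                            uint32_t gpu_handle, uint32_t *udl_handle)
{
        int prime_fd, ret;

        /* Export the GPU buffer as a dma-buf file descriptor. */
        ret = drmPrimeHandleToFD(render_fd, gpu_handle, DRM_CLOEXEC, &prime_fd);
        if (ret)
                return ret;

        /* Import it into the UDL device as a local GEM handle. */
        ret = drmPrimeFDToHandle(udl_fd, prime_fd, udl_handle);
        close(prime_fd);
        return ret;
}
----------------------->8----------------------
That is the kind of cross-device path I assumed would "just work" once the
armada driver opens udl instead of imx-drm.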
I use xorg.conf as found here:
http://git.arm.linux.org.uk/cgit/xf86-video-armada.git/tree/conf/xorg-sample.conf?h=unstable-devel
That's what it has:
----------------------->8----------------------
Section "Device"
Identifier "Driver0"
Screen 0
Driver "armada"
# Support hotplugging displays?
# Option "Hotplug" "TRUE"
# Support hardware cursor if available?
# Option "HWCursor" "TRUE"
# Use GPU acceleration?
# Option "UseGPU" "TRUE"
# Provide Xv interfaces?
# Option "XvAccel" "TRUE"
# Prefer overlay for Xv (TRUE for armada-drm, FALSE for imx-drm)
# Option "XvPreferOverlay" "TRUE"
# Which accelerator module to load (automatically found if commented out)
# Option "AccelModule" "etnadrm_gpu"
# Option "AccelModule" "etnaviv_gpu"
# Support DRI2 interfaces?
# Option "DRI" "TRUE"
EndSection
----------------------->8----------------------
Indeed, I uncommented all the lines and that allows me to see,
for example, glmark2-es2 working on the Wandboard (that's exactly where
the "imx-drm + etnaviv" combo is used).
But if I swap "imx-drm" for "udl" I don't see anything on my screen
(connected via UDL), even though Xserver really does seem to start and claim the screen
(I see it become black, effectively overriding whatever was there before) and
the glmark benchmark prints its results.
Maybe I'm missing some additional glue for UDL in "xf86-video-armada" beyond this simple change:
----------------------->8----------------------
--- a/src/armada_module.c
+++ b/src/armada_module.c
@@ -26,7 +26,7 @@
 #define ARMADA_NAME            "armada"
 #define ARMADA_DRIVER_NAME     "armada"
 
-#define DRM_MODULE_NAMES       "armada-drm", "imx-drm"
+#define DRM_MODULE_NAMES       "armada-drm", "imx-drm", "udl"
 #define DRM_DEFAULT_BUS_ID     NULL
 
 static const char *drm_module_names[] = { DRM_MODULE_NAMES };
@@ -43,6 +43,11 @@ static SymTabRec ipu_chipsets[] = {
        { -1, NULL }
 };
 
+static SymTabRec udl_chipsets[] = {
+       {  0, "UDL" },
+       { -1, NULL }
+};
+
 static const OptionInfoRec * const options[] = {
        armada_drm_options,
        common_drm_options,
@@ -115,6 +120,8 @@ static void armada_identify(int flags)
                          armada_chipsets);
        xf86PrintChipsets(ARMADA_NAME, "Support for Freescale IPU",
                          ipu_chipsets);
+       xf86PrintChipsets(ARMADA_NAME, "Support DisplayLink USB2.0",
+                         udl_chipsets);
 }
----------------------->8----------------------
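One more bit of glue I wonder about (purely a guess on my part): as far as I
understand, USB displays like UDL typically only push updated regions out over
the USB link when damage is reported to the kernel via DRM_IOCTL_MODE_DIRTYFB
(or on a page flip), while an imx-drm scanout pipe never needs that. A
libdrm-level sketch of such a flush (hypothetical helper, not existing armada
code) would be:
----------------------->8----------------------
/* Hypothetical helper: report the whole framebuffer as dirty so a USB
 * display driver such as udl copies it out over USB. Build against libdrm. */
#include <stdint.h>
#include <xf86drmMode.h>

static int flush_whole_fb(int drm_fd, uint32_t fb_id,
                          uint16_t width, uint16_t height)
{
        drmModeClip clip = {
                .x1 = 0,
                .y1 = 0,
                .x2 = width,
                .y2 = height,
        };

        /* Tell the kernel driver which region of fb_id has changed. */
        return drmModeDirtyFB(drm_fd, fb_id, &clip, 1);
}
----------------------->8----------------------
If nothing in the armada/etnaviv path ever reports damage on the UDL
framebuffer, that would at least be consistent with what I see: the screen gets
taken over (goes black) but later rendering never shows up. Again, that's pure
speculation on my side.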
-Alexey