Re: GGI and cli/sti in X

Hasdi R Hashim (hasdi@umich.edu)
Mon, 30 Mar 1998 02:18:50 +0000 (/etc/localtime)


[Had problems sending this via dejanews, so I am resubmitting (and
re-editing) my original post manually]

In article <6fkbii$g3o$1@palladium.transmeta.com>,
torvalds@transmeta.com (Linus Torvalds) wrote:

> And people still wonder why I'm not too impressed with GGI? No, I'm not
> very impressed with people who think they have to rewrite the whole
> world in order to fix a few small problems.
[SNIP!]
> So please, people, consider just sending a bug-report to the XFree86
> team, and explaining the issue, and maybe even giving them a hint on how
> to fix it, for example. They may not listen to you, but _nobody_ will
> listen to you unless you can come up with better arguments.

Hello Linus,

How long has the GGI discussion been going on? Three or four years?
We should have resolved these 'small problems' by now. Frankly, I am
surprised we are still arguing about GGI vs. X with respect to security.
Security is not the main problem with X; the problems are performance and
resource requirements. Take a look at Netscape Communicator: it flies on
Windoze with 16 megs, but you need 48 megs to get close on Linux. Try to
imagine what Windoze could do with 48 megs. YMMV.

There are a lot of design issues here. GGI ain't gonna make XFree86
more portable or more efficient, and it ain't any more stable than X.
Heck, GGI is more difficult to debug unless you have a mechanism for
debugging the kernel while it is running. GGI is, at least, about
optimizing one class of applications: multimedia and games. If you want to
write Yet-Another-Doom-clone, you have two options: X or svgalib.

With X, running the game requires you to pay the overhead of
handling events and updating sprites over socket connections. You also
have to pay the high context-switching cost (between the game and the X
server) unless you perform buffering (which can be annoying on the user
side). I know it works and it is proven to be secure (you don't have to
worry about stupid privileges), but bear in mind that the more lines you
can draw and the more polygons you can render, the better the game will
turn out to be. Why run x-linux-doom with 32 megs when you can run
svga-linux-doom with 16 megs or the DOS version with 8 megs? And people
say Windows is resource hungry.
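
To make the overhead concrete, here is a rough Xlib sketch (untested,
and deliberately minimal) of a game-style loop. Every XNextEvent() read
and every XFillRectangle() sprite update is traffic on the client-server
socket:

    /* Compile with: cc xloop.c -lX11 */
    #include <X11/Xlib.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        Display *dpy;
        Window win;
        XEvent ev;
        int scr;

        dpy = XOpenDisplay(NULL);   /* this is a socket to the X server */
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            exit(1);
        }
        scr = DefaultScreen(dpy);
        win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0,
                                  320, 200, 0, BlackPixel(dpy, scr),
                                  BlackPixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        for (;;) {
            XNextEvent(dpy, &ev);   /* every event crosses the socket */
            if (ev.type == KeyPress)
                break;
            if (ev.type == Expose) {
                /* one protocol request per sprite update */
                XFillRectangle(dpy, win, DefaultGC(dpy, scr),
                               10, 10, 50, 50);
                XFlush(dpy);        /* force the requests out now */
            }
        }
        XCloseDisplay(dpy);
        return 0;
    }

Multiply that by a few hundred sprites per frame and the socket traffic
plus the context switches between the game and the server are where the
time goes.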

[MINOR POINT: There is another minor problem with running games
under X: you are stuck with whatever resolution you can get. GGI should
allow you to open another VT with a different resolution.]

With svgalib, you run into the security privileges problem. You
proposed the mapping technique. How do you make sure a user application is
accessing the right memory and I/O space? This varies between graphics
adapters. One way or another, this information has to be kept in the
kernel. The easiest way around the problem is just to make the program
setuid root and let it access the hardware directly, like in DOS. If that
were the case, why would you need Linux to run games in the first place?
We might as well stick with DOS+DPMI.
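
For anyone who hasn't looked at how an svgalib-style program gets at
the hardware, it is roughly the sketch below (simplified; a real library
also does mode setting, bank switching, and chipset-specific quirks).
Both privileged calls fail unless you are root, which is exactly why
these programs end up setuid root:

    #include <stdio.h>
    #include <stdlib.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/io.h>
    #include <sys/mman.h>

    int main(void)
    {
        unsigned char *fb;
        int fd;

        if (ioperm(0x3c0, 0x20, 1) < 0) {   /* VGA register ports: root only */
            perror("ioperm");
            exit(1);
        }
        fd = open("/dev/mem", O_RDWR);      /* all of physical memory: root only */
        if (fd < 0) {
            perror("/dev/mem");
            exit(1);
        }
        fb = mmap(0, 0x10000, PROT_READ | PROT_WRITE,
                  MAP_SHARED, fd, 0xA0000); /* legacy VGA window */
        if (fb == MAP_FAILED) {
            perror("mmap");
            exit(1);
        }
        fb[0] = 15;     /* poke a pixel; the kernel checks nothing */
        munmap(fb, 0x10000);
        close(fd);
        return 0;
    }

Nothing stops a buggy or malicious setuid-root program from mapping
something other than 0xA0000, and that is the whole problem.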

What I propose is writing a device driver that a user application
can open; the driver will make sure it maps the correct video memory and
accesses the chips correctly and in the proper sequence. I believe this is
what GGI is all about. Alternatively, you could extend the capabilities of
/dev/console.
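
Roughly, the user-space side would look like the sketch below.
Everything here is made up to illustrate the idea: the /dev/graphics
name, the GFX_SET_MODE ioctl, and the mode struct are all hypothetical.

    #include <stdio.h>
    #include <stdlib.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>

    struct gfx_mode {               /* hypothetical */
        int width, height, depth;
    };

    #define GFX_SET_MODE 0x4701     /* hypothetical ioctl number */

    int main(void)
    {
        struct gfx_mode mode = { 640, 480, 8 };
        unsigned char *fb;
        int fd;

        fd = open("/dev/graphics", O_RDWR); /* no setuid root needed */
        if (fd < 0) {
            perror("/dev/graphics");
            exit(1);
        }
        /* the kernel does the chip poking, in the proper sequence */
        if (ioctl(fd, GFX_SET_MODE, &mode) < 0) {
            perror("GFX_SET_MODE");
            exit(1);
        }
        /* the driver maps only the video aperture it knows is safe */
        fb = mmap(0, mode.width * mode.height, PROT_READ | PROT_WRITE,
                  MAP_SHARED, fd, 0);
        if (fb == MAP_FAILED) {
            perror("mmap");
            exit(1);
        }
        fb[0] = 15;                 /* draw directly, at memory speed */
        munmap(fb, mode.width * mode.height);
        close(fd);
        return 0;
    }

The point is the division of labor: the application asks for a mode, the
kernel (which already knows the adapter) validates the request and hands
back a mapping of the right video memory, and nothing else.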

I am not saying GGI is the solution or that it is any better than
X. The key thing here is better graphics performance with lower resource
requirements. In this respect, I am sorry to say, Windoze beats Linux
hands down. And that is even though Windows forces all of its applications
to access the graphics hardware through a secured interface (system calls)
rather than writing directly to 0xA000.

As the creator and maintainer of Linux, you have to make certain
design decisions. You made Linux heavily monolithic but curiously refuse
to put full graphics support into kernel space (of all the hardware you
could have excluded). Most games and multimedia applications are video
intensive. That's why VLB and PCI were created. That's why special memory
chips were made just for video (VRAM). That's why Intel created AGP and
Microsoft slapped the GDI onto its NT "microkernel" (making it 10MB). Why
do you think they have hardware accelerators? From the looks of it, the
only person who is not kissing the graphics adapter's butt is you.

Linux is your baby, Linus. You have to make the design decision.
GGI or no GGI, until Linux has better graphics support, I doubt
multimedia/game apps will run better on Linux than on 95/NT on a similarly
configured machine. If you don't see the need to improve things for this
class of applications, that is up to you, but you are severely limiting
the scope of Linux users. Upgrading to a faster, mo-pawahful machine is
not the solution. For most people, the better solution is just to use a
'better' operating system like 95/NT.

Later

Hasdi
http://www-personal.umich.edu/~hasdi
