On Tue, Oct 16, 2007 at 02:50:33AM +0200, Nick Piggin wrote:
> On Mon, Oct 15, 2007 at 11:10:00AM +0200, Jarek Poplawski wrote:
...
> > As a matter of fact, it's not natural for me at all. I expected the
> > other direction, and I still doubt programmers' intentions can be
> > "automatically" predicted well enough, so IMHO it won't last for long.
> Really? Consider the consequences if, instead of releasing this latest
> document tightening consistency, Intel found that out of order loads
> were worth 5% more performance and implemented them in their next chip.
> The chip could be completely backwards compatible, but all your old code
> would break, because it was broken to begin with (because it was outside
> the spec).
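
(To make that failure mode concrete, here is a minimal sketch of my own,
with made-up names and modern C atomics notation, not code from this
thread: a flag/data handoff whose reader leaves out the read-side
ordering and is correct today only because current chips don't reorder
loads.)

#include <stdatomic.h>

static int data;                /* plain payload */
static atomic_int flag;         /* "payload is ready" flag */

void writer(void)
{
        data = 42;
        /* release store: the payload is ordered before the flag */
        atomic_store_explicit(&flag, 1, memory_order_release);
}

int reader(void)
{
        /*
         * Relaxed load and no read barrier, i.e. relying on more than
         * the spec promises.  A future, fully backwards-compatible CPU
         * that reordered loads could see flag == 1 here and still
         * return a stale data value below.
         */
        if (atomic_load_explicit(&flag, memory_order_relaxed))
                return data;
        return -1;
}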
I have a different opinion on this: I expect any spec to describe the
current implementation. Before new models are issued, any changes to the
implementation should be made public with a proper margin of time. Then
the system could be adjusted optimally to the real hardware, instead of
to hardware that is only planned and possibly never realized (besides,
doing such not-actually-needed things with the old means is usually more
costly: lock vs. lfence). There is still the problem of the specs'
completeness: there are probably often some things left unspecified
which could break on a new model, so there is never a 100% guarantee
anyway.
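
(For the lock vs. lfence remark, a standalone sketch, mine and not the
kernel's actual macros, of two ways a read barrier can be expressed on
32-bit x86; IIRC the kernel chooses between something like these at boot
via alternative() patching keyed on CPU features, so what the spec lets
you assume directly decides which cost everyone pays.)

static inline void rmb_locked_add(void)
{
        /* dummy locked RMW on the stack: works on every x86, relatively costly */
        asm volatile("lock; addl $0,0(%%esp)" : : : "memory");
}

static inline void rmb_lfence(void)
{
        /* dedicated load fence: cheaper, but needs an SSE2-capable part */
        asm volatile("lfence" : : : "memory");
}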
> IMO Intel did exactly the right thing from an engineering perspective,
> and so did Linux to always follow the spec.
But if you follow the spec, you don't follow the spec! Why do you so
readily ignore this part of Intel's spec:
"This document contains information which Intel may change at any
time without notice. Do not finalize a design with this information."
Maybe that is Intel's real intention, and not something for the lawyers
only? (Btw, it seems we have an example.)