[PATCH 02 of 36] x86: add memory clobber to save/loadsegment

From: Jeremy Fitzhardinge
Date: Wed Jun 25 2008 - 00:37:27 EST


Add "memory" clobbers to savesegment and loadsegment, since they can
affect memory accesses and we never want the compiler to reorder them
with respect to memory references.
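
To illustrate (a standalone, compile-only sketch, not part of this
patch): without a "memory" clobber, GCC assumes the asm neither reads
nor writes memory and is free to move surrounding loads and stores
across it. The helper names below are hypothetical, and actually
reloading %fs with an arbitrary selector would fault at runtime:

	/* Standalone sketch, not kernel code: hypothetical stand-ins
	 * for loadsegment(). */
	static int shadow;	/* stands in for some per-thread state */

	static inline void loadseg_unordered(unsigned short sel)
	{
		/* No "memory" clobber: as far as GCC knows this asm
		 * touches no memory, so loads and stores around it may
		 * be reordered across it. */
		asm volatile("movw %0, %%fs" : : "r" (sel));
	}

	static inline void loadseg_ordered(unsigned short sel)
	{
		/* The "memory" clobber makes the asm a compiler-level
		 * memory barrier: accesses before it are not sunk below
		 * it, and accesses after it are not hoisted above it. */
		asm volatile("movw %0, %%fs" : : "r" (sel) : "memory");
	}

	void switch_seg(unsigned short sel)
	{
		shadow = 1;		/* must complete before the reload... */
		loadseg_ordered(sel);	/* ...which the clobber guarantees */
	}

Note that "memory" only constrains the compiler; it emits no fence
instruction, which is sufficient here since the concern is compile-time
reordering.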

Signed-off-by: Jeremy Fitzhardinge <jeremy.fitzhardinge@xxxxxxxxxx>
---
include/asm-x86/system.h | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/include/asm-x86/system.h b/include/asm-x86/system.h
--- a/include/asm-x86/system.h
+++ b/include/asm-x86/system.h
@@ -157,14 +157,14 @@
"jmp 2b\n" \
".previous\n" \
_ASM_EXTABLE(1b,3b) \
- : :"r" (value), "r" (0))
+ : :"r" (value), "r" (0) : "memory")


/*
* Save a segment register away
*/
#define savesegment(seg, value) \
- asm volatile("mov %%" #seg ",%0":"=rm" (value))
+ asm("mov %%" #seg ",%0":"=rm" (value) : : "memory")

static inline unsigned long get_limit(unsigned long segment)
{

