And what I'm wondering is: whatever order the steps occur in, are they
grouped together so that the time between the first and last step for a
given file is minimized? Wouldn't that produce the most consistency in
the face of failure (lacking a journal)?
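For concreteness, here's the kind of grouping and ordering I mean, sketched
as a user-space analogue in Python (this is just my own illustration of the
idea, assuming POSIX semantics -- `atomic_write` is a hypothetical helper,
not anything the kernel actually does internally):

```python
import os

def atomic_write(path, data):
    """Group one file's update steps closely together, ordered so that
    a crash at any point leaves either the old or the new contents."""
    tmp = path + ".tmp"
    fd = os.open(tmp, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        os.write(fd, data)   # 1. data blocks first
        os.fsync(fd)         # 2. flush data before any metadata points at it
    finally:
        os.close(fd)
    os.rename(tmp, path)     # 3. single atomic metadata step
    # 4. persist the directory entry itself
    dfd = os.open(os.path.dirname(path) or ".", os.O_RDONLY)
    try:
        os.fsync(dfd)
    finally:
        os.close(dfd)
```

The point being: if the data write, its flush, and the metadata switch
happen close together and in that order, the window in which a crash can
leave metadata referencing garbage is as small as it can get without a
journal.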
If anybody wants to flame me for saying something obvious or stupid,
please go ahead, but add some constructive explanations too. I'm trying to
grok the fs stuff, but I'm just beginning.
Darrin
--
-----------------------------------------------------------------------------
boo@stilyagin.com | PGP key at http://stilyagin.com/~boo/pgp-pub.txt
-----------------------------------------------------------------------------
All seems condemned in the long run to approximate a state akin to
Gaussian noise. -- James Martin
-----------------------------------------------------------------------------
-
To unsubscribe from this list: send the line "unsubscribe linux-kernel" in
the body of a message to majordomo@vger.rutgers.edu
Please read the FAQ at http://www.tux.org/lkml/