marek@foundmoney.com wrote:
>
> I have a 3.4 GB file (on NT) that I need to import into a database on a
> unix machine, RH6.2. Knowing there's a limitation of 2 GB, how would
> you suggest I do this?
Use a kernel supporting "large files". Two possibilities are:

1. The 2.4 kernel.

2. A 2.2.x kernel with the LFS patch. http://www.scyld.com has a patch
   that applies successfully from 2.2.12 through 2.2.17. That patch
   introduced an error in fcntl() processing (I submitted a fix to Scyld,
   and I can provide it to you), but I don't know if they've applied it
   yet. The LFS patch itself is stable; I've run terabytes through it.
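
Note also that the kernel is only half of it: the programs touching the
file (mysqld, whatever does your import) have to be built against the LFS
interfaces in glibc as well. As a rough sketch (my own little test
program, nothing to do with the Scyld patch), something like the
following should tell you whether your particular kernel/glibc/filesystem
combination really takes a file past 2 GB. Build it with
-D_FILE_OFFSET_BITS=64 so the plain open()/lseek() calls use 64-bit
offsets; on a setup without LFS the seek or the write past 2 GB should
fail (typically EFBIG or EINVAL):

/* lfs-probe.c -- quick large-file check (my own example).
 * Build: gcc -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -o lfs-probe lfs-probe.c
 * Seeks ~3 GB into a scratch file and writes one byte; this only
 * succeeds if kernel, glibc and filesystem all handle large files.
 */
#include <stdio.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/types.h>

int main(void)
{
    const char *path = "lfs-probe.tmp";            /* scratch file */
    off_t target = (off_t)3 * 1024 * 1024 * 1024;  /* ~3 GB, past 2 GB */
    int fd = open(path, O_CREAT | O_WRONLY | O_TRUNC, 0644);

    if (fd < 0) {
        perror("open");
        return 1;
    }
    /* Seek past 2 GB; with a 64-bit off_t this is a legal offset. */
    if (lseek(fd, target, SEEK_SET) == (off_t)-1) {
        perror("lseek past 2GB");   /* EINVAL/EOVERFLOW: no LFS here */
        close(fd);
        unlink(path);
        return 1;
    }
    /* One byte here creates a sparse ~3 GB file on an LFS setup. */
    if (write(fd, "x", 1) != 1) {
        perror("write past 2GB");   /* typically EFBIG without LFS */
        close(fd);
        unlink(path);
        return 1;
    }
    printf("OK: file offsets past 2 GB work here.\n");
    close(fd);
    unlink(path);
    return 0;
}

If that prints OK, the 3.4 GB import should at least not run into the
filesystem limit.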
> Then can MySQL hold that much data with the file limitation (does it
> automatically split up files)?
I have *read* that MySQL 3.23 allows splitting large tables over
multiple files; the same source also claimed that "indexes don't work
using that mechanism." I don't know the details or the correctness of
this statement.
--
David N. Lombard
MSC.Software