I agree that 0 is a bad value. But so is infinity. There should be
some mixing, but not a lot. You say "kept to a minimum". Is that
actively done, or does it already happen by itself? Hopefully the
latter, which would be just splendid.
But would mapping a random 4K page out of a file then consume 64k?
That sounds like an awful lot of internal fragmentation. I hope the
unaligned bits and pieces get put into a slab or something, as you
described.
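Just to make the concern concrete, here is a rough userspace sketch of
what I mean (the file name is made up, and this only shows the rounding
behaviour of mmap, not your actual implementation):

/* Map a 4 KB slice of a file and note what the kernel has to back it
 * with.  On a 4 KB-page system this costs one page; with a 64 KB base
 * page the same request is rounded up to a full 64 KB page, i.e. ~16x
 * internal fragmentation for this access pattern. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void)
{
        long psize = sysconf(_SC_PAGESIZE);       /* base page size */
        int fd = open("some-big-file", O_RDONLY); /* hypothetical file */
        if (fd < 0)
                return 1;

        /* Ask for 4 KB at offset 0; the kernel rounds the mapping up to
         * the base page size, so the real cost is psize bytes of page
         * cache, not 4096. */
        size_t want = 4096;
        void *p = mmap(NULL, want, PROT_READ, MAP_PRIVATE, fd, 0);
        if (p == MAP_FAILED)
                return 1;

        printf("asked for %zu bytes, backed by at least one %ld-byte page\n",
               want, psize);

        munmap(p, want);
        close(fd);
        return 0;
}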
It is too bad that existing amd64 CPUs only allow such large physical
pages (2MB and 1GB). But it kind of makes sense to cut away a full
level of page tables for each next bigger size.
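If I have the arithmetic right, that is because each x86-64 page table
level resolves 9 bits, so skipping a level multiplies the page size by
512 (a back-of-the-envelope sketch, not anything from your patch):

#include <stdio.h>

int main(void)
{
        unsigned long page = 1UL << 12;    /* 4 KB base page */
        printf("base page:  %lu KB\n", page >> 10);

        page <<= 9;                        /* skip the lowest table level */
        printf("one level:  %lu MB\n", page >> 20);   /* 2 MB */

        page <<= 9;                        /* skip the next level too */
        printf("two levels: %lu GB\n", page >> 30);   /* 1 GB */

        /* 64 KB sits between 4 KB and 2 MB, so the hardware has no
         * page-table shortcut for it; it has to be done in software. */
        return 0;
}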
rtorrent, Xemacs/gnus, bash, xterm, zsh, make, gcc, galeon and the
like are what I usually run. I would mostly be concerned about how
rtorrent's totally random access of mmapped files negatively impacts
such a 64k page system.
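Roughly the access pattern I am worried about looks like this (file
name and sizes are made up, it is just an illustration, not rtorrent's
actual code): random 4 KB reads through mmap, where a 64 KB base page
would fault in 16x the data actually wanted on every touch.

#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>
#include <sys/stat.h>

int main(void)
{
        int fd = open("some-torrent-payload", O_RDONLY);  /* hypothetical */
        struct stat st;
        if (fd < 0 || fstat(fd, &st) < 0 || st.st_size < 4096)
                return 1;

        char *map = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (map == MAP_FAILED)
                return 1;

        /* Tell the kernel not to bother with readahead; it will not
         * shrink the page size, but at least it will not make it worse. */
        madvise(map, st.st_size, MADV_RANDOM);

        unsigned long sum = 0;
        for (int i = 0; i < 1000; i++) {
                /* pick a random 4 KB "piece" and touch one byte of it */
                off_t off = (rand() % (st.st_size / 4096)) * 4096;
                sum += map[off];   /* each touch faults in a whole base page */
        }
        printf("touched 1000 random pieces, checksum %lu\n", sum);

        munmap(map, st.st_size);
        close(fd);
        return 0;
}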