I have a few questions. In a number of threads I keep seeing references to problems with (and tricks for) getting dictionaries larger than 256 MB, and this on machines with GBs of memory.
What am I missing? Do the authors leave out the size of the support structures? If so, why? Or do you need to keep multiple copies of the dictionary, e.g. when copying and reordering the contents of one dictionary into a revised version?
I have to ask, because on my 1 GB machine I can easily allocate over 760 MB in a single block (not under Windows, mind you).
Is this a limit in Windows, or is there something else I am missing?
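For reference, something along these lines is what I mean by a single-block allocation (a minimal sketch only; binary-searching malloc() with a 2 GB cap is just one way to probe it):

```c
/* Probe for the largest contiguous block malloc() will grant.
 * The 2 GB upper bound and 1 MB resolution are arbitrary choices
 * for illustration, not figures from any particular program. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t lo = 0;
    size_t hi = (size_t)2048 * 1024 * 1024;   /* 2 GB cap for a 32-bit process */

    while (hi - lo > 1024 * 1024) {           /* stop at 1 MB resolution */
        size_t mid = lo + (hi - lo) / 2;
        void *p = malloc(mid);
        if (p) {
            free(p);
            lo = mid;                         /* mid fits; try something larger */
        } else {
            hi = mid;                         /* mid too big; try something smaller */
        }
    }
    printf("Largest single block: ~%zu MB\n", lo / (1024 * 1024));
    return 0;
}
```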
PS. Since compression seems to improve with dictionary size, has anyone tried using a very large virtual address space or memory-mapped files to test compression for large-memory systems that aren't available yet?
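To be concrete, by memory-mapped files I mean something like this POSIX sketch (the file name is made up, and the match-finder step is only a placeholder):

```c
/* Map a large file so it can be scanned like an in-RAM dictionary,
 * letting the OS page it in and out on demand. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    const char *path = "dictionary.bin";      /* hypothetical dictionary file */
    int fd = open(path, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) != 0) { perror("fstat"); return 1; }

    /* Map the whole file read-only; the kernel pages it in on demand,
     * so the "dictionary" can exceed physical RAM. */
    unsigned char *dict = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (dict == MAP_FAILED) { perror("mmap"); return 1; }

    /* ... run the match finder / model over dict[0 .. st.st_size - 1] ... */
    printf("Mapped %lld bytes at %p\n", (long long)st.st_size, (void *)dict);

    munmap(dict, st.st_size);
    close(fd);
    return 0;
}
```

Random access into such a mapping would of course be slow once it exceeds RAM, but for estimating how compression scales with dictionary size the throughput hit might be acceptable.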