I'm working on some large files, and I want to do this the fastest way possible.
First off, this is a 32-bit app meant to run on older XP computers as well as new Windows 10 computers.
After experimenting, the fastest way is, of course, to use available physical memory and not let the system fall back on the page file.
So, I want to set aside the largest block of memory I can use. Using HeapAlloc, the largest block I can allocate is around 1.8 GB of memory.
This is not a problem on later computers; however, I also tested on my oldest, slowest Windows XP laptop with only 400 MB of physical memory.
For a test I started with a request for 2 GB and decreased it until HeapAlloc stopped failing.
On the old laptop, it happily allocated 1.8 GB of memory, even though the machine only has 400 MB, and it all seems accessible and usable (though unacceptably slow). Obviously it is allocating in the page file or elsewhere.
So, the question is, how can I assure that HeapAlloc will only allocate from actual physical memory?
Using GlobalMemoryStatusEx gives me a clue, yet it still feels like it starts using the page file again as I approach that limit.
Is there some other method of determining the actual maximum available/usable physical memory for HeapAlloc?
If it's for your personal use, just disable the page file and try again.
An interesting idea that didn't occur to me :)
It is meant to be eventually used by others however.
Thanks JJ. That worked as expected: it only allocated 337 MB for the block. But it's not a viable solution in the long run.
I can't find anywhere in the documentation for HeapAlloc that says it will allocate in the page file, nor how to stop it from doing so.
I guess I will just have to trust GlobalMemoryStatusEx and only try to get a couple of allocation units less than ullAvailPhys.
Have a serious look at VirtualLock :cool:
Jim,
You are doing it the hard way with so little memory. Have you had a look in the guts of the laptop to see if it will take more memory? If it's an old timer, you may be able to find another for peanuts and use some of its memory if they are the same model.
JJ- Thanks, that looks very promising. I knew someone had a good idea to use :)
Hutch-
I dug this old laptop up just to be able to test with an older, underpowered machine. I think I should be able to make the program sing even on lowly stuff like this. It's just a matter of breaking it into smaller parts and keeping from using slow physical disk writes. I am just having a hard time keeping everything constrained properly.
Jim,
You can also try GlobalAlloc() with the GMEM_FIXED flag to see if that does the job. Use GlobalMemoryStatusEx() to get an idea of how much memory is available, and allocate less than that. You can still do a lot of work with 200 meg of memory, and if you are working on data that is larger, you can work out a way to stream the data so you are not using all of the available memory at once.
Quote from: hutch-- on April 28, 2020, 03:38:07 PM
.. you can work out a way to stream the data so you are not using all of the available memory at once.
When you CreateFile, there's a whole section on the FILE_FLAG_OVERLAPPED flag and synchronising reads/writes.
Requires you to manage everything though.
could you try putting it in the data area [it'll take ages to compile, based on my experiences of doing this], preferably [under] the main window handle
Quote from: mikeburr on January 21, 2021, 05:23:26 AM
could you try putting it in the data area [it'll take ages to compile, based on my experiences of doing this], preferably [under] the main window handle
A known MASM bug. Try this workaround:
include \masm32\include\masm32rt.inc ; plain Masm32 for the fans of pure assembler
.data?
FatBuffer LABEL byte
ORG $+1677721600-1 ; reserve 1600*1024*1024 bytes (1600 MB) in .data?
db ?
FatBufferEnd LABEL byte
.code
start: mov edi, offset FatBuffer
mov ecx, offset FatBufferEnd - offset FatBuffer
push ecx
sar ecx, 20 ; bytes->megabytes
print str$(ecx), " MB reserved", 13, 10
pop ecx
mov al, "x"
rep stosb
MsgBox 0, "Debug?", "Test:", MB_YESNO
.if eax==IDYES
INT 3 ; have a look at the buffer in the debugger
mov eax, offset FatBuffer
mov edx, offset FatBufferEnd-64
.endif
exit
end start
Output:
1600 MB reserved (i.e. around 1.6GB - may fail on machines with low RAM)
Is it just better to use
Quote
BOOL GlobalMemoryStatusEx(
LPMEMORYSTATUSEX lpBuffer
);
typedef struct _MEMORYSTATUSEX {
DWORD dwLength;
DWORD dwMemoryLoad;
DWORDLONG ullTotalPhys;
DWORDLONG ullAvailPhys;
DWORDLONG ullTotalPageFile;
DWORDLONG ullAvailPageFile;
DWORDLONG ullTotalVirtual;
DWORDLONG ullAvailVirtual;
DWORDLONG ullAvailExtendedVirtual;
} MEMORYSTATUSEX, *LPMEMORYSTATUSEX;
and then use HeapAlloc or VirtualAlloc.
There is another approach worth trying, but it's no easier than any of the previous suggestions: create a memory-mapped file with a sliding window over the data. The OS provides real memory for the sliding-window size, and usually the OS is more efficient than doing it manually.
You could just open the file, do enough read-ahead, and do the window slide yourself. Depending on what you need to do with the data, you would set whatever size window you need based on what available memory you have. With 400 meg available, you could probably set a window of 100 meg without running out of memory for other things.
The GlobalMemoryStatusEx() call that Timo suggested is fast enough to be called regularly to ensure you are not running out of memory.
Yes, I ended up just using GlobalMemoryStatusEx. I used either 2 GB if available, or whatever was returned in .ullAvailPhys, and it all worked out fine. Thanks everyone.