From: Raymond T. <toy...@gm...> - 2023-05-22 02:00:35
On 5/21/23 14:07, Henry Baker wrote:
> Hi all:
>
> I want to be able to test some extremely large Maxima calculations,
> so I've been playing with my computer to see how large a memory
> space a program could utilize before it started paging itself to death.
>
> I just ran some tests on my current laptop with 16 GB of RAM running
> WSL2 (Windows Subsystem for Linux) on top of Windows 11.
>
> My tests involved little more than a "Hello World" C program which also
> did a large malloc and then initialized the data.
>
> Actually, I used 'aligned_alloc' instead of 'malloc', but I expect that they
> both do roughly the same things.
>
> I was able to allocate a single object up to 2^32 *bytes* long (i.e., 4GB),
> and this object initialized properly and quite quickly (i.e., no paging).
>
> However, when I tried 2^33 bytes long (i.e., 8GB), the program works, but it
> pages like crazy. But I have 10GB of RAM free, so there should be plenty
> of room for an 8GB object completely in RAM in a trivial program like this.
>
> I checked 'prlimit -l', and both the hard & soft limits seem to allow 131072
> pages of 'memlock'ed (non-pageable) memory.
>
> However 'ulimit -l' reports a limit of only 65536 pages of memlock'ed
> memory. Even after fiddling with /etc/security/limits.conf, I still couldn't
> increase the amount that 'ulimit -l' returns.
>
> I'm currently not running a pure Linux on this machine, so I can't tell if
> the problem is Linux (Ubuntu) itself, or whether the problem is WSL.
>
> Any thoughts ?

I don't really know, but if you send me the code, I can run it on a Linux
system with 16 GB or 32 GB of memory if you want to see if there's a
difference.