I'm attempting to train CRF++ on a large dataset (>10M words), and it dies every time its memory usage exceeds 4 GB. It is compiled and running on a Solaris 5.9 system with 24 GB of RAM, yet each crf_learn process appears to be capped at 4 GB. Is this a Solaris limitation, or a pointer-width/int-overflow issue in CRF++? As soon as the process reaches 4 GB of memory, the program terminates with the single word "Abort."
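One thing I'm wondering: if crf_learn was built as a 32-bit binary, its address space would be capped at 4 GB no matter how much physical RAM the machine has. Here's a minimal sketch I could compile alongside it (with the same compiler and flags) to check the pointer width of the build; the -m64 flag mentioned in the comment is the GCC spelling and may differ under Sun Studio:

#include <cstdio>
#include <cstddef>

int main() {
    // A 32-bit build has 4-byte pointers and size_t, which limits the
    // process to a 4 GB address space; an 8-byte result would suggest
    // the abort comes from somewhere else (e.g. an internal int overflow).
    // Recompiling CRF++ with -m64 (GCC) should lift a 32-bit cap.
    std::printf("sizeof(void*)  = %lu bytes\n",
                (unsigned long) sizeof(void*));
    std::printf("sizeof(size_t) = %lu bytes\n",
                (unsigned long) sizeof(std::size_t));
    return 0;
}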