[Phonopy-users] phono3py memory
From: Tomas K. <tom...@st...> - 2018-08-29 10:11:25
Dear all,

I am currently trying to calculate the lattice thermal conductivity for a rather big system (106 atoms in the primitive unit cell) in a 2x2x2 supercell (848 atoms in total). I successfully calculated all 19371 displacements and was able to generate the FORCE_SETS afterwards. However, when I try to calculate the FORCE_CONSTANTS from that (phono3py -v --dim="2 2 2" --sym-fc -c POSCAR), I always end up with errors.

On our normal 64 GB nodes I receive a memory error, as expected. I therefore tried to run the calculation of the third-order force constants on a 1.5 TB node, but at some point a segmentation fault of less obvious origin occurs (see attachment). While monitoring the run on the 1.5 TB node, I observed that the memory actually used stays well below the node's maximum, so I assume the remaining error is not caused by insufficient computational infrastructure.

I wonder, however, whether there are "internal" limitations in the code (maybe in numpy's pinv function) that prevent me from carrying out the calculation of the third-order force constants. Maybe someone of you has experience with the memory consumption of phono3py for larger systems and can share it with me. Alternatively, any way to find out what else could be the problem would be very helpful. Reducing the system's size would only be an unfavourable solution.

Thanks in advance!

Best regards,
Tomas
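[Editor's note: a back-of-envelope sketch of the memory question raised above. It assumes, hedged, that the third-order force constants (fc3) are held as a dense double-precision array of shape (natom, natom, natom, 3, 3, 3); the supercell size of 848 atoms is taken from the message. This is an illustration of the scaling, not a statement of phono3py's exact internal storage.]

```python
# Estimate the memory needed just to hold a dense fc3 array for the
# 848-atom supercell described in the message above.
# Assumed layout: (natom, natom, natom, 3, 3, 3) in float64.

natom = 848                       # atoms in the 2x2x2 supercell
elements = natom**3 * 3**3        # number of fc3 tensor elements
bytes_fc3 = elements * 8          # float64 = 8 bytes per element

print(f"dense fc3 array alone: {bytes_fc3 / 1e9:.1f} GB")
# -> roughly 131.7 GB, i.e. more than a 64 GB node can hold even
#    before any temporary copies made during symmetrization (--sym-fc)
#    or the pseudo-inverse are counted.
```

Under this assumption the dense fc3 alone already exceeds 64 GB by a factor of two, which is consistent with the out-of-memory error on the smaller nodes; the segmentation fault on the 1.5 TB node would then have a different cause.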