I'm getting the error below when I try to run the nuclear_decay example.
./compute_decayed_abundances ../nucnet-tools-code/data_pub/s25a28d_expl.xml 10000. "" s25a28d_expl_decayed.xml
XPath error : Memory allocation failed : growing nodeset hit limit
growing nodeset hit limit
^
XPath error : Invalid expression
ERROR: Invalid xpath expression in
../nucnet-tools-code/vendor/libnucnet/0.32/src/Libnucnet__Reac.c on line 192
I think this is a problem with libxml2 being unable to process the file due to insufficient size. I don't know whether the limit comes from the size allocated by libxml2 itself or from RAM limitations, but I was wondering (a) whether you know which case it is and (b) whether there is a simple way of fixing it.
Ubuntu 16.04. Tried on my laptop:

free -g
              total   used   free   shared   buff/cache   available
Mem:              3      2      0        0            0           0
Swap:             3      2      1

and the processing machine that I actually want to use:

free -g
              total   used   free   shared   buff/cache   available
Mem:             46     15     20        0           10          29
Swap:            14      8      5
Thanks in advance for your help.
Dr Philip Adsley MA MSci (Cantab)
Postdoctoral Research Fellow, CNRS, Orsay
Dear Dr. Adsley,
Thanks for your post. I'll be looking into it over the next few days. Best wishes.
Hello,
The following failed for me on a 2Gb machine
./compute_decayed_abundances ../nucnet-tools-code/data_pub/s25a28d_expl.xml 10000. "" s25a28d_expl_decayed.xml
because of the memory limit. It succeeded on a 3Gb machine. This obviously needs some work. I hate to ask you to do this, but can you find a larger machine to run on for now? We hope to work on improving these decay codes soon. Best wishes.
Wilco.
I've asked our IT department whether there's another machine that I can try running it on. We're also going to look into recompiling the libxml2 libraries because (apparently, saith The Google) this can be due to a limitation on how much XML data libxml2 can read in at a time. I'll let you know if I turn anything up. Thanks very much for testing this and getting back to me so promptly.
Dr Philip Adsley MA MSci (Cantab)
Postdoctoral Research Fellow, CNRS, Orsay
On Sat, 6 Oct 2018 at 21:11, Bradley S. Meyer <mbradle@users.sourceforge.net> wrote:
2022-04-20
This is Phil again, only a few years later.
The problem was that libxml2 has a hard-coded limit on the maximum size of files it reads. I got around it by compiling my own libxml2 on my computer, which means that I can now continue trying to run through the 18F+n calculations described in the Bojazi and Meyer paper.