After starting the application, the CPU shows 110% usage (on a 4-core CPU) while doing nothing.
And when the simulation starts, it increases to nearly 220%.
It seems that everything executes in Python.
Is it possible to execute the simulation code in a more efficient way?
From: Subhasis Ray <ray.subhasis@gm...> - 2012-11-23 04:50:09
On Fri, Nov 23, 2012 at 12:06 AM, Saeed Shariati <s.shariati@...> wrote:
> After starting the application, the CPU shows 110% usage (on a 4-core CPU) while doing nothing.
By default MOOSE runs one "command thread" and as many "process threads" as
there are cores. The worker threads poll a queue for data in a busy loop,
and that is what causes the high CPU usage. This design is intended for
heavy simulations where different components of a large model are simulated
on different threads (and cores). You can set the environment variable
NUMPTHREADS=1 to use only one process thread.
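For example, a minimal sketch (this assumes the variable is read when the
moose module is first imported; exporting NUMPTHREADS=1 in your shell before
launching Python works just as well):

    import os
    os.environ['NUMPTHREADS'] = '1'  # must be set before moose is imported
    import moose                     # moose now starts a single process thread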
> And when the simulation starts, it increases to nearly 220%.
That is understandable.
> It seems that everything executes in Python.
No. Python just sets up the simulation and then hands control to C++.
When you call moose.start(time) in Python, a C++ function runs the
entire simulation.
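As an illustration, here is a minimal sketch (the model details here are
hypothetical, but the pattern is the same for any MOOSE script): everything
before moose.start() is bookkeeping in Python; the simulation loop itself
runs in C++.

    import moose

    # Model setup happens in Python: create a passive compartment.
    soma = moose.Compartment('/soma')
    soma.Rm = 1e9    # membrane resistance, ohms
    soma.Cm = 1e-11  # membrane capacitance, farads

    moose.reinit()   # initialize the C++ scheduler
    moose.start(0.1) # the whole 0.1 s simulation runs in C++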
> Is it possible to execute the simulation code in a more efficient way?
You can look at the regressionTests directory for some examples where the
simulations are executed in C++ without Python.
I would also suggest you read the Programmer's Guide, the application
programming API, and the design document to get a handle on the internals
of MOOSE.