I am using Stage as the new simulation device for our robot control software
at the Technical University of Denmark. It is used in most of our courses,
and it typically means 5 to 10 instances of "libstage" being executed at the
same time on your main server. We use the 2.1.1 tar-ball from sourceforge.
My problem is that when I simulate a laser scanner on the robot, the
performance drops drastically (from x10 real time down to x1.4). When running
multiple instances, the performance is x0.05 to x0.1, which is not very
satisfactory.
As I understand it, the robot HW and sensors are all updated synchronously
now - is that correct? I was wondering if I can subsample the sensors (e.g.
the laser scanner) and in that way enhance performance. The robot HW loop
runs at 100 Hz and the laser scanner at 10 Hz (other sensors at 6 Hz), so
most simulator updates could skip the expensive sensor computations, which
would lower the overall computation time.
Not only would it improve performance, it would also make the simulations
resemble the real world more closely.
Is it possible to improve the performance in other ways, if I do not want
to lower the accuracy?