From: Jason M. <jas...@vp...> - 2010-11-01 21:54:35
Hi,

As I have mentioned before, I am working on a really cool program for displaying lots of real-time data graphically. I am now working away, so I have turned to running the script on my laptop (which admittedly is a reasonably powerful DTR, though not as powerful as my desktop).

My desktop runs at 125.0 fps pretty much bang on (it's a quad-core 3GHz AMD Phenom with a dual-head Quadro graphics card running XP). But on my laptop the fps jumps all over the place, centring on about 100 fps (this is a 3GHz dual core with a proper internal NVidia graphics card).

Now, the script *HAS* to run at a rate of at least 111.11 fps or it breaks, as the streaming data starts to back up and ceases to be real time. At first I thought perhaps my laptop was simply too slow to process the data - bum!! So I took out the rate(125) line and it went to 8k fps. Hmm, can't be that then. So I changed the rate(125) line to rate(150) and bingo, it now works, with a rate of about 148.0 fps. It is as though the calibration on the rate(x) timer is a bit squiffy? Any clues?

So, it looks like to make it portable I need a continuous feedback adjustment of the rate(x) - perhaps that feature should be part of the code? I've put a rough sketch of what I mean below my sig.

Cheers,
Jason.
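P.S. A minimal sketch of the feedback idea, assuming VPython's rate() and a hypothetical process_frame() standing in for the real per-frame work; the window size and adjustment factors are just guesses:

    import time
    from visual import rate       # VPython's frame-rate limiter

    REQUIRED_FPS = 111.11         # the data stream backs up below this
    target = 125.0                # value currently passed to rate()
    WINDOW = 50                   # frames per measurement window

    def process_frame():
        pass                      # placeholder for the real data handling and drawing

    while True:
        start = time.time()
        for _ in range(WINDOW):
            rate(target)          # ask VPython to cap the loop at 'target' fps
            process_frame()
        measured = WINDOW / (time.time() - start)
        if measured < REQUIRED_FPS:
            target *= 1.1         # falling behind: ask rate() for more headroom
        elif measured > REQUIRED_FPS * 1.3:
            target *= 0.95        # well ahead: back off so we don't waste CPU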