Hi,
I'm not a programmer - I'm just a geriatric user of a 32-bit, 2D model railway track planning application that uses GLScene to do a 3D render of the finished design.
The issue is that I encounter RAM out-of-memory errors when rendering larger designs. With the video driver disabled, the problem does not occur and memory usage is minimal. I have tested this across five different Windows 10 machines, ranging from a budget Lenovo laptop to a brand new system with an RTX 3060.
Compared to memory use with the video adapter disabled, the additional memory used is about double for the Lenovo, about 10X for an RX 570 system, and even more for the RTX 3060, with the time taken to render the scene increasing by even larger factors. Enabling the Large Address Aware flag lets me get a bit further, but it is still a major hassle to have to continually enable and disable the video driver depending on what I'm doing.
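(For any developer reading along: as I understand it, the same Large Address Aware flag can also be baked into a 32-bit Delphi executable at build time rather than patched on afterwards. A minimal sketch - the project name here is hypothetical:)

    program TrackPlan3D; // hypothetical project name

    uses
      Winapi.Windows;

    // Mark the 32-bit EXE "Large Address Aware" at link time, so it can
    // address up to 4 GB of user space on 64-bit Windows instead of 2 GB.
    {$SETPEFLAGS IMAGE_FILE_LARGE_ADDRESS_AWARE}

    begin
      // application startup would go here
    end.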
Filling 3 GB of RAM slowly over 12 seconds or so seems excessive for a simple static image of basic shapes with plain textures - especially when the same job completes quickly using only 100-200 MB with the video adapter disabled. Playing with all the 3D video settings I am aware of doesn't seem to change anything.
I have been in touch with the developer and he hasn't been able to determine what the issue is, but I'd like a workaround.
Is this a known issue? Is there any way an end user can force software rendering? Is there a registry key or setting that GLScene checks when deciding between hardware and software rendering? Is there some sort of switch that a developer could include in the app that would allow the end user to force software rendering?
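(To make that last question concrete, here is a rough sketch of what such a switch might look like at the Win32 level. This is not a GLScene API, and the unit and function names are hypothetical; the idea is that a pixel format flagged PFD_GENERIC_FORMAT but not PFD_GENERIC_ACCELERATED is served by Microsoft's software OpenGL implementation, so an app could select such a format before creating its GL context:)

    unit ForceSoftwareGL; // hypothetical unit name

    interface

    uses
      Winapi.Windows;

    function FindSoftwarePixelFormat(DC: HDC): Integer;

    implementation

    function FindSoftwarePixelFormat(DC: HDC): Integer;
    var
      pfd: TPixelFormatDescriptor;
      i, Count: Integer;
    begin
      Result := 0; // 0 means no purely software format was found
      FillChar(pfd, SizeOf(pfd), 0);
      // The first call also reports how many pixel formats the DC supports.
      Count := DescribePixelFormat(DC, 1, SizeOf(pfd), pfd);
      for i := 1 to Count do
      begin
        DescribePixelFormat(DC, i, SizeOf(pfd), pfd);
        // Generic, non-accelerated formats belong to Microsoft's software
        // OpenGL implementation rather than the video driver.
        if ((pfd.dwFlags and PFD_GENERIC_FORMAT) <> 0) and
           ((pfd.dwFlags and PFD_GENERIC_ACCELERATED) = 0) and
           ((pfd.dwFlags and PFD_SUPPORT_OPENGL) <> 0) and
           ((pfd.dwFlags and PFD_DRAW_TO_WINDOW) <> 0) then
        begin
          Result := i;
          Exit;
        end;
      end;
    end;

    end.

(The returned index would then be handed to SetPixelFormat before the rendering context is created.)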
Hello Mark,
Yes, this problem is known, though it is hard to say why it happens. For example, about 10 years ago, after migrating from Delphi 7 to RAD XE, the rendering of the fractal water in https://sourceforge.net/p/glscene/code/HEAD/tree/branches/Examples/Terrains/LandscapePackage/04-FractalArchipelago/ became very slow for some reason, while the other demos in our glscene/rendering folder kept working well.

Since your application is 2D, try using the additional external Graphics32 package (TImage32) from https://sourceforge.net/p/glscene/code/HEAD/tree/trunk/external/GR32/ . To do this, install the Graphics32 package, remove the period in {.$DEFINE USE_GRAPHICS32} in https://sourceforge.net/p/glscene/code/HEAD/tree/trunk/Source/GLScene.inc#l13, then recompile and reinstall the GLScene packages.
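To make the change concrete, the edit in GLScene.inc is just removing the leading period that keeps the define disabled:

    // Source/GLScene.inc - as shipped, the leading period turns the
    // directive into an inert comment, so Graphics32 support is off:
    {.$DEFINE USE_GRAPHICS32}

    // Deleting the period activates the define; after that, recompile
    // and reinstall the GLScene packages:
    {$DEFINE USE_GRAPHICS32}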
Pavel