
Force software rendering at run time


    Mark Oxford - 2022-12-01

    Hi,
    I'm not a programmer - I'm just a geriatric user of a 32-bit, 2D model railway track planning application that uses GLScene to do a 3D render of the finished design.

    The issue is that I encounter out-of-memory errors when rendering larger designs. If I disable the video driver, the problem does not occur and memory usage is minimal. I have tested this on five different Windows 10 machines, ranging from a budget Lenovo laptop to a brand new system with an RTX 3060.

    Compared to memory use with the video adapter disabled, the additional memory used is about double on the Lenovo, about 10x on an RX 570 system, and even more on the RTX 3060, with the time taken to render the scene increasing by even larger amounts. Enabling the Large Address Aware flag lets me get a bit further, but it is still a major hassle to have to continually enable and disable the video driver depending on what I'm doing.

    Filling 3 GB of RAM slowly over 12 seconds or so seems excessive for a simple static image of basic shapes with plain textures - especially when the same job completes quickly using only 100-200 MB with the video adapter disabled. Changing every 3D video setting I am aware of doesn't seem to make any difference.

    I have been in touch with the dev and he hasn't been able to determine what the issue is, but I'd like a workaround.

    Is this a known issue? Is there any way an end user can force software rendering? Is there some registry key or setting that GLScene checks when deciding whether to use hardware or software rendering that might be useful? Is there some sort of switch that a dev could include in the app that would allow the end user to force software rendering?
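
    For the dev's benefit, and with the caveat that I'm well outside my comfort zone here, the kind of switch I'm imagining might look something like the sketch below on the plain Windows/OpenGL side: before creating its OpenGL context, the app would deliberately pick a pixel format supplied by the "GDI Generic" software implementation rather than by the graphics card driver. GLScene itself is Delphi, so this C sketch is only meant to illustrate the idea, and the details may well be wrong:

        #include <windows.h>

        /* Sketch only, not GLScene code: find a pixel format backed by the
           Windows "GDI Generic" software renderer. A format with
           PFD_GENERIC_FORMAT set and PFD_GENERIC_ACCELERATED clear is pure
           software rendering. */
        int choose_software_pixel_format(HDC hdc)
        {
            PIXELFORMATDESCRIPTOR pfd;
            /* Passing NULL for the descriptor returns the number of formats. */
            int count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL);

            for (int i = 1; i <= count; ++i) {
                DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

                int software = (pfd.dwFlags & PFD_GENERIC_FORMAT) &&
                              !(pfd.dwFlags & PFD_GENERIC_ACCELERATED);
                int usable   = (pfd.dwFlags & PFD_SUPPORT_OPENGL) &&
                               (pfd.dwFlags & PFD_DRAW_TO_WINDOW) &&
                               (pfd.iPixelType == PFD_TYPE_RGBA);

                if (software && usable)
                    return i;  /* use with SetPixelFormat before wglCreateContext */
            }
            return 0;          /* no software-only format found */
        }

    If that is roughly right, then all the application would need is a checkbox or command-line option that routes context creation through something like the above instead of the normal hardware pixel format selection.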
