
#11 PLViewer uses much CPU time when minimized

Status: open
Owner: nobody
Labels: None
Priority: 5
Updated: 2012-12-28
Created: 2012-04-02
Creator: Anonymous
Private: No

When PLViewer is minimized, it still consumes as much CPU time as when it is active.

Discussion

  • Christian Ofenberg

    As far as I remember, we don't make the decision how one has to use the engine. So, having an application running in the background is in general a valid situation, even when rendering is used and the window is not visible. The application may render into textures or do other processing; even by using scripts, additional work may be done while an application is minimized, so there can't be a fixed built-in preset configuration deciding that the application has to freeze when it's minimized. Of course, when just using PLViewer, which itself has only a little of its own implementation, to e.g. only view 3D scenes, this behaviour may indeed look odd.

    Maybe it would help to have a method which can be used to tell the application that it is allowed to sleep when it's minimized, so that the application programmer can make this decision on his own. When using PLViewer to run own scripted applications, one would be able to call this method. When using C++ to implement own applications, one would be able to call this method as well, when it's known that the own application does not need to perform background processing.
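
    A minimal sketch of what such an opt-in method could look like (hypothetical names, not part of PixelLight's current API; off by default so existing applications behave as before):

        // Hypothetical application-side switch; none of these names exist in PixelLight today
        class Application {
        public:
            // Tell the application that it may sleep while its main window is minimized
            void SetSleepWhenMinimized(bool bSleep) { m_bSleepWhenMinimized = bSleep; }
            bool GetSleepWhenMinimized() const      { return m_bSleepWhenMinimized; }

        private:
            bool m_bSleepWhenMinimized = false;  // Off by default: no change for existing applications
        };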

  • Anonymous

    Anonymous - 2012-04-07

    Why did you decide to use the processor all the time by default? Why not use it only on demand? Or is that technically difficult?

  • Anonymous

    Anonymous - 2012-04-07

    Or do you do it this way because you want the process to always have enough time to perform its tasks? But then why not raise the priority of the process instead?
    Actually, I was writing about PLViewer; for a scene viewer this is clearly a glitch.

  • Christian Ofenberg

    Deciding to use the processor by default all the time - is this really something one can decide when waking up in the morning? Did I miss anything in the last 20 years of playing around with software development? As far as I'm concerned, it's a software engineering topic. If it's possible to use an event-driven architecture, sure, that may be a good thing. PixelLight itself also uses such an approach in many places to only do work when it really has to be done. On the other hand, there are things like a "main loop" which are a usual approach. Having a 100% event-driven approach would be impossible for PLViewer without heavily limiting its usage. PLViewer itself consists, by design, of just a few lines of its own code - see it as an interface to the provided application framework, which is itself just a layer on top of other PixelLight components (-> software engineering topic as mentioned). So, in my humble opinion, it's no "glitch" in PLViewer, nor in the application framework, nor in PixelLight as a whole. It's a matter of having something universal instead of aiming at a single spot.

    Sure, if you "just" want to view a static scene and know exactly that there are e.g. just the WASD keys to move the camera, one can react when one of those keys is pressed and only then perform an update. This is of course also possible when creating an application using PixelLight, and it has already been done for one application (meaning 0% CPU usage when nothing happens on screen). But this would be far from universal; it's up to an application programmer to decide to create such a highly specialized application for a certain purpose. It's nothing we want to have within the PixelLight project itself, and nothing a simple generic application like PLViewer has to be concerned about.
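
    A rough sketch of such a specialized, on-demand approach (generic C++ with hypothetical hooks, not PixelLight API): redraw only when input or a scene change has marked the view as dirty.

        #include <atomic>

        // Sketch only: assumes some window/input layer calls these hooks
        std::atomic<bool> g_sceneDirty{true};   // Start dirty so the first frame is drawn

        void OnCameraKeyPressed() { g_sceneDirty = true; }   // e.g. a WASD key moved the camera
        void OnSceneChanged()     { g_sceneDirty = true; }   // animation, physics, script, ...

        void UpdateStep()
        {
            if (!g_sceneDirty.exchange(false))
                return;               // Nothing changed -> no update, no rendering, ~0% CPU
            // UpdateScene();
            // RenderFrame();
        }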

    So, from my point of view, the only discussable point is whether or not to pause an application by default when its window is minimized. This would of course assume that an application has only one window, meaning it would be something we would have to hack into PLViewer directly - which would already limit its universal usage by putting restrictions onto the system and making assumptions about how it is used, i.e. baking usage-specific information into the generic viewer application, which is itself designed to be just a thin layer over the thin application framework layer. It would also require adding additional logic to PLViewer; I really feel quite uncomfortable when thinking about all the consequences of introducing such special cases.
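
    For illustration, a rough sketch of what pausing while minimized could look like in a single-window application (generic C++, hypothetical helpers, not how PLViewer currently works):

        #include <atomic>
        #include <chrono>
        #include <thread>

        std::atomic<bool> g_running{true};
        std::atomic<bool> g_minimized{false};   // Assumed to be maintained by the window system layer

        void UpdateScene(float) {}   // Stubs standing in for the real scene update and rendering
        void RenderFrame()      {}

        void MainLoop(bool bSleepWhenMinimized)
        {
            using clock = std::chrono::steady_clock;
            auto last = clock::now();
            while (g_running) {
                if (bSleepWhenMinimized && g_minimized) {
                    // Minimized and allowed to sleep: yield the CPU instead of spinning
                    std::this_thread::sleep_for(std::chrono::milliseconds(100));
                    last = clock::now();   // Avoid a huge time step on wake-up
                    continue;
                }
                const auto now = clock::now();
                UpdateScene(std::chrono::duration<float>(now - last).count());
                last = now;
                RenderFrame();
            }
        }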

  • Anonymous

    Anonymous - 2012-04-12

    Okay. I think I understand your point of view.
    Maybe I don't know enough and I don't understand something, but I mean this:
    As I understand it, the main load is the visualization?
    1. But if the window is minimized, there is no need to visualize. It is enough to handle what is happening in the scene (animation, physics, etc.). If nothing happens in the scene, then loading the processor makes no sense. Or is it impossible to know when something happens in the scene?
    2. In the case of two windows, if one window is minimized, then why render that window?
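
    A minimal sketch of that idea (hypothetical helpers, generic C++, not how PLViewer is currently structured): keep updating the scene while minimized, but skip the render step.

        #include <atomic>

        std::atomic<bool> g_windowMinimized{false};  // Assumed to be maintained by the window layer

        void UpdateScene(float) {}   // Stub: animation, physics, scripts
        void RenderFrame()      {}   // Stub: drawing

        void Frame(float dt)
        {
            UpdateScene(dt);             // The scene keeps running while minimized...
            if (!g_windowMinimized)
                RenderFrame();           // ...but nothing is drawn into an invisible window
        }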

