Is there a way to force the items being encoded to stay grouped together in the queue instead of being spread out towards the end? I lose multiple encoding streams of crunching power as they fall off the list instead of picking up what's left. A multi of 7 may end up with only 2 streams still running while there is a lot left to do. Yes, the list is sorted by size.
Despite my efforts to keep the algorithm from spreading through the list, it seems this happens after some time of encoding. It starts by encoding in blocks of the selected maxthreads count.
Anyway, it will always encode as many files simultaneously as the maxthreads count.
Anyway, it will always encode as many files simultaneously as the maxthreads count.
That does not hold true for me. If I have 2 or more encodes going (it depends on what I'm encoding), then at some point they get spread out like Paul is talking about, and there will be 2, 3 or 4+ files still queued between the active encodes. When one completes it jumps to the next, but at times there is another active encode further down the queue; once a stream finishes all the queued files before that one, that encode stream will just stop. Then I have only 1 encode running, or 2, but at times I definitely lose an encoding pipe, like it ran into a wall when it hit an item in the queue that was already completed.
It seems that the loop partitions the whole file list into blocks of n threads (4 here), and after a while it begins spreading to the first file of every n-block.
Rewriting this routine is out of scope for me. Multi-file should only be used when encoding audio only, or when using GPU encoding that supports more than one parallel encode. In any other case, it is a waste of resources.
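To make that easier to picture, here is a minimal sketch in Python of that kind of block partitioning; the names (partition_into_blocks, MAXTHREADS) are invented for illustration and do not come from the program's source:

```python
# Minimal sketch of block-style scheduling, not the program's actual code.
# Assumes a queue already sorted by size and 4 encoding slots (maxthreads).

MAXTHREADS = 4

def partition_into_blocks(files, maxthreads=MAXTHREADS):
    """Split the queue into consecutive blocks of maxthreads files each."""
    return [files[i:i + maxthreads] for i in range(0, len(files), maxthreads)]

queued = [f"file{i:02d}.mkv" for i in range(13)]   # 13 queued files
for slot, block in enumerate(partition_into_blocks(queued)):
    print(f"slot {slot} works through: {block}")
# slot 0 works through: ['file00.mkv', 'file01.mkv', 'file02.mkv', 'file03.mkv']
# slot 1 works through: ['file04.mkv', 'file05.mkv', 'file06.mkv', 'file07.mkv']
# slot 2 works through: ['file08.mkv', 'file09.mkv', 'file10.mkv', 'file11.mkv']
# slot 3 works through: ['file12.mkv']
```

Once a slot is tied to its own block like this, it goes idle as soon as that block is drained (slot 3 above stops after a single file) even though the other slots still have files queued, which matches the "lost encoding pipe" symptom reported above.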
or when using GPU encoding that supports more than one parallel encode
That is the only time I use it; the GPU in this machine supports 5 parallel encoding sessions and I am only using 4. I'm not complaining, Abel, your program is amazing and I love it, I'm just reporting back is all. It usually only happens towards the end of the queue, so if I understand you, the queue is broken into 4-file blocks, and that makes sense: near the end of the queue a NEW encoding session inside a new 4-file block is not possible, so the existing encoding sessions have to finish out their blocks? Is that what you are saying?
Like I said, not complaining, just wanting to understand.
Thanks as always Abel.
Like right now: there are supposed to be 4 encodes running, but only 3 are.
It's ok, it's just that such behaviour annoys me; it's not what it is supposed to do.
This beta executable may fix this behaviour.
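For reference, the behaviour being asked for is what a single shared work queue gives: each of the maxthreads slots simply takes the next pending file, so every slot stays busy until nothing is left. This is only a hedged sketch of that idea in Python with invented names (encode, worker); it is not the beta's actual implementation:

```python
# Sketch of the expected behaviour: one shared queue, maxthreads workers.
# Not the beta's actual code; encode() is a placeholder for launching a job.

import queue
import threading

MAXTHREADS = 4

def encode(path):
    print(f"encoding {path}")            # stand-in for running one encode

def worker(pending):
    while True:
        try:
            path = pending.get_nowait()  # take whatever is left, in order
        except queue.Empty:
            return                       # a slot stops only when the queue is empty
        encode(path)
        pending.task_done()

pending = queue.Queue()
for i in range(13):                      # same 13-file, size-sorted queue as before
    pending.put(f"file{i:02d}.mkv")

workers = [threading.Thread(target=worker, args=(pending,)) for _ in range(MAXTHREADS)]
for t in workers:
    t.start()
for t in workers:
    t.join()
```

With this scheme no slot ever hits an already-finished item and stalls; a slot only exits once the shared queue is truly empty.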