Hi, I noticed a difference between the Input Level and my Mixer Level. For example, if my input level is just about to light the red LED, my Mixer Level always peaks at about 2 LEDs less. Is this normal, or is there a "leak" somewhere?
I'm running the latest version of both client and server.
Thanks. Ciao
You don't say what your operating system is, but there are often multiple places where the sound stream can be attenuated. Have you checked that they are all at 100%?
I have not checked this myself, but I have heard that the levels in Jamulus are meant more as an indication to help avoid clipping than as standard measurements.
There was recently a post where someone measured the levels, but I can't find it offhand.
EDIT:
Here is that post: https://sourceforge.net/p/llcon/discussion/533517/thread/d0904ce74b/#d369
Last edit: DonC 2021-01-14
If you send a steady state signal (e.g., tone), the Input and Mixer levels agree. But normal audio is not steady state; it's peaky. And I understand that the meter updates from the server Mixer are less frequent than from the local Input and may be averaged over more samples. Thus the Mixer meter either misses many peaks or is averaged over a longer period. Classic difference between the VU meter and the Peak Program Meter in the analog world (but that's ancient history).
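The peak-versus-averaged distinction described above can be sketched with two toy meter functions (a hypothetical illustration, not Jamulus's actual metering code): on a steady tone the two readings sit a fixed distance apart, while on a peaky signal the averaging meter reads far below the peak meter.

```python
import math

def peak_dbfs(samples):
    """Peak-style meter: reacts to the single largest sample in the block."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak) if peak > 0 else float("-inf")

def avg_dbfs(samples):
    """Averaging (RMS-style) meter: smooths over the whole block."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

# A steady tone: peak and average stay a fixed ~3 dB apart.
tone = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(4800)]

# A "peaky" signal: mostly quiet with one short transient.
peaky = [0.05] * 4800
peaky[100] = 1.0

print(peak_dbfs(tone), avg_dbfs(tone))    # ~0 dBFS peak, ~-3 dBFS average
print(peak_dbfs(peaky), avg_dbfs(peaky))  # 0 dBFS peak, far lower average
```

This is the same contrast as the analog VU meter (averaging) versus the Peak Programme Meter: neither is wrong, they simply answer different questions.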
I noticed this on OSX, but it is also evident on Windows and Linux. I did this test: I sent a steady 440 Hz signal, and the Mixer Level is always 1 LED less than the Input one. I have attached an image captured during the test. To be honest, this does not concern me; it's just something I noticed and wanted to share. But I do wonder whether it has any relevance when setting the maximum level to be sent to the server.
Last edit: Zyfuss 2021-01-15
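A constant offset of a few dB showing up as exactly one LED is what you would expect if each LED covers a fixed dB span. Here is a sketch with a hypothetical 10-LED meter over a 40 dB range (the actual Jamulus LED scale may differ):

```python
import math

def lit_leds(level_dbfs, num_leds=10, floor_db=-40.0):
    """Map a dBFS level onto a simple LED bar: each LED covers an
    equal dB slice of the range (floor_db, 0]."""
    if level_dbfs <= floor_db:
        return 0
    fraction = min(1.0, (level_dbfs - floor_db) / -floor_db)
    return math.ceil(fraction * num_leds)

# With 10 LEDs over 40 dB, each LED spans 4 dB, so a constant ~3 dB
# offset can land on either side of an LED boundary.
print(lit_leds(-2.0))  # 10 LEDs
print(lit_leds(-5.0))  # 9 LEDs: ~3 dB lower reads one LED less here
```

So even a steady tone can legitimately read one LED apart on two meters if one of them applies a small fixed attenuation or averaging before display.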
I don't have an explanation for it, but I do recognize the symptom.
Last edit: Luuk 2021-01-14
I think there could be some attenuation done in the server to leave headroom when mixing.