Hi all. I noticed that some time between the (old) 500 version and the
current r77, a running average of 10 samples, computed every 100 samples,
was introduced to report link quality. This causes a few problems for me.
1) In the old version, I could count on the automatic channel scan to
"clean out" the signal quality, so that when the channel settled on the
configured one, the link quality was (perhaps) low, like "8". Now, if the
channel settles on a clear channel, the link quality remains exactly what
it was for the previous channel. Since link quality is updated indirectly
by rx frames (in zd1205_validate_frame), clear channels can't (re)set
link quality. This breaks my algorithm for clear-channel hunting.
2) I also had running-average code at the application level that does
something similar to the new code in the driver; it is now useless because
the driver's link quality is very chunky (updated only once per 100 rx frames).
My feeling is that link-quality smoothing is an application issue, not a
driver issue. I would be willing to modify the current code so the driver
provides a smooth, instantaneous running average of quality, which would
give better results than the current method and would also reset on channel
scan, which I think should be done in any case.
I propose to modify the link quality / signal strength averaging using an
algorithm similar to the one below, which allows a much quicker response to
apps asking for quality upon essid/channel selection, and a smoother running
average. Any time a channel-change or essid-change ioctl is processed I would
reset avq/avsamp back to 0. Please let me know if this makes sense, and whether
I should ask for CVS write access or submit a patch (how do I do that?)
static int avq = 0;
static int avsamp = 0;

avq += new_link_q;
avsamp++;                 /* was missing: without this, avq/avsamp divides by zero */
link_q = avq / avsamp;
if (avsamp > 32) {        /* this number sets the time-sensitivity of the
                             average, i.e. the damping */
    avq /= 2;
    avsamp /= 2;          /* braces needed so both halvings are conditional */
}