Is it correct that madwifi (or the card itself) does the association at the
lowest available bitrate?
It seems so. I have done some tests WITHOUT hostapd/wpa_supplicant to verify
that this is a madwifi issue...
It also seems that the madwifi driver averages the signal level in the
output of wlanconfig ath0 list and iwlist ath0 scan.
I've done some lab tests
(because of my bad throughput, see my mail from 23.3., Re: [Madwifi-devel]
Throughput tests with madwifi and hostap/wpasupplicant)
and noticed the following.
If I set the bitrate to 6 Mb on a station, I see a signal level of, say, -22
for that station on the master (output of wlanconfig ath0 stats on the master).
Pinging the master from the client does not change the signal value the
master reports.
But if I set the bitrate to 54 Mb, the master still shows a signal of -22.
Now pinging the master from the client does change the signal, from -22 to
-28, but only "slowly", i.e. the reported level drops by 1 dBm every 2-4
seconds.
And finally, if I set the bitrate back to 6 Mb, the signal gains 1 dBm but
stays at -27 dBm until I start a ping; then it moves back the same way,
step by step...
I have also done the same test with kickmack after setting the bitrate, and
the behaviour is the same (i.e. first a level of -22, then after a ping the
signal drops to -28).
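This slow, stepwise drift would be consistent with the driver (or HAL) keeping an exponential moving average of per-packet RSSI instead of reporting the raw value. That is only my guess, not something I have confirmed in the madwifi source; the smoothing factor below is hypothetical. A minimal sketch of how such averaging would produce exactly the behaviour I saw:

```python
# Sketch of exponential-moving-average (EMA) smoothing of RSSI.
# ASSUMPTION: madwifi/HAL may update its reported signal roughly like
# this; alpha=0.1 is an invented value, not taken from the driver code.

def smooth(reported, sample, alpha=0.1):
    """Blend a new per-packet RSSI sample into the reported value."""
    return (1 - alpha) * reported + alpha * sample

reported = -22.0          # value shown while no traffic is flowing
real = -28.0              # actual per-packet RSSI at the higher bitrate

for packet in range(60):  # e.g. one ping reply per second for a minute
    reported = smooth(reported, real)

# Each update only closes a fraction of the gap, so the displayed value
# creeps toward -28 in small steps (roughly the "1 dBm every few
# seconds" I observed) and needs tens of samples to converge.
print(round(reported, 1))  # → -28.0
```

With such smoothing, a single ping barely moves the value, which would also explain why you have to keep pinging for on the order of 20 seconds before the display settles.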
I have always known that you should run a ping to see the "real" signal
value, but I was never aware that you must ping for at least 20 seconds to
see it. So does madwifi really average the signal level, or is it the HAL
or the hardware that does it?
BTW: I see this behaviour with CM9 cards, but not with SR5 cards, because
their output changes only 1 dBm from 6 Mb to 54 Mb, which is very strange,
but that is a problem for another email :-)