Theoretical Wi-Fi decay
- by lithiium
Is there a way to calculate, at least theoretically, how the bandwidth of a Wi-Fi link decays as a function of signal strength? For example, I know I can theoretically expect 54 Mbps from 802.11g at 100% signal; what bandwidth should I expect at 30% signal? Is the relationship linear, or does the rate stay the same regardless of signal?
I could not find any source for this, but considering the error retries (retransmissions) involved, I guess it should be possible to calculate something like this. Does anybody know?
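To illustrate the kind of relationship I'm asking about, here is a rough sketch using the Shannon–Hartley capacity formula as an idealized upper bound. It assumes the signal figure can be mapped to an SNR in dB, which is itself an assumption (the percentage a driver reports is vendor-specific); the channel width of 20 MHz is the standard 802.11g channel.

```python
import math

def shannon_capacity_mbps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley upper bound: C = B * log2(1 + SNR), returned in Mbps."""
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

# 802.11g uses a 20 MHz channel; the SNR values below are illustrative only.
BANDWIDTH_HZ = 20e6

for snr_db in (30, 20, 10, 5, 0):
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    ceiling = shannon_capacity_mbps(BANDWIDTH_HZ, snr_linear)
    print(f"SNR {snr_db:>2} dB -> theoretical ceiling ~{ceiling:.0f} Mbps")
```

If that formula applies, the decay would be logarithmic in SNR rather than linear, and real 802.11g would sit well below the Shannon ceiling anyway, stepping down through its discrete rates (54, 48, 36, 24, 18, 12, 9, 6 Mbps) as the signal worsens rather than degrading smoothly. Is that roughly the right way to think about it?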