Why Do Standards Place Limits on Data Transfer Rates?
Posted by Mehrdad on Super User
Published on 2011-06-25T06:49:18Z
This is a rather general question about hardware and standards:
Why do they place limits on data transfer rates and disallow manufacturers from exceeding those rates? (E.g. 54 Mbit/s for Wireless G, 150 Mbit/s per stream for Wireless N, ...)
Why not allow for some sort of handshake protocol, whereby the devices being connected agree on the maximum throughput each of them supports, and use that value instead? Why does there have to be a hard-coded limit, requiring a new standard for every single improvement to a data rate?
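For concreteness, here is a minimal sketch (in Python) of the kind of capability exchange the question envisions. The names (`negotiate_rate`, the example rate lists) are hypothetical illustrations, not part of any real wireless standard: each device advertises the rates it supports, and the pair settles on the fastest rate they have in common, with no ceiling baked into the protocol itself.

```python
def negotiate_rate(local_rates, peer_rates):
    """Pick the fastest rate both sides support.

    local_rates / peer_rates: iterables of supported rates in Mbit/s.
    Returns the highest common rate, or None if there is no overlap.
    (Hypothetical sketch; no real standard's handshake is shown here.)
    """
    common = set(local_rates) & set(peer_rates)
    return max(common) if common else None


# Example: a newer device meets an older one, and they fall back to the
# fastest rate they share rather than a rate fixed by the standard.
new_device = [54, 150, 300, 600]   # hypothetical advertised rates, Mbit/s
old_device = [11, 54, 150]

print(negotiate_rate(new_device, old_device))  # -> 150
```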