Advocates of the emerging 802.11n Wi-Fi standard hail its promise of faster speeds and greater range. Although that’s enticing, it’s not enough to convince large organizations to toss aside their tangled web of wires. Improved Wi-Fi performance doesn’t deliver enterprise-scale opportunities unless it comes with improved reliability.
The problem of reliability on the unlicensed spectrum bands is potentially greater with 802.11n than with earlier Wi-Fi specifications. The emerging standard’s improved range and speed stem largely from the addition of multiple-input, multiple-output (MIMO) technology, which uses many receivers and transmitters in clients and access points. One MIMO configuration, spatial multiplexing, allows multiple data streams to flow simultaneously on one channel of bandwidth.
While the increased number of radios — and of data streams sent between transmitter and receiver — boosts throughput, it is also likely to exacerbate interference, not to mention the cost and complexity of dealing with it.
“The key thing with 802.11n is that there are more things that can actually go wrong,” says Bill Kish, co-founder and chief technology officer of Ruckus Wireless, a maker of smart antenna systems. “Potentially what you have here is a quadrupling of throughput, but in reality the percentage of time you can use these technologies is going to be extremely environmentally dependent.”
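Kish's point can be put into a toy model (the numbers and the linear blend below are illustrative assumptions, not vendor figures): nominal throughput scales with the number of spatial streams, but effective throughput depends on what fraction of the time the radio environment actually supports all of them.

```python
# Toy model of MIMO spatial-multiplexing throughput.
# All figures are hypothetical; real 802.11n rates depend on channel
# width, guard interval, modulation and the radio environment.

def effective_throughput(base_rate_mbps, streams, usable_fraction):
    """Blend the full multi-stream rate (available `usable_fraction`
    of the time) with the single-stream fallback rate."""
    nominal = base_rate_mbps * streams
    return nominal * usable_fraction + base_rate_mbps * (1 - usable_fraction)

# Clean environment: four streams usable 90% of the time.
clean = effective_throughput(54, 4, 0.9)   # close to the 216 Mbps nominal
# Noisy environment: four streams usable only 20% of the time.
noisy = effective_throughput(54, 4, 0.2)   # well below nominal
print(clean, noisy)
```

The same four-radio system delivers very different results in the two cases, which is the sense in which the quadrupling is "extremely environmentally dependent."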
One way around the interference problem is the use of agile antennas that adapt to the changing radio frequency environment. Ruckus’ BeamFlex technology, found in network components from Netgear and other Wi-Fi product manufacturers, provides an antenna array with six directional antenna elements that can form up to 63 different patterns. The system continuously evaluates the optimal antenna pattern for each receiving device, based on application-level criteria, such as jitter and throughput, and it reconfigures itself to send signals around interference.
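The figure of 63 patterns follows directly from the six elements: every non-empty combination of elements is a candidate pattern, and 2^6 − 1 = 63. The sketch below shows where that number comes from and how a selection loop over an application-level metric might look; it is not Ruckus's actual algorithm, and the scoring function is a placeholder.

```python
from itertools import combinations

ELEMENTS = range(6)  # six directional antenna elements

# Every non-empty subset of elements is a candidate pattern: 2^6 - 1 = 63.
patterns = [subset for r in range(1, 7)
            for subset in combinations(ELEMENTS, r)]
print(len(patterns))  # 63

def best_pattern(score):
    """Pick the pattern scoring best on an application-level metric,
    e.g. measured throughput minus a jitter penalty (hypothetical)."""
    return max(patterns, key=score)
```

A real system would continuously re-measure per receiving device and reconfigure on the fly; this only shows the combinatorics and the shape of the selection step.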
“Antennas are about balancing the trade-offs of price, size and performance,” Kish says, adding that the optimal trade-off for 802.11n differs somewhat from that for the previous version of the standard because its requirements are more stringent. In some cases, optimal performance with 802.11n requires more help, which could be expected to come in the form of larger, more expensive antenna systems, he says.
Alternatively, the BeamFlex antenna technology can be added to 802.11g products, generating the benefits of MIMO without the multiple radios, which account for much of the added cost of 802.11n systems, Kish says.
Smart antenna systems are a good start toward making Wi-Fi more reliable, says Dale Hatfield, former chief technologist at the Federal Communications Commission and now an independent consultant. More concerted coordination among users, particularly in controlled environments, can also help, he says.
“You can act as the spectrum manager in an environment where you have some control,” he says, adding that somewhat insular sites provide the best opportunity. “If I was on a military base and I could have some control over what other unlicensed devices are used, it can be a reliable system.”
In less controlled environments, quality of service can sometimes be achieved by segregating classes of users, says Miguel Pagan, CISSP, a project manager with Eagan, McAllister Associates on assignment at the Navy’s Bureau of Medicine and Surgery. “With 802.11n, we’ve heard the horror stories about heavy interference. In the health-care environment, that’s something we’re concerned about,” Pagan says, adding that competing applications include medical data available to doctors, location tracking and Internet access for patients and families. “You don’t want [interference] around pediatric electrocardiograph monitors, for example.”
The ability to separate different types of traffic will become critical as agencies deploy Wi-Fi systems that support many competing applications, he says, adding that some vendors are addressing the reliability questions more aggressively than others.
One way to segregate users is to assign them to separate channels on the LAN. Agencies that host contractors and other guest users can then apply different security policies to external groups that are already quarantined. Extricom, a maker of wireless LAN systems, offers this kind of physical separation of user groups.
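The segregation idea reduces to a mapping from user groups to dedicated channels and security policies. The sketch below is a minimal illustration of that mapping; the group names, channel numbers and policy strings are hypothetical, and this is not Extricom's configuration interface.

```python
# Hypothetical group-to-channel assignment for segregating user classes.
# Channel numbers are 5 GHz examples; policies are illustrative labels.
GROUPS = {
    "staff":       {"channel": 36,  "policy": "wpa2-enterprise"},
    "contractors": {"channel": 44,  "policy": "wpa2-enterprise, restricted vlan"},
    "guests":      {"channel": 149, "policy": "captive portal, internet-only"},
}

def channel_for(group):
    """Look up the dedicated channel for a user group."""
    return GROUPS[group]["channel"]

print(channel_for("guests"))  # 149
```

Because each group lives on its own channel, a policy applied to the contractor group never touches staff traffic, which is the quarantine effect described above.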
The capacity bump between 802.11g and the forthcoming 802.11n would propel Wi-Fi throughput from 54 Mbps to 248 Mbps.
The deployment of enterprisewide Wi-Fi — with potentially scores of competing applications and scores of access points — raises questions not only of reliability but also of manageability.
“Looking ahead, we know that customers of all kinds and all verticals will want to see what this looks like on a systems level,” says David Confalonieri, an electrical engineer and vice president of marketing at Extricom. “If I have five access points and only one channel, where do I put the channel, at which access point, and what do I do with the other four?”
Extricom addresses the problem, in part, through plug-and-play, interchangeable access points that contain no software and require no configuration. The access points connect to Extricom’s wireless LAN switches, where the system intelligence resides.
“The way we use the channels allows us to overcome the systems-level challenges that existed with 802.11a/b/g, which I think will become more pronounced in the 802.11n era,” Confalonieri says. Extricom’s architecture, called channel blanketing, makes a given channel available to all the access points in the LAN, creating one large, contiguous Wi-Fi zone with seamless mobility.
“As an IT person, you’re not trying to decide which access point is trying to use which channel. You’re letting the network deal with avoiding co-channel interference,” Confalonieri says. “You can make wireless look like wire without tearing your hair out.”