Volume 2, Number 10
November 1997
A Geek in Geek Land
This is the greatest network in the world. You don't get a single call during the weekend, because you tore the whole thing down on Friday.
If you are an Elvis fan, you go to Graceland. If you love surfing, you go to Hawaii's North Shore. If you love racing, you dream of Indianapolis. And if you are a wirehead, you make your reservations in advance for Networld+Interop.
For the uninitiated, Networld is where networking vendors come to show off their newest products, top notch experts come to give keynote addresses and teach classes, and several thousand people show up to score free T-shirts, get away from work, and find out what new wonders (and problems) they will be working with in six months.
There are lots of good things I could be talking about after this year's Networld+Interop in Atlanta. I could be talking about the many conversations I had, the jobs I was offered or the exhibitions I saw, but instead I want to give you the true heart of Networld, which is making things work, and having fun doing it. The best way to tell you about this is to talk about the NOC.
Behind all the glitz and all the demonstrations lies a truly amazing network. This network is developed, staged, deployed and destroyed by a handful of networking engineers working as volunteers in the Network Operations Center (NOC). The NOC is the heart of the entire Networld network, known as the Interopnet, which is set up on Monday, debugged on Tuesday, maintained through the show, and torn down on Friday.
The Interopnet is connected to the rest of the Internet through two DS3 lines, one tying into a 3Com router and the other into a Bay Networks router. BGP is used at the border to keep the Interopnet's internal routing tables separate from those of ICON, the upstream ISP, while OSPF is the only routing protocol used within Networld.
Each exhibitor was given a segment with a /22 subnet mask, which caused a great deal of confusion at most booths. The more familiar subnet mask is 24 bits, under which network segments increment in the third octet of an IP address — for example, 192.168.1.0, 192.168.2.0, 192.168.3.0, and so on. With a 22-bit mask, each subnet spans four of those third-octet blocks, so addresses fall outside the normally anticipated ranges: a gateway at, say, 192.168.4.1 would serve the entire range from 192.168.4.0 through 192.168.7.255. This caused most of the NOC's problems as exhibitors tried to connect their equipment to the Networld backbone.
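A quick sketch of the confusion, using Python's standard `ipaddress` module (the addresses here are purely illustrative, not the actual Interopnet assignments):

```python
import ipaddress

# A familiar /24: one third-octet block, 256 addresses.
net24 = ipaddress.ip_network("192.168.4.0/24")
print(net24.num_addresses)           # 256
print(net24.broadcast_address)       # 192.168.4.255

# A /22 spans FOUR third-octet blocks (.4 through .7).
net22 = ipaddress.ip_network("192.168.4.0/22")
print(net22.num_addresses)           # 1024
print(net22.broadcast_address)       # 192.168.7.255

# To an exhibitor thinking in /24 terms, 192.168.5.9 looks like a
# different network -- but it sits inside the same /22 segment.
print(ipaddress.ip_address("192.168.5.9") in net22)  # True
```

An exhibitor who typed in a 255.255.255.0 mask out of habit would find hosts in the "other" third-octet blocks of their segment unreachable, which is exactly the class of problem that kept the NOC busy.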
The backbone itself was mostly switched, for the first time in Networld history. In past years, typical store-and-forward forwarding was used to transfer IP packets between nodes; with cut-through switching, a packet's header is read and the packet is already being sent out the appropriate port before the entire packet has been received. Further, many nodes on the backbone were ATM (Asynchronous Transfer Mode), which takes a fundamentally different approach to internetworking. Each ATM cell is exactly 53 bytes long, a figure chosen as a compromise between the larger cells that favor bulk data transfer and the smaller cells that favor smooth video and audio. Because the cells are a fixed length, ATM can do its cell switching in hardware, which is inherently faster than switching in software. Permanent (statically configured) virtual circuits were used between ATM nodes, with PNNI used to connect to exhibitor ATM hardware.
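The fixed cell size has a cost you can work out on the back of an envelope. A minimal sketch, using the cell dimensions from the ATM specification (53 bytes total, 5 of them header):

```python
import math

CELL_SIZE = 53                 # total bytes per ATM cell
HEADER = 5                     # header bytes per cell
PAYLOAD = CELL_SIZE - HEADER   # 48 payload bytes per cell

def cells_needed(packet_bytes: int) -> int:
    """Number of fixed-size ATM cells required to carry one packet."""
    return math.ceil(packet_bytes / PAYLOAD)

# A full 1500-byte Ethernet frame needs 32 cells, so 32 * 53 = 1696
# bytes actually cross the wire: roughly 13% overhead.
packet = 1500
n = cells_needed(packet)
print(n, n * CELL_SIZE)        # 32 1696
```

That overhead (the "cell tax") is the price paid for cells uniform enough to switch in silicon, which is the trade-off the paragraph above describes.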
The cable plant was one of the best-designed layouts I have ever seen. Pink cables ran the length of the exhibition halls, each containing bundles of 30 fiber pairs. These fiber pairs were spliced into a single connector invented by another NOC team member, Dave Steele, for the Northrop B-2 bomber program. Alongside these ran 3-pair multimode fiber cables (all the typical orange), tying the higher-speed ATM nodes together at OC-12 speeds. Blue CAT 5 twisted-pair drops were stretched down to most exhibitor booths, each with a length of bungee cord tied into the middle. The bungee cord let the tear-down team quickly pull their cables above the 15-foot clearance required to keep exhibitors' semi trucks from snagging a cable and probably taking most of the Georgia World Congress Center roof with them down Route 85.
Keeping an eye on the network was a key consideration for the network planners. Millions of dollars could potentially be lost if a demonstration failed due to a troublesome network connection. The network was monitored using both Cabletron Spectrum and HP OpenView, along with the network management software that comes bundled with specific hardware devices, like Bay Networks Optivity. Aside from network management software, an entirely separate flat Ethernet network was run to each major node along a separate physical cable run, allowing for out-of-band network analysis.
Lots of computer geeks laugh at the thought of network appliances, saying that they would never use one, but the main NOC file server would change anyone's mind. Actually called "Network Appliance", this nifty blue box contained 55 GB of hard drive space and an operating system contained entirely within firmware. This allowed the NOC team to simply turn the computer off when needed. Compare that to the number of steps and amount of time involved in shutting down your typical UNIX file server.
Gigabit Ethernet reared its amazing head at this year's Networld+Interop. One of the gigabit switches used in the Networld backbone was 3Com's newest model, serial number 1, signed by the designing engineers. Gigabit Ethernet was one of two hot topics at this year's Networld, sharing center stage with Virtual Private Networks (VPNs).
Connecting the exhibit halls together was difficult, but not the end of the story. Nearby hotels were hosting some of the exhibitions, each requiring its own connection to the Networld backbone. Last year, the NOC team ran the "fiber run from Hell" under a bridge and over a couple of highways and railroad tracks. Fortunately, this run was still in place and could be used again, but the Westin hotel also needed a network connection. Enter the Canon Canobeam, which used lasers to run a full 100 Mbps FDDI link between the two buildings. A camera was used to keep the two line-of-sight units targeted on each other.
All this technology was amazing, but the truly amazing part about the NOC was its people. These are the people who coded the IP stacks for Hummingbird software and Windows 95, who have designed cable connectors for the Star Wars project, who serve on the Internet Engineering Task Force, and who are presidents of their own corporations. And the difference between geeks and the rest of the world? These hotshots are always looking for a few people who are willing to learn, so that they may teach. New members of the NOC team, and their coworkers, the Interopnet Team Members (ITMs), are constantly being sought. Volunteers earn no pay, but can absorb a wealth of experience and training from some of the best in the networking business.
Cutest chicks: Novell
5 Links to Make You Think