By Kate Murphy
WITHIN the last year there have been 16 so-called fiber cuts in the San Francisco Bay Area. According to the F.B.I., someone or some group has been going through manholes to sever fiber optic cables that supply telecommunications to large sections of the region, which is home to technology companies, academic institutions and Lawrence Livermore National Laboratory, overseer of the nation’s nuclear weapons.
Following each incident (usually occurring late at night and involving two or three separate fiber cuts), residents couldn’t make landline or mobile calls, not even to 911, or send texts or emails. Hospital records in some instances were inaccessible. Credit cards and A.T.M.s didn’t work. And forget about Googling, watching Netflix or remotely turning on a coffee maker. (For security reasons, Lawrence Livermore declined to say how the cuts affected its operations.)
When we talk about the Internet, we talk about clouds and ether. But the Internet is not amorphous. You may access it wirelessly, but ultimately you’re relying on a bunch of physical cables that are vulnerable to attack. It’s something that’s been largely forgotten in the lather over cybersecurity. The threat is not only malicious code flowing through the pipes but also, and perhaps more critically, the pipes themselves.
Most worrisome are the throughways and junctures that handle enormous amounts of Internet traffic. It would be as if a major interstate highway or crucial interchange were closed and all the traffic was forced onto side streets. There would be gridlock, and some of those side streets might collapse under the weight. Data transfer would slow significantly or come to a halt, as has happened in Northern California.
Surprisingly, there isn’t even a good map of the Internet’s highways and byways to clearly show locations that, if taken out, would severely hamper the system. “Everybody assumes somebody knows, but after a while you find out nobody actually knows,” said Paul Barford, a professor of computer science at the University of Wisconsin who has made it his mission to find out where the vulnerabilities are.
He recently completed a map of the United States’ long-haul Internet infrastructure — stretches that span at least 30 miles and connect population centers of at least 100,000 people. It took him four years of cajoling information from commercial broadband providers and collecting public records to come up with a reasonably reliable map. Notably, his research was partly funded by the Department of Homeland Security and can be accessed only by D.H.S.-approved researchers.
“What we’re trying to avoid is giving bad guys a map to do bad things,” Professor Barford said. “Now that we can see the possible pinch points in the U.S., we are looking at ways to mitigate them.”
Security experts and networking engineers said they were most concerned about where major networks converge. These are called Internet exchange points, or I.X.P.s, where networks come together like highway interchanges to trade traffic, which is known as peering.
There are about 80 I.X.P.s in the United States but only a handful, including ones in New York City, Miami, Los Angeles, Seattle and outside Washington, are vital interchanges for domestic as well as international traffic coming from undersea cables from abroad (which are also vulnerable to cuts by mislaid anchors or submarine sabotage).
Plugging into these major hubs are hundreds of Internet and mobile service providers, as well as content delivery networks such as Google, Apple, Amazon, Facebook and Microsoft. If these hubs were taken out by natural disaster (earthquake, hurricane) or a strategic attack, much of the United States, if not much of the world, would have hindered Internet access or none at all, depending on the severity and sophistication of the strikes.
“It’s crazy to see these unprotected buildings containing all this physical cabling that’s interconnecting continents as well as all of North America,” said John Savageau, an information and communications technology consultant who formerly managed I.X.P.s owned by the CoreSite Realty Corporation, a major player in the industry. “If one of these major nodes goes down, you’re going to have pain because customer performance will be seriously degraded, but if you have a coordinated attack on multiple locations, that’s a nightmare scenario.”
Indeed, many I.X.P.s are in old, unprotected buildings, some of them former telegraph offices. Often it’s possible to lease adjacent office space in the buildings. Sometimes there aren’t even security guards in the lobby. And the manholes around the buildings are also unprotected.
“I guess it’s a hide-in-plain-sight strategy,” said Jim Poole, vice president for global providers for Equinix, another company that owns I.X.P.s (some more protected than others). “I would hazard a guess that if an I.X.P. is not very secure, it is probably so obscure no one would know it was there.”
But there are websites that list most of them, as well as which networks — telecoms, content providers, municipal governments and academic institutions — traffic data through them.
Unlike data centers, I.X.P.s have no recognized standards for construction or maintenance. The Department of Homeland Security, which is responsible for critical infrastructure, has no requirements for the physical protection of I.X.P.s, nor does it have any rules against ownership by companies affiliated with a hostile foreign state — despite the possibility that traffic running through an exchange could be purposely choked or listened in on.
“The snooping issue is why you don’t necessarily want a government agency monitoring I.X.P.s,” said Larry Ponemon, chairman of the Ponemon Institute, a research organization that focuses on cybersecurity. “It’s a big problem because I.X.P.s are crucial to the efficient operation of the Internet but most of them are privately held with very few controls.”
For Bill Woodcock, executive director of Packet Clearing House, a nonprofit research institute dedicated to supporting Internet traffic exchange technology, the solution is not to create regulations that would make it more expensive to build I.X.P.s, but to build more of them so none are critical.
“If you create redundancy, it doesn’t matter if it’s in a mop closet,” he said, referring to one heavily trafficked I.X.P. in a former janitor’s closet on an upper floor of an old building. He added that the situation was far worse in Europe, where some countries rely primarily on a single I.X.P. — albeit one more fortified than a mop closet.
Professor Barford at the University of Wisconsin agrees that abundant routes and exchanges are crucial to the Internet’s resilience. But he said the Federal Communications Commission’s recent net neutrality decision might paradoxically make that less likely, if it’s interpreted to mean networks must share infrastructure as well as offer equal access to bandwidth. This would result in the cost-saving practice of threading new fiber optic cables through the same conduits and running them up the same poles as existing lines. Google has argued for that right in communications with the F.C.C.
The trouble is that cables running in a single licorice-like twist are easier to disable in one quick cut. It’s a reason businesses are turning to companies like Zayo, Allied Fiber and Integra that build alternative dark fiber networks that customers can “light” (i.e., shoot data through using laser pulses, akin to Morse code) to diversify their routing. Think of dark fiber networks as private-access toll roads you can jump onto to avoid traffic jams.
“The only way to solve this problem is to create a more robust network so you don’t have these single points of failure,” said Hunter Newby, founder and chief executive of Allied Fiber. He added that nothing is foolproof, however, no matter how many redundancies there are: “I always remind people that planet Earth is a single point of failure. Just ask the dinosaurs.”
Information from www.nytimes.com