The “Robust Yet Fragile” Nature of the Internet (PDF) by John C. Doyle, David Alderson, Lun Li, Steven Low, Matthew Roughan, Stanislav Shalunov, Reiko Tanaka, and Walter Willinger:
The search for unifying properties of complex networks is popular, challenging, and important. For modeling approaches that focus on robustness and fragility as unifying concepts, the Internet is an especially attractive case study, mainly because its applications are ubiquitous and pervasive, and widely available expositions exist at every level of detail. Nevertheless, alternative approaches to modeling the Internet often make extremely different assumptions and derive opposite conclusions about fundamental properties of one and the same system. Fortunately, a detailed understanding of Internet technology combined with a unique ability to measure the network means that these differences can be thoroughly understood and unambiguously resolved. This paper aims to make recent results of this process accessible beyond Internet specialists to the broader scientific community, and to clarify several sources of basic methodological differences that are relevant beyond either the Internet or the two specific approaches focused on here; i.e., scale-free networks and highly optimized tolerance networks.
The paper concludes that the Internet is not nearly as vulnerable to targeted attacks on its major hubs as the scale-free-network literature often claims.