The Internet remains resilient, and its underlying protocols and technologies dominate global networks, but its relevance may be challenged by the increasing amount of traffic carried over private networks run by big tech companies, and by rules imposed by governments.
So says an Internet Technical Success Factors study commissioned by APNIC and LACNIC – the regional Internet address registries for the Asia-Pacific and the Latin America and Caribbean regions, respectively – and written by consulting firm Analysys Mason.
Presented at the 2021 Internet Governance Forum (IGF) on Wednesday, the study identifies four reasons why the Internet has been successful:
- Scalability supporting the growth of the Internet;
- Flexibility in network technologies;
- Adaptability to new applications;
- Resilience to shocks and changes.
The study also argues that the early Internet designers incorporated three essential guiding ideals: openness, simplicity and decentralization. These ideals were applied according to three design principles: layering, the creation of a network of networks, and the end-to-end principle that intelligence is placed at the edge of the network rather than at the core.
The end-to-end principle is important because it means applications can be installed on connected devices without requiring any changes to the networks in between.
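The idea can be illustrated with a minimal sketch, assuming a toy "network" that only forwards opaque bytes (the function names here are illustrative, not from the study). All the intelligence – in this case, an integrity check – lives in the endpoints, so a new application can be deployed without touching the forwarding core:

```python
import hashlib

def network_forward(packet: bytes) -> bytes:
    """The dumb core: moves bytes without inspecting or changing them."""
    return packet

def endpoint_send(message: bytes) -> bytes:
    """Sender endpoint: appends an integrity digest the network knows nothing about."""
    return message + hashlib.sha256(message).digest()

def endpoint_receive(packet: bytes) -> bytes:
    """Receiver endpoint: verifies integrity with no help from the network."""
    message, digest = packet[:-32], packet[-32:]
    if hashlib.sha256(message).digest() != digest:
        raise ValueError("message corrupted in transit")
    return message

# A new "application" (integrity-checked messaging) is deployed by changing
# only the endpoints; network_forward stays exactly as it was.
received = endpoint_receive(network_forward(endpoint_send(b"hello")))
```

Upgrading the endpoints to, say, encrypt the payload would likewise leave `network_forward` untouched – which is the point of placing intelligence at the edge.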
Much of the study fondly recalls how the aforementioned elements enabled decades of useful innovation.
The document also identifies risks. A section on technical challenges to the Internet's success points out that the architecture has security weaknesses, and that the technologies designed to address them are not widely adopted.
“Although DNSSEC and BGP security extensions are important steps towards securing the Internet infrastructure, significant efforts will still be needed before these protocols are widely deployed and used,” the study says.
The absence of an appropriate quality of service (QoS) standard is also criticized, as it has created “concerns … that the best-effort model will not be sufficient to meet the needs of emerging cross-domain applications such as augmented/virtual reality or interactive gaming.”
Imposing a quality of service standard would threaten the network-of-networks principle, the study says, adding that any attempt to change Internet protocols would likely be rejected, if only because the world has invested so much effort in current networks.
But the study identifies some players who may decide to go their own way: “social media companies, video streaming companies, CDNs and cloud companies.”
The document states that “a significant fraction of global IP traffic now consists of data that is moved between data centers and the edge networks of large Internet companies.”
The needs of these companies and the growth of their networks lead the analysts to suggest that “over time we may see the Internet transform into a more centralized system with a few global private networks carrying most of the content and services.”
“In this scenario, what remains outside these private networks are primarily ISP networks that move traffic to and from end users, and the user experience would be shaped by a user’s proximity to the private network of the relevant Internet company.”
The study also suggests Big Tech could develop the protocols it needs and, in doing so, take resources away from work on open Internet protocols. While such work should be interoperable with the Internet at large, and therefore preserve the network-of-networks principle – the document cites the development of QUIC, an alternative to TCP, as an example of a successful private technology push – it also suggests “increased centralization could blur the distinction between networks and applications, as expressed by the layering principle.”
Another risk is that when private networks go down, many users suffer. Exhibit A: yesterday’s AWS outage, which hurt Netflix and Disney+, among others.
The study also identifies governance issues as an emerging risk, especially when countries seek to impose their own demands on the Internet.
“A development where governments gain more control over the development of the Internet may involve the risk of a more fragmented system, without the common address space and global accessibility that we have today,” the study says.