My Remarks about Future Internet Proposals

I participate in the Future Internet Architecture (FIA) research working group created by the European Commission in May 2010. The main objective of FIA is to

define a common set of architectural design principles and a common reference architecture that can guide and unify key technologies developments

Via that group (and also through various workshops) I get exposed to different Future Internet architecture ideas. Recently, I ran into the RINA architecture proposal and looked at the slides titled “Networking is IPC: A Guiding Principle to a Better Internet”.

There, the authors complain about the Internet being a “fundamentally broken architecture” and argue it is a bunch of hacks (with “no or little science”).

I was curious about what exactly is broken. Have a look at their analysis on slides 5, 6, and 7: slide 6 talks about addressing and routing, and slide 7 about ad hoc scalability and security.

I wish the authors could provide a bit more background. I fully understand that they want to create something new without spending too much time on the past.

Anyway, I thought I should read through their paper, hoping it would provide more insight. Here is the paper.

Here is the analysis of today’s problems from their paper:

Today, the pure form of the Internet’s best-effort delivery model has not been able to effectively respond to new requirements (e.g., security, quality-of-service, wireless, mobility). Many individual networks in the Internet today represent commercial entities—Internet Service Providers (ISPs). An ISP may be willing to provide better than best-effort service to its customers or its peers for a price or to meet a Service Level Agreement (SLA). The lack of a structured view of how this could be accomplished has led to ad hoc solutions and so-called “layer violations” where in-network elements (e.g., routers, proxies, middleboxes) deeply inspect passing datagrams so as to perform application- or transport-specific processing.

Many papers follow the same pattern: no analysis of today’s deployment problems, immediately followed by yet another solution proposal.
[Note: In discussions I was told that I have to buy the book of the authors, which contains the analysis.]

Unfortunately, it does not seem to be beneficial for the computer science publication industry to actually explore a problem in a scientific way anymore. It would take a while to figure out what the problems really are, and the conclusions might reveal that the new ideas don’t solve them either.

Hence, here is a call from my side to funding agencies to support activities that try to shed some light on real-world problems. I want those problems to be documented, and I don’t want to hear about the authors’ solutions right away.

Interestingly, some of these publications exist (although not too many of them, IMHO). For example, take a look at “An Experimental Study of Home Gateway Characteristics” or “An Untold Story of Middleboxes in Cellular Networks”.

[Note: I am not against future Internet architecture ideas. I am always excited about new ideas. I just want the authors to be honest. If they simply don’t care about prior work or deployment problems, they should say so.]
