[Humanist] 25.897 battle for the Internet

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Wed Apr 18 06:53:18 CEST 2012


                 Humanist Discussion Group, Vol. 25, No. 897.
            Department of Digital Humanities, King's College London
                       www.digitalhumanities.org/humanist
                Submit to: humanist at lists.digitalhumanities.org

  [1]   From:    Norman Gray <norman at astro.gla.ac.uk>                      (20)
        Subject: Re: [Humanist] 25.894 battle for the Internet?

  [2]   From:    Gary W Shawver <gs74 at nyu.edu>                             (67)
        Subject: Re: [Humanist] 25.894 battle for the Internet?

  [3]   From:    Michael Burden <mburden at ualberta.ca>                      (64)
        Subject: Re: [Humanist] 25.894 battle for the Internet?


--[1]------------------------------------------------------------------------
        Date: Tue, 17 Apr 2012 10:38:15 +0100
        From: Norman Gray <norman at astro.gla.ac.uk>
        Subject: Re: [Humanist] 25.894 battle for the Internet?
        In-Reply-To: <20120417044011.664AD2787B7 at woodward.joyent.us>


On 2012 Apr 17, at 05:40, Humanist Discussion Group wrote:

> Many here will, I suspect, be interested in the (British) Guardian's 
> investigation into the several attempts to control the Internet: "Battle 
> for the internet", at 
> http://www.guardian.co.uk/technology/series/battle-for-the-internet.
> 
> Does anyone here understand how control of the Internet, as is 
> practiced or planned, is done? I was under the impression that the
> decentralised design made such control difficult to impossible. Clearly
> it isn't.

Technically, control of the internet would be challenging because of its decentralised nature, as you note.  A packet going from me to thee can take multiple routes, and which route it actually takes depends on routing decisions taken at each hop along the way: "the route I'd normally send you is slow or broken, so go over... there, and ask them to send you on your way".
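
To make the hop-by-hop idea concrete, here is a toy Python sketch (the router names and links are invented, not any real topology): each node consults only its own table, and falls back to an alternative next hop when its preferred link is slow or broken.

    # Toy illustration of hop-by-hop forwarding: each router knows only
    # its own table of next hops. All names and links are invented.
    routing_tables = {
        "me":    {"preferred": "hop-a", "backup": "hop-b"},
        "hop-a": {"preferred": "thee",  "backup": "hop-b"},
        "hop-b": {"preferred": "thee",  "backup": None},
    }

    down_links = {("me", "hop-a")}   # suppose this link is "slow or broken"

    def forward(src, dest="thee"):
        """Trace the route a packet takes, one local decision at a time."""
        here, path = src, [src]
        while here != dest:
            table = routing_tables[here]
            nxt = table["preferred"]
            if (here, nxt) in down_links:     # local decision: reroute
                nxt = table["backup"]
            if nxt is None:
                raise RuntimeError("no route from " + here)
            path.append(nxt)
            here = nxt
        return path

    print(forward("me"))   # ['me', 'hop-b', 'thee']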

It's at those switches that (one layer of) control would happen.  In a highly interlinked network such as, for example, the intra-UK network, you'd have to control all of these switches in order to control the network -- that's infeasible.  But there are fewer international links, and the endpoints of those land in fewer still physical locations (see LINX: http://en.wikipedia.org/wiki/London_Internet_Exchange for an example), so control of a nation's external traffic need only be exerted at those few points.  This is the sort of place -- a chokepoint for a large network's communications with its internetwork links -- where the Great Firewall of China would physically be.

The sorts of controls working at this layer are primarily regulatory.  Current proposals (resuscitated yet again by the current government from those of previous governments) are concerned with things like requiring the operators of 'backbone' ISPs to keep various categories of traffic logs, and to install various sorts of 'tap' giving access to the security services.  If you're interested, I suspect you'd find more at the excellent http://www.openrightsgroup.org/

The stuff being described in the Guardian series is broader than this, and at higher levels: walled gardens, the way that the film and music industries are hacking IP law to arrogate to themselves more and more control over networks, and legislative innovations in things like updated defamation laws.  These aren't really _internet_ controls, but rather social ones, and they tend to be centralised.  That centralisation takes the form of national or multinational legislative innovation, walled gardens such as the amazingly retro Facebook, or dominant service companies such as Google (companies such as Apple or Amazon are commercially dominant, but I don't think they have the interest in social dominance, or the power to shape perceptions, that search companies like Google have).

Parenthetically, it occurs to me that such dominance might be, if not inevitable, at least more likely than hitherto.  The intangibility of an internet business, together with the weird collapsing of time, means that it's much, much easier for an online business to grow to planet scale than it would be for a bricks-and-mortar one.

Best wishes,

Norman

-- 
Norman Gray  :  http://nxg.me.uk
SUPA School of Physics and Astronomy, University of Glasgow, UK



--[2]------------------------------------------------------------------------
        Date: Tue, 17 Apr 2012 14:12:07 -0400
        From: Gary W Shawver <gs74 at nyu.edu>
        Subject: Re: [Humanist] 25.894 battle for the Internet?
        In-Reply-To: <20120417044011.664AD2787B7 at woodward.joyent.us>


Hi Willard,

Technically, it would not be that difficult given control of upper-level
ISPs (Internet Service Providers) and domain name servers. A user in China,
for example, might request a URL whose hostname the local DNS servers
either resolve to a government-approved host or have no entry for at all,
in which case the user gets a "server not found" error. All the while,
local ISPs would be doing real-time analysis of IP packets for forbidden or
suspicious words, URLs, etc., flagging matches for inspection by the local
authorities.
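
A minimal Python sketch of those two mechanisms, with invented domains, addresses and keywords standing in for whatever a real blocklist would contain:

    # Sketch of the two controls described above: DNS answers rewritten or
    # withheld, and payloads scanned for forbidden terms. All domains,
    # addresses and keywords here are invented placeholders.
    APPROVED_HOSTS = {"news.example.cn": "203.0.113.10"}      # resolves normally
    REDIRECTED     = {"foreign-news.example": "203.0.113.99"} # sent to an approved stand-in
    # anything else: pretend we have no entry at all

    def resolve(hostname):
        if hostname in APPROVED_HOSTS:
            return APPROVED_HOSTS[hostname]
        if hostname in REDIRECTED:
            return REDIRECTED[hostname]   # government-approved stand-in
        return None                       # user sees "server not found"

    FORBIDDEN = {"forbidden-word", "banned-topic"}

    def inspect(packet_payload):
        """Return the forbidden terms found, so the packet can be flagged."""
        return [w for w in FORBIDDEN if w in packet_payload.lower()]

    print(resolve("unlisted-site.example"))        # None -> "server not found"
    print(inspect("a post about a banned-topic"))  # ['banned-topic']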

Legally, making the most lax interpretation of libel an
internationally-enforceable norm would effectively shut down speech on the
Internet since even if bloggers etc. could not be sued in their own
countries, they might risk prosecution in other jurisdictions.

I am oversimplifying things quite a bit, but control would be along
these lines.

Best,

Gary


-- 
Gary Shawver, Ph.D.
Manager
Teaching and Learning Services
Information Technology Services
New York University
212-998-3072
shawver at nyu.edu



--[3]------------------------------------------------------------------------
        Date: Tue, 17 Apr 2012 16:15:41 -0600
        From: Michael Burden <mburden at ualberta.ca>
        Subject: Re: [Humanist] 25.894 battle for the Internet?
        In-Reply-To: <20120417044011.664AD2787B7 at woodward.joyent.us>


I think the chief issue is that most internet data relies on infrastructure
owned by interested parties, or parties subject to traditional legal
measures, and the internet's decentralized protocols were intended for
functional continuity in the face of damage rather than for individual
freedom. Some examples:

Network providers can 'sniff' traffic and inspect or drop certain data
(e.g. a country could block all Skype traffic, or an ISP could slow traffic
from competing websites). Encryption is a partial answer, but encrypted
traffic is still subject to some sniffing (e.g. encrypted Skype looks
different from encrypted BitTorrent).
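
As a rough illustration in Python (the profiles and thresholds below are invented), even without decrypting anything, a classifier can guess the application from the traffic's shape alone:

    # Crude sketch of flow fingerprinting: packet sizes and rates
    # distinguish kinds of encrypted traffic without reading the payload.
    # Thresholds and example flows are invented for illustration.
    from statistics import mean

    def classify(packet_sizes, packets_per_second):
        """Guess the application from traffic shape alone."""
        avg = mean(packet_sizes)
        if avg < 300 and packets_per_second > 20:
            return "looks like VoIP (small, frequent packets)"
        if avg > 1000:
            return "looks like bulk transfer (large packets)"
        return "unclassified"

    # A steady stream of small packets vs. a stream of near-full-size ones
    print(classify([120, 140, 110, 130], packets_per_second=50))  # VoIP-like
    print(classify([1400, 1450, 1380], packets_per_second=8))     # bulk-transfer-like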

Data delivery depends on the domain name system and on inter-network
routing, both of which have faults (for instance, a routing incident
reportedly sent some 15% of internet traffic through China for a brief
period a few years ago, and various techniques exist for spoofing a
domain name).

Domain names are assigned by entities that fall under the laws of their
country, and the courts there can order domain names to be seized (and then
removed from the DNS), which makes the websites essentially invisible.

Cloud-stored data resides on physical servers that also fall under their
local laws (as in the case of the MegaUpload seizure in New Zealand).

There are other pressure points too, since almost all websites have
connections outside the internet. The credit card companies are subject to
law, so removing a website's financial income is as simple as ordering the
card companies to refuse to handle that website's transactions (as happened
to WikiLeaks). Bitcoin is one suggested alternative, though it has its own
vulnerabilities.

Or web organisations can face regulatory requirements. One aspect of the
proposed U.S. SOPA law would have required websites to police all their
content proactively, even links posted by the anonymous crowd (making
Wikipedia impossible).

There are many other examples, but I hope that these few provide some
useful indications (and I hope the heavily condensed explanations aren't
misleading).

-Michael


