[Humanist] 27.654 puppets in a wired world?

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Sat Dec 28 12:24:09 CET 2013

                 Humanist Discussion Group, Vol. 27, No. 654.
            Department of Digital Humanities, King's College London
                Submit to: humanist at lists.digitalhumanities.org

        Date: Sat, 28 Dec 2013 11:06:24 +0000
        From: Willard McCarty <willard.mccarty at mccarty.org.uk>
        Subject: puppets in a wired world

Many here will, I expect, be interested in an article from the New York 
Review of Books, 7 November 2013, that I put away then to read later, 
which turned out to be now. It is Sue Halpern's "Are we puppets in a 
wired world?", still available online. 
Allow me to quote from a few paragraphs at the end, where Halpern's 
attention turns to the ambitions of DARPA in "its quest for an algorithm 
that will sift through all manner of seemingly disconnected Internet 
data to smoke out future political unrest and acts of terror". She 
quotes from Viktor Mayer-Schönberger and Kenneth Cukier's Big Data: A 
Revolution That Will Transform How We Live, Work, and Think. To anyone 
who has read the promotional journalism and fellow-traveller academic 
literature from the 1960s onward, the future tense of Mayer-Schönberger and 
Cukier's title, plus the words "revolution" and "transform", already 
tell the tale. But in case you haven't, here's Halpern's quotation:

> In the future—and sooner than we may think—many aspects of our world
> will be augmented or replaced by computer systems that today are the
> sole purview of human judgment…perhaps even identifying “criminals”
> before one actually commits a crime.

She comments,

> The assumption that decisions made by machines that have assessed
> reams of real-world information are more accurate than those made by
> people, with their foibles and prejudices, may be correct generally
> and wrong in the particular; and for those unfortunate souls who
> might never commit another crime even if the algorithm says they
> will, there is little recourse. In any case, computers are not
> “neutral”; algorithms reflect the biases of their creators, which is
> to say that prediction cedes an awful lot of power to the algorithm
> creators, who are human after all....
> But the real bias inherent in algorithms is that they are, by nature,
> reductive. They are intended to sift through complicated, seemingly
> discrete information and make some sort of sense of it, which is the
> definition of reductive. But it goes further: the infiltration of
> algorithms into everyday life has brought us to a place where metrics
> tend to rule. This is true for education, medicine, finance,
> retailing, employment, and the creative arts. There are websites that
> will analyze new songs to determine if they have the right stuff to
> be hits, the right stuff being the kinds of riffs and bridges found
> in previous hit songs.

So, we are not the only ones to be suffering under the whip of impact. 
Let the boosters of big data take note, or more than note; let them 
acquire the ability to think and act critically. 

But I give the last admonitory words to Halpern:

> There is so much that has been good—which is to say useful,
> entertaining, inspiring, informative, lucrative, fun—about the
> evolution of the World Wide Web that questions about equity and
> inequality may seem to be beside the point.... But while we were
> having fun, we happily and willingly helped to create the greatest
> surveillance system ever imagined, a web whose strings give
> governments and businesses countless threads to pull, which makes
> us…puppets. The free flow of information over the Internet (except in
> places where that flow is blocked), which serves us well, may serve
> others better.


Willard McCarty (www.mccarty.org.uk/), Professor, Department of Digital
Humanities, King's College London, and Research Group in Digital
Humanities, University of Western Sydney
