[Humanist] 29.549 faking it or revealing it?
Humanist Discussion Group
willard.mccarty at mccarty.org.uk
Mon Dec 14 08:46:29 CET 2015
Humanist Discussion Group, Vol. 29, No. 549.
Department of Digital Humanities, King's College London
Submit to: humanist at lists.digitalhumanities.org
Date: Mon, 14 Dec 2015 07:30:22 +0000
From: Willard McCarty <willard.mccarty at mccarty.org.uk>
Subject: faking it or revealing it?
In the mid-1990s, in his book Claiming the Real: The Griersonian Documentary
and its Legitimations (British Film Institute), Brian Winston could write
about photography and cinema that "Absolute undetectability, for the first
time, is undermining the mimetic power of all photographic processes" (p.
6). Even then, twenty years ago, simulation in many of the sciences had long
been the means by which research was done and by which it succeeded, e.g. in
the making of the thermonuclear bomb. As Peter Galison wrote, bit by bit,
byte by byte, the computer became the nature that we study.
This process has a very long history (or pre-history, if you wish) in
analogical reasoning, which has come to our aid since before philosophy
began, whenever our minds reach for something, such as the stars or
the origins of the cosmos, that our empirical tools cannot. Because it is a
reach of mind, a kind of disciplined imagination, unease, even crises of
rationality, have followed analogy wherever it has surfaced. In recent times
archaeologists, for example, have been very uneasy indeed over the
uncertainty of analogical reasoning, or because of "the paralyzing demand
for certainty", as Alison Wylie insists in her magnificent book, Thinking
from Things: Essays in the Philosophy of Archaeology (2002), especially in
her essay "The Reaction Against Analogy". Consider that archaeology
works with material objects but attempts with them analogically to
understand how they were used by people long dead in cultures long gone.
What a leap! As one despairing archaeologist wrote in the 1950s, "the more
human, the less intelligible".
And now the literary folk. Work with big textual data is often clothed in
the rhetoric of revelation. Once we were lost, thinking that a handful (ok,
a generous handful) of canonical 19C novels, read closely, could lead us to
an understanding of the novel. Now we have found that with
collections of literary Big Data we can abandon the foolishness of close
reading to see in numbers and charts how the rejected and despised majority
of novels give us a truer picture. Once again analogical reasoning, through
statistical modelling, runs throughout. It's not quite that we
are making it all up, and certainly not that we are wasting our time (unless
we fail to be critical about the processes and results) but neither are we
seeing at last the one true picture. Perhaps we need to stress again, to
ourselves as well as to our students, that "evidence" is what argument does
to data to make it meaningful?
Willard McCarty (www.mccarty.org.uk/), Professor, Department of Digital
Humanities, King's College London, and Digital Humanities Research
Group, University of Western Sydney