[Humanist] 31.204 'computational' &c

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Wed Jul 26 10:13:17 CEST 2017


                 Humanist Discussion Group, Vol. 31, No. 204.
            Department of Digital Humanities, King's College London
                       www.digitalhumanities.org/humanist
                Submit to: humanist at lists.digitalhumanities.org



        Date: Tue, 25 Jul 2017 12:26:51 -0400
        From: William L. Benzon <bbenzon at mindspring.com>
        Subject: Re:  31.200 'computational' &c
        In-Reply-To: <20170725062153.807546B7E at digitalhumanities.org>


A response to Willard McCarty:

I have just noticed one of the peculiar aspects of this conversation, the implicit assumption that computation is equivalent to DIGITAL computation. Where did that assumption come from? Why does it seem so, well, natural?

When I first started reading about computers, sometime in the mid-1960s, introductory accounts for the general reader started by distinguishing between ANALOG and DIGITAL computation and went on to discuss both. The first computational device I can remember using extensively was a slide rule, which is analog, though I may have used what was then called an adding machine at one time or another. It was sometime in the 1980s, I believe, that I read an intro-to-computers article that didn’t mention analog computing. It must have been in one of those magazines directed at users and owners of personal computers. And that, perhaps, is why analog computation was dropped from discussion. Readers of the article either owned or had access to a personal computer, and that computer would have been digital, and so that’s what the article explained. In any event, analog computation seems rather distant from most, if not all, treatments of computation these days.

FWIW, I’ve just done a Google Ngram search on the phrases “analog computer” and “digital computer” between 1900 and 2000:

<https://books.google.com/ngrams/graph?content=analog+computer,+digital+computer,&year_start=1900&year_end=2000&corpus=15&smoothing=2&share=&direct_url=t1;,analog%20computer;,c0;.t1;,digital%20computer;,c0>

They appear more or less together in the early 1940s and track together until the mid-1950s, at which point “digital computer” begins to outstrip “analog computer”. Both continue upward until the late 1960s, when they start falling, with “digital” always more prominent than “analog.”

If you just search on “analog” and “digital” (1900-2000)

<https://books.google.com/ngrams/graph?content=analog,digital&year_start=1900&year_end=2000&corpus=15&smoothing=2&share=&direct_url=t1;,analog;,c0;.t1;,digital;,c0>

you find that both terms float close to the X-axis until about 1950. At that point both terms increase in frequency through 2000 with “digital” far outstripping “analog”.
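
If anyone wants to pull those curves programmatically rather than through the web interface, here is a minimal sketch in Python. It assumes Google’s unofficial JSON endpoint behind the Ngram Viewer (books.google.com/ngrams/json), which is undocumented and may change; the parameters simply mirror the links above.

    # Minimal sketch: fetch relative frequencies for "analog computer" and
    # "digital computer" from the (unofficial, undocumented) Ngram JSON
    # endpoint and report when "digital computer" first pulls ahead.
    import requests

    params = {
        "content": "analog computer,digital computer",
        "year_start": 1900,
        "year_end": 2000,
        "corpus": 15,      # same English corpus id as in the links above
        "smoothing": 2,
    }
    resp = requests.get("https://books.google.com/ngrams/json", params=params)
    resp.raise_for_status()

    # Response is a list of {"ngram": ..., "timeseries": [...]} entries.
    series = {item["ngram"]: item["timeseries"] for item in resp.json()}
    years = range(1900, 2001)

    for year, analog, digital in zip(years,
                                     series["analog computer"],
                                     series["digital computer"]):
        if digital > analog > 0:
            print(f"'digital computer' first exceeds 'analog computer' around {year}")
            break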

The last thing von Neumann wrote was the posthumously published The Computer and the Brain (1958). Its topic, I believe, is how to implement computation in a physical device, and von Neumann developed it through the contrast between analog and digital computation. As Terrence Sejnowski put it [1]:

> The all-or-none nature of the action potential had suggested
> analogies with binary gates in digital computers (McCulloch and Pitts
> 1943), but the analog nature of neural integration was just beginning
> to be fully appreciated. Typically, the accuracy of numerical
> calculation in a modern digital computer is 8 to 16 significant
> figures. But in a neuron, signaling by means of the average firing
> rate has at best one or two significant figures of accuracy. [...]
> Von Neumann recognized that the reliance of the brain on
> analog-signal processing had far-reaching significance for the style
> of computation that the brain was capable of supporting. He pointed
> out that the logical depth of a calculation, for example, can be very
> great for a digital computer that retains high accuracy at each step
> in the calculation; but for an analog system like the brain, the
> compounding of errors causes severe problems after only a few steps.
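
A toy numerical illustration of that last point about logical depth and compounding error (my own construction, with made-up per-step precisions, not anything from von Neumann or Sejnowski): repeat a simple calculation many times, injecting per-step noise at roughly “digital” (about 8 significant figures) and “analog” (about 2 significant figures) precision, and watch how the relative error grows with depth.

    # Toy illustration (not von Neumann's model): repeatedly apply a simple
    # operation, perturbing the result each step by noise scaled to the
    # signalling precision. "Digital" here means ~8 significant figures of
    # per-step error; "analog" means ~2, in the spirit of a firing-rate code.
    import random

    def run_chain(steps, relative_error_per_step, seed=0):
        """Multiply by 1.01 each step, perturbing the result by a small
        relative error; return the final relative deviation from the
        exact value 1.01**steps."""
        random.seed(seed)
        x = 1.0
        for _ in range(steps):
            x *= 1.01
            x *= 1.0 + random.uniform(-relative_error_per_step,
                                      relative_error_per_step)
        exact = 1.01 ** steps
        return abs(x - exact) / exact

    for steps in (5, 50, 500):
        digital = run_chain(steps, 1e-8)   # ~8 significant figures per step
        analog = run_chain(steps, 1e-2)    # ~2 significant figures per step
        print(f"{steps:4d} steps  digital error ~{digital:.1e}  analog error ~{analog:.1e}")

The digital chain stays accurate to many figures however deep the calculation goes; the analog chain drifts by a few percent after only a handful of steps and keeps getting worse, which is the situation of the brain in von Neumann’s argument.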

Sejnowski concludes his review by quoting von Neumann’s final line:

> However, the above remarks about reliability and logical and
> arithmetical depth prove that whatever the system is, it cannot fail
> to differ considerably from what we consciously and explicitly
> consider mathematics.

And we are still there today, six decades later.

But if digital computation tends to dominate the discussion, analog has by no means been forgotten, and it has certainly influenced my thinking. Early in my career David Hays introduced me to a book by William Powers, Behavior: The Control of Perception (1973), which developed a sophisticated analog account of the mind. Somewhat later, when I was working on my book about music (Beethoven’s Anvil, 2001 [2]), I was in close communication with the late Walter Freeman [3], a neuroscientist at Berkeley. Freeman’s program consisted of 1) experimental observation, 2) mathematical analysis, and 3) computer simulation. He was interested in using complex dynamics to understand the brain at the level of neural nets and was skeptical of digital computation as a model for thought. Finally, I would mention the work of Carver Mead, a Caltech scientist and engineer who developed the concept of neuromorphic engineering [4], the use of analog circuits in VLSI chips to mimic neurobiological systems. [I make no use of this work myself, but I mention it to make the point that analog concepts remain alive and well. Indeed, essential.]

As for Willard’s question “Why 'machine' and not some other term?”, what other term do we have? Of course it’s not so much the term that matters as the reservoir of concepts on which we can draw to make our models. And it stretches the term rather far to think of these complex electronic devices as being of the same species as bicycles, clocks, printing presses, and automobiles. They’re very different creatures. Is it a mistake to think of them as machines?

To some extent these are questions of mere semantics. We shouldn’t let them get in the way of the substantive matters at stake, which are as deep as any before us.

Yours,

BB 

[1] Terrence Sejnowski, “The Computer and the Brain Revisited,” Annals of the History of Computing, Vol. 11, No. 3, 1989, pp. 197-201.

[2] William Benzon, Beethoven’s Anvil: Music in Mind and Culture, Basic Books, 2001. Final drafts of the second and third chapters are available online at https://www.academia.edu/232642/Beethovens_Anvil_Music_in_Mind_and_Culture
[3] Here’s a short biography of Freeman: https://neural.memberclicks.net/assets/docs/wjf_obituary2.pdf

You can find a number of his papers online at ResearchGate: https://www.researchgate.net/profile/Walter_Freeman2

[4] Wikipedia article on Mead: https://en.wikipedia.org/wiki/Carver_Mead

Wikipedia article on neuromorphic engineering: https://en.wikipedia.org/wiki/Neuromorphic_engineering

Bill Benzon
bbenzon at mindspring.com

646-599-3232

http://new-savanna.blogspot.com/
http://www.facebook.com/bill.benzon
http://www.flickr.com/photos/stc4blues/
https://independent.academia.edu/BillBenzon
http://www.bergenarches.com/#image1




