[Humanist] 26.650 backronyms and synapses

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Fri Jan 4 10:00:40 CET 2013

                 Humanist Discussion Group, Vol. 26, No. 650.
            Department of Digital Humanities, King's College London
                Submit to: humanist at lists.digitalhumanities.org

        Date: Fri, 04 Jan 2013 08:51:09 +0000
        From: Willard McCarty <willard.mccarty at mccarty.org.uk>
        Subject: backronyms and synapses

This morning, reflecting on today's lineup for Humanist, I thought to 
check on the progress of the IBM/DARPA project, SyNAPSE, i.e. Systems of 
Neuromorphic Adaptive Plastic Scalable Electronics. I had not previously 
known that there was a term for the backward formation of acronyms 
(which, one suspects, is how many acronyms are coined), but there is -- 
"backronym", which even has a Wikipedia entry.

All this would have passed unremarked here were it not for the 
progress that SyNAPSE has in fact made since I last mentioned it on 
Humanist some time ago. See 
http://www.artificialbrains.com/darpa-synapse-program. Note the ultimate 
aim of the project,

> to build an electronic microprocessor system that matches a mammalian
> brain in function, size, and power consumption. It should recreate 10
> billion neurons, 100 trillion synapses, consume one kilowatt (same as
> a small electric heater), and occupy less than two liters of space.

It is illuminating to consider these engineering requirements in light 
of John von Neumann's detailed comparison of natural and artificial 
automata in The Computer and the Brain (1958) as to speed, energy 
requirements and size. Note that the size differential (which von 
Neumann regarded as crucial) has gone from 1/10**8 in favour of the 
natural system, which was then that much smaller than its artificial 
counterpart, to a factor of less than 1/2, the human brain being ca. 
"of the order of magnitude of a liter... i.e. of 10**3 cm^3" (p. 48) 
against SyNAPSE's projected two liters.

In "The NORC and problems of high-speed computing" (on the occasion of 
the first public showing of IBM's Naval Ordnance Research Calculator in 
1954), von Neumann wrote,

> In planning new computing machines, in fact, in planning anything
> new... it is customary and very proper to consider what the demand
> is, what the price is, whether it will be more profitable to do it in
> a bold way or in a cautious way, and so on. This type of
> consideration is certainly necessary. Things would very quickly go to
> pieces if these rules were not observed in ninety-nine cases out of a
> hundred.
> It is very important, however, that there should be one case in a
> hundred where it is done differently... to write specifications
> simply calling for the most advanced machine which is possible in the
> present state of the art. I hope that this will be done again soon
> and that it will never be forgotten.

It seems that von Neumann's hope wasn't in vain. But in light of such 
developments and especially the emergent desire that drives them, what 
is our role?


Willard McCarty, FRAI / Professor of Humanities Computing & Director of
the Doctoral Programme, Department of Digital Humanities, King's College
London; Professor, School of Humanities and Communication Arts,
University of Western Sydney; Editor, Interdisciplinary Science Reviews
(www.isr-journal.org); Editor, Humanist
(www.digitalhumanities.org/humanist/); www.mccarty.org.uk/
