[Humanist] 23.276 an ethical problem

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Tue Sep 8 07:22:52 CEST 2009

                 Humanist Discussion Group, Vol. 23, No. 276.
         Centre for Computing in the Humanities, King's College London
                Submit to: humanist at lists.digitalhumanities.org

        Date: Mon, 07 Sep 2009 13:55:17 +0100
        From: Willard McCarty <willard.mccarty at mccarty.org.uk>
        Subject: an ethical problem

Allow me to pass on to you, below, quite a long quotation from the end 
of the Introduction to Norbert Wiener's Cybernetics, or Control and 
Communication in the Animal and the Machine (Cambridge MA: The 
Technology Press, 1948), pp. 36-9. I find it very moving, and although 
many of our social conditions have changed and our predicaments are 
somewhat different, I also find myself wanting to be in a seminar in 
which this text was under sustained discussion.

Comments are, as usual, most welcome.


> It has long been clear to me that the modern ultra-rapid computing
> machine was in principle an ideal central nervous system to an
> apparatus for automatic control....
> Long before Nagasaki and the public awareness of the atomic bomb, it
> had occurred to me that we were here in the presence of another
> social potentiality of unheard-of importance for good and for evil.
> The automatic factory, the assembly-line without human agents, are
> only so far ahead of us as is limited by our willingness to put such
> a degree of effort into their engineering as was spent, for example,
> in the development of the technique of radar in the second world war.
> I have said that this new development has unbounded possibilities for
> good and for evil. For one thing, it makes the metaphorical dominance
> of the machines, as imagined by Samuel Butler, a most immediate and
> non-metaphorical problem. It gives the human race a new and most
> effective collection of mechanical slaves to perform its labor. Such
> mechanical labor has most of the economic properties of slave labor,
> although, unlike slave labor, it does not involve the direct
> demoralizing effects of human cruelty. However, any labor that
> accepts the conditions of competition with slave labor, accepts the
> conditions of slave labor, and is essentially slave labor. The key
> word of this statement is competition. It may very well be a good
> thing for humanity to have the machine remove from it the need of
> menial and disagreeable tasks; or it may not. I do not know. It
> cannot be good for these new potentialities to be assessed in the
> terms of the market, of the money they save....
> Perhaps I may clarify the historical background of the present
> situation if I say that the first industrial revolution, the
> revolution of the "dark satanic mills", was the devaluation of the
> human arm by the competition of machinery. There is no rate of pay at
> which a United States pick-and-shovel laborer can live which is low
> enough to compete with the work of a steam shovel as an excavator.
> The modern industrial revolution is similarly bound to devalue the
> human brain at least in its simpler and more routine decisions....
> [T]aking the second revolution as accomplished, the average human
> being of mediocre attainments or less has nothing to sell that it is
> worth anyone's money to buy.
> The answer, of course, is to have a society based on human values
> other than buying or selling....
> Those of us who have contributed to the new science of cybernetics
> thus stand in a moral position which is, to say the least, not very
> comfortable. We have contributed to the initiation of a new science
> which, as I have said, embraces technical developments with great
> possibilities for good and for evil. We can only hand it over into
> the world that exists about us, and this is the world of Belsen and
> Hiroshima. We do not even have the choice of suppressing these new
> technical developments. They belong to the age, and the most any of
> us can do by suppression is to put the development of the subject
> into the hands of the most irresponsible and most venal of our
> engineers. The best we can do is to see that a large public
> understands the trend and the bearing of the present work, and to
> confine our personal efforts to those fields, such as physiology and
> psychology, most remote from war and exploitation. As we have seen,
> there are those who hope that the good of a better understanding of
> man and society which is offered by this new field of work may
> anticipate and outweigh the incidental contribution we are making to
> the concentration of power (which is always concentrated by its very
> conditions of existence, in the hands of the most unscrupulous). I
> write in 1947, and I am compelled to say that it is a very slight
> hope.

Norbert Wiener, Instituto Nacional de Cardiologia, Ciudad de Mexico, 
November 1947.

Willard McCarty, Professor of Humanities Computing,
King's College London, staff.cch.kcl.ac.uk/~wmccarty/;
Editor, Humanist, www.digitalhumanities.org/humanist;
Interdisciplinary Science Reviews, www.isr-journal.org.