[Humanist] 31.356 different from the sum of its parts

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Thu Oct 12 10:01:46 CEST 2017

                 Humanist Discussion Group, Vol. 31, No. 356.
            Department of Digital Humanities, King's College London
                Submit to: humanist at lists.digitalhumanities.org

        Date: Wed, 11 Oct 2017 10:09:45 +0200
        From: Tim Smithers <tim.smithers at cantab.net>
        Subject: Re:  31.351 different from the sum of its parts
        In-Reply-To: <20171011052136.3E73578FC at s16382816.onlinehome-server.info>

Dear Gabriel and Robin,

Thank you, Gabriel, for your further elaborations, and thank you, Robin, for
pushing some more.

I'll attempt a response, but I'm not an Operating System expert, nor a legal
one.  My experience here lies more with trying to fix robots that have
stopped working.

Gabriel: yes, the ways the legal realm and the technical realm treat this
situation are different.  Rightly so, I think.  We do want to know who has
legal responsibility, and who is at fault in the eyes of the law, when
technical systems go wrong or fail to do what they are supposed to do.

Yes, modern Operating Systems are certainly complicated and sometimes hard
to understand.  But, even when they run on multi-core processors, it is
still possible, and useful, to understand the functional hierarchy that
causes what the system is doing at any moment.  This hierarchy is not, of
course, fixed; it changes as the Operating System does different things.
And, if you go all the way down to the hardware levels, to the individual
cores of the multi-core processor, where the hierarchy ends up changes too.
Sometimes the code of some needed routine runs in one core, at other times
in another.  But in designing all this, we try to keep things separated,
independent, and functionally encapsulated, so as to prevent unintended,
and hard to understand, interactions.

The same code may indeed be executed at the same (real) time on different
cores, but it is the two different execution threads that are the causal
functioning, not the code they share.  Code, in and of itself, is passive.
Nothing happens until it is executed in some processor core somewhere at
some time.  When this code is designed and built we don't usually need to
know when or where it will be executed.  That's a job for the Operating
System to decide.  We just need to take care that we build tightly
encapsulated functionality, so that it doesn't have unintended, and hard to
diagnose, side effects.  For me, it's not about intentions -- the things
that are in the heads of the people who build the code -- it's about the
functional implementation and how this is made to happen in the right way at
the right time to make well-defined things happen.  And, yes, I think these
can be well understood in terms of levels of abstract functionality and
hierarchies.  But I'm sure there are people here more knowledgeable about
Operating Systems and their implementation who could put me right on this.
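What I mean can be sketched in a few lines of Python (a made-up
illustration of the general point, not anything taken from a real
Operating System): one and the same passive function object is handed to
two different threads, and it is those two executions, not the shared
code, that do the causal work.

```python
import threading

# The same function -- the same passive piece of code -- is given to two
# different threads.  Until some thread executes it, nothing happens.
def square_all(numbers, results, slot):
    # Tightly encapsulated: it reads only its arguments and writes only
    # to its own slot, so concurrent executions cannot interfere.
    results[slot] = [n * n for n in numbers]

results = [None, None]
t1 = threading.Thread(target=square_all, args=([1, 2, 3], results, 0))
t2 = threading.Thread(target=square_all, args=([4, 5, 6], results, 1))
t1.start(); t2.start()   # two executions of one piece of code,
t1.join(); t2.join()     # possibly scheduled on two different cores
print(results)  # [[1, 4, 9], [16, 25, 36]]
```

Where each execution actually runs is the Operating System's decision, not
the programmer's; the code is the same either way.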

Robin: Yours, too, is a nice point.  And one that, I think, relates to
Gabriel's question about (legal) responsibility.  I'd put your example of
the crashing device driver like this.

The cause of the failure of the Update Machine is the device driver
crashing.  The reason the device driver crashes is a poorly prepared
previous software update.  The cause lies in the technical system, and is
what we need to find if we are to fix this fault, and regain well working
software updating. The reason this failure happens lies in the actions of
the people who prepared and issued the update that messed up the device
driver.  There are two stories here: the technical and the human.  There are
always two stories.  Operating Systems, and computers in general, do not
prepare their own updates, not yet, at least.  People still do this, and
they have responsibilities, and occasionally fail to do their job well.

This 'always two stories' means we have two systems: the technical system,
which is what I have been talking about; and the combined human-technical
system, which is what you and Gabriel are pointing to when you talk of fault
according to the law, and the reason why the device driver started crashing.
The
technical system is not, I think, well understood as an ecosystem.  The
combined human-technical system may well be an example of an ecosystem.  One
in which the law tries to impose linear responsibility chains for faults and
incorrect working.

I've not tried to cover this combined human-technical system. I am not
qualified to do this.  However, for many good reasons, it is the more
important, the more interesting, and the more difficult system to
understand.  And, if I may say this, it is a matter in which we need plenty
more good Humanist inputs.  There are plenty of technical systems on the way
into our lives, and some already here, that are not, I think, receiving
enough attention in this respect: self-driving cars, for example.  Or things
like Amazon's Alexa, or Apple's Siri, or ...  the list goes on, and is
growing.

To put the question in a provocative form: why is much of humanity allowing
the digital to dehumanise it so?  Is this an ecosystem at work?  I think so.
It doesn't look like something debugging robots helps with much.

Best regards,


>  On 11 Oct 2017, at 07:21, Humanist Discussion Group
> <willard.mccarty at mccarty.org.uk> wrote:
>                 Humanist Discussion Group, Vol. 31, No. 351.
>            Department of Digital Humanities, King's College London
>                       www.digitalhumanities.org/humanist
>                Submit to: humanist at lists.digitalhumanities.org
>  [1]   From:    Gabriel Egan <mail at gabrielegan.com>                      (133)
>        Subject: Re: [Humanist] 31.348 different from the sum of its parts
>  [2]   From:    "Burke, Robin" <rburke at cs.depaul.edu>                     (12)
>        Subject: Re: different from the sum of its parts
> --[1]------------------------------------------------------------------------
>        Date: Tue, 10 Oct 2017 11:22:18 +0100
>        From: Gabriel Egan <mail at gabrielegan.com>
>        Subject: Re: [Humanist] 31.348 different from the sum of its parts
>        In-Reply-To: <20171010074238.651007E93 at s16382816.onlinehome-server.info>
> Dear Tim
> I didn't think we could still model the working of a computer in terms of
> such a hierarchy.
> In UK consumer law, for example, it has long been established that although
> the fault in a complex machine may have its root in a particular small
> subcomponent, the seller cannot use that fact to argue that overall the
> machine is functioning and that only a small subcomponent is at fault. If
> the whole machine isn't doing what it's meant to do, the whole machine is
> deemed to be faulty.
> This way of thinking seems not only pragmatically correct from the point of
> view of law, but also technically correct from the point of view of computer
> design. Is it not the case that computers and their operating systems are
> indeed so complex that we can no longer define their interactions by a
> hierarchy of modular operations? The same set of instructions (a single long
> number) pulled in off the hard disk may be running on two processor cores
> simultaneously, and from the point of view of coding 'intention' it seems to
> me that they are only loosely not tightly subordinated to another process
> that is meant to be supervising them. Am I wrong? Can we still model this
> situation as a hierarchy of intentions? I'm genuinely asking for
> enlightenment on this.
> Regards
> Gabriel
> --
> ________________________________________________________________________
> Professor Gabriel Egan, De Montfort University. www.gabrielegan.com
> Director of the Centre for Textual Studies http://cts.dmu.ac.uk
> National Teaching Fellow http://www.heacademy.ac.uk/ntfs
> Gen. Ed. New Oxford Shakespeare http://www.oxfordpresents.com/ms/nos
> --[2]------------------------------------------------------------------------
>        Date: Tue, 10 Oct 2017 12:31:16 +0000
>        From: "Burke, Robin" <rburke at cs.depaul.edu>
>        Subject: Re: different from the sum of its parts
>        In-Reply-To: <20171010074238.651007E93 at s16382816.onlinehome-server.info>
> Just to push the point a little bit farther…
> What if the reason that Windows update stops working is because the computer has some hardware component that is somewhat obscure and Microsoft doesn’t test for compatibility with it when producing updates? Because of this, some previous update has interfered with the functioning of a device driver and now it crashes periodically and this interferes with the operation of the “Update Machine”. (I’m talking real experience here.) So, arguably, a failure on the company’s part to thoroughly test an update is causing the whole updating / patching system to work incorrectly.
> You could still point to the inside of the machine and say “here is the cause: a device driver that keeps crashing”, but I could point back to the update system and say “here is the cause: a failure to test thoroughly, resulting in an incompatible update.” And then the system engineer would point back to my system and say “here is the cause: an obscure peripheral device that isn’t on my compatibility list.” And I could say “here is the cause: a cost-savings measure by the company to restrict the set of tested configurations on the compatibility list.” Or I could point to the peripheral manufacturer. Etc.
> I think it is not a stretch to say that at this point personal computing is a pretty complex ecosystem, and even though each machine is fundamentally deterministic, its correct functioning is often subtly but ultimately dependent on decisions made by other actors: software / hardware companies, hackers, ISPs, etc. Ask any IT person charged with keeping such systems up and running. The causal arrows point in multiple directions.
> Robin
> ———————————————————————————————
> Robin Burke, Professor
> School of Computing, DePaul University
> rburke at cs.depaul.edu

More information about the Humanist mailing list