[Humanist] 30.854 residues from algorithms?



                 Humanist Discussion Group, Vol. 30, No. 854.
            Department of Digital Humanities, King's College London
                       www.digitalhumanities.org/humanist
                Submit to: humanist at lists.digitalhumanities.org



        Date: Thu, 30 Mar 2017 08:29:48 +0200
        From: Willard McCarty <willard.mccarty at mccarty.org.uk>
        Subject: residues from algorithms

Recently I've been at a workshop on computational ethnomusicology at the 
Lorentz Center (https://www.lorentzcenter.nl) in Leiden, to talk about 
modelling. As usual I drew attention to the value of anomalies (a.k.a. 
the 'residue', as some would say). A lecturer from Queen Mary University 
of London, Bob Sturm, described an experiment in music processing in which the 
algorithm he was using was unusually successful. Looking at the data 
afterwards, however, he discovered that this success was due to a part 
of the audio signal well beyond human hearing -- most likely an artefact 
of the recording technologies. So, in this case, the 'residue' was the 
music, all of it.
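
To make the diagnostic concrete: if a model's success rests on content 
beyond human hearing, then band-limiting the audio to the audible range 
before evaluation should make that success collapse. Below is a minimal 
sketch in Python of such a test -- not Sturm's actual experiment; the 
model, clips, labels and the 96 kHz sample rate are hypothetical 
stand-ins.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def band_limit(audio, sample_rate, cutoff_hz=20_000.0):
        """Low-pass filter the signal at the nominal limit of human hearing."""
        nyquist = sample_rate / 2.0
        if cutoff_hz >= nyquist:
            # Nothing above the cutoff is representable at this sample rate.
            return audio
        sos = butter(8, cutoff_hz / nyquist, btype="low", output="sos")
        return sosfiltfilt(sos, audio)

    def accuracy(model, clips, labels):
        """Fraction of clips the (hypothetical) model labels correctly."""
        predictions = [model.predict(clip) for clip in clips]
        return float(np.mean([p == y for p, y in zip(predictions, labels)]))

    # Hypothetical usage, comparing raw against band-limited audio:
    #   raw_acc      = accuracy(model, clips, labels)
    #   filtered     = [band_limit(clip, 96_000) for clip in clips]
    #   filtered_acc = accuracy(model, filtered, labels)
    # A sharp drop in filtered_acc would suggest the model's success
    # rested on inaudible content -- the 'residue' doing all the work.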

Great success from automatic processes for entirely the wrong reason 
raises the question of how far to trust results from black boxes, and 
so of strategies for building confidence in those results. But then we 
ourselves are cognitive black boxes. What's at fault here, one could 
say, is our attitude toward the digital machine, regarded as a jukebox 
of truth. Might how we deal with each other be the right model for 
dealing with the modelling machine?

Comments?

Yours,
WM

-- 
Willard McCarty (www.mccarty.org.uk/), Professor, Department of Digital
Humanities, King's College London; Adjunct Professor, Western Sydney
University and North Carolina State University; Editor,
Interdisciplinary Science Reviews (www.tandfonline.com/loi/yisr20)



