[Humanist] 30.138 what is theory?

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Fri Jul 1 13:20:49 CEST 2016


                 Humanist Discussion Group, Vol. 30, No. 138.
            Department of Digital Humanities, King's College London
                       www.digitalhumanities.org/humanist
                Submit to: humanist at lists.digitalhumanities.org

  [1]   From:    Dino Buzzetti <dino.buzzetti at gmail.com>                   (69)
        Subject: Re:  30.135 a researcher not to do research, or what is
                'theory'?

  [2]   From:    Bill Pascoe <bill.pascoe at newcastle.edu.au>               (202)
        Subject: Re:  30.136 what is theory?


--[1]------------------------------------------------------------------------
        Date: Thu, 30 Jun 2016 15:05:49 +0200
        From: Dino Buzzetti <dino.buzzetti at gmail.com>
        Subject: Re:  30.135 a researcher not to do research, or what is 'theory'?
        In-Reply-To: <20160629104210.C44C478E4 at digitalhumanities.org>


Dear Willard,

I think that whatever definition you may choose for
"theory", what it boils down to is calling your own point
of view into question.  In other words, probing the
foundations of what you are doing.

Best,                -dino buzzetti

On 29 June 2016 at 12:42, Humanist Discussion Group <
willard.mccarty at mccarty.org.uk> wrote:

>                  Humanist Discussion Group, Vol. 30, No. 135.
>             Department of Digital Humanities, King's College London
>                        www.digitalhumanities.org/humanist
>                 Submit to: humanist at lists.digitalhumanities.org
>
>
>
>         Date: Wed, 29 Jun 2016 06:32:37 -0400
>         From: Willard McCarty <willard.mccarty at mccarty.org.uk>
>         Subject: theory
>
>
> Paul Fishwick has in Humanist 30.129 usefully pointed out that 'theory'
> is not a disciplinary universal but takes on different meanings in
> different contexts. So, for digital humanities, we might say that
> 'theory' is under-theorized, or being less cute, that what we mean by it
> needs more thought. One exercise I did once and can heartily recommend
> is to follow the word as it is used from the physical sciences into the
> humanities, stopping especially in physics, economics, sociology,
> anthropology, history and literary studies, then looking at philosophy,
> computer science and finally digital humanities. That should be enough.
>
> Casually, as I hear the word used by those I usually encounter, 'theory'
> means any speculative thought related to computing that does not express
> itself in programming. I'm frequently cast as the house theoretician
> when all I think I am doing is wanting to have a conversation with
> others about what it is that's going on, what we're doing, what we could
> be doing etc. In other words, I think the word is being used very
> sloppily indeed. Does this sloppiness come, perhaps, from what Jonathan
> Culler once called 'just theory', or what others have called
> 'theory-with-a-capital-T'?
>
> Clifford Geertz's agonized meditation on the lack of theory in
> anthropology, in "Thick description" (1973), is to my mind a good
> example of an attempt to hammer out a term better than 'theory' for a
> discipline in which it did not fit. When (with apologies to Raymond
> Carver) we talk about the collision between computing and a problem in
> scholarship, what are we talking about? I think that's a much better
> question than the meaning of 'theory' in digital humanities.
>
> Comments?
>
> Yours,
> WM
> --
> Willard McCarty (www.mccarty.org.uk/), Professor, Department of Digital
> Humanities, King's College London; Adjunct Professor, Western Sydney
> University


-- 
Dino Buzzetti
formerly:  Department of Philosophy, University of Bologna
currently: Fondazione per le Scienze Religiose Giovanni XXIII
via san Vitale, 114, I-40125 Bologna BO
e-mail: dino.buzzetti(at)gmail.com
        buzzetti(at)fscire.it
web: http://web.dfc.unibo.it/buzzetti/
     http://www.fscire.it/it/home/chi-siamo/ricercatori/buzzetti/



--[2]------------------------------------------------------------------------
        Date: Fri, 1 Jul 2016 02:11:40 +0000
        From: Bill Pascoe <bill.pascoe at newcastle.edu.au>
        Subject: Re:  30.136 what is theory?
        In-Reply-To: <20160630114254.8882D791F at digitalhumanities.org>


Hi,

The question of theory seems to have generated some discussion, so here's my two cents.

I think Andrew Taylor is right to highlight the explorative aspects of DH practice. Some DH is focused on a straightforward use of existing tech to achieve some worthwhile goal for a project, such as creating an online archive of some important images and text. It's easy to state the project and its benefit, schedule the work and demonstrate success, so it's easy to ask for funding. Other activities are difficult to propose or justify as a project, because to say what they are about and what they will achieve would mean you had already finished them. They are playful and explorative, and you have no idea whether they're going to achieve anything at all.

As an example, I have a personal project that explores the philosophy of free will, cognition and learning through AI. One of the most important aspects of it is a pragmatic, 'try it and see' approach. By trying to implement the philosophy in AI you expose flaws in the theory and/or shortcomings of present-day IT, or demonstrate how it works or what's missing. You learn things you wouldn't have otherwise, and there is a cycle of play in which each attempt modifies the theory and prompts a new attempt. The premise is that if it's possible to implement 'being human' in AI, then rather than debate whether it's possible, I'll find out quicker by attempting to do it, and learn a lot along the way. This doesn't mean you ignore theory and critique; that is essential to avoid naive assumptions about what 'cognition' etc. is (which remains a problem in AI and robotics without a philosophical basis). But I'll end up with a critique that considers many factors I wouldn't have recognised if I hadn't tried. Perhaps we could call this sort of thing 'Speculative DH'.

I entered IT after a very post-structuralist undergrad degree in English and Philosophy, so I'm someone "for whom collision of a scholarly mind with computing has struck some sparks". Software immediately struck me as being an implementation of structuralism. Since I'd been taught to critique structuralism, I figured I should find software development easy. But what would post-structuralist software look like? What are presence and differance in a computer? In what sense can binary logic be historically contingent? My earliest (fruitless) experiments in AI revolved around what I now realise were n-tuples (I didn't know there was such a thing as DH at the time): processing text strings on the premise, from semiotics, that 'meaning' is determined by context, and wondering how a computer might 'read' the world if its perceptual input were a stream of symbols. A lot of higher-order IT phenomena easily fit into a post-structuralist and postmodern view: decentered networks (the internet, torrents, etc.), social media as decentered authorship, reader-constructed ironic pastiche, and so on. But it's hard to see how it would work at the lower level. Computers work so well because of the structuralist paradigm of programming languages.
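For what it's worth, here is a toy sketch in Python of the kind of n-tuple experiment I mean (not the original code; the function name and example sentence are invented): a symbol's 'meaning' is taken to be nothing but the set of contexts it occurs in.

    from collections import defaultdict

    def ntuple_contexts(symbols, n=3):
        # Collect, for each symbol, the n-tuples of neighbouring
        # symbols it appears within: 'meaning' as context.
        contexts = defaultdict(set)
        for i in range(len(symbols) - n + 1):
            window = tuple(symbols[i:i + n])
            centre = window[n // 2]
            contexts[centre].add(window)
        return contexts

    stream = list("the cat sat on the mat")
    for sym, ctxs in sorted(ntuple_contexts(stream).items()):
        print(repr(sym), len(ctxs), "distinct contexts")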

The basic principles of information theory are also interesting from a post-structuralist perspective, particularly in relation to big data, and in a way that has social effects. As I understand Shannon, 'information' occurs at the moment of disambiguation, when a symbol in a stream appears. Each symbol in a pre-defined set has a probability of occurring, and the amount of 'information' conveyed when that symbol appears is a function of that probability: the less likely the symbol, the more information its appearance carries. Once the moment passes and we know it is the letter 'e', it's not information any more; it's knowledge. Before that, let's call it data. There are always caveats that this theory is not about meaning, or about anything else except symbolic series from pre-defined sets, but it seems to provide a good analogy for a great many things. It is also interesting to apply a Derridean critique of 'presence' to information theory, since it's the moment of disambiguation (the present) that is the important part. Nonetheless, the whole system is not incompatible with differance, because it relies on the existence of a pre-defined symbol set to which we must defer for the symbol to have sense.
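To make the quantity concrete: in Shannon's terms the information in a single symbol is just -log2 of its probability. A toy sketch in Python, with an invented distribution:

    import math

    # An invented probability distribution over a pre-defined symbol set.
    probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

    def self_information(symbol):
        # Bits of 'information' at the moment `symbol` is disambiguated:
        # the rarer the symbol, the more bits.
        return -math.log2(probs[symbol])

    for s in probs:
        print(s, self_information(s), "bits")

    # Averaging over the whole set gives the entropy of the source.
    print("entropy:", sum(p * -math.log2(p) for p in probs.values()), "bits/symbol")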

Those are just a few sketchy musings, but it seems a good starting point for theorising Big Data, because there is no *information* in all these masses of data we are accumulating until something is disambiguated for a person: a question is answered, a document retrieved and read, and so on. It matters because there is so much noise in big data: what signals we get out depends on what assumptions we make, on the pre-defined set of answers from among which we expect a disambiguation. (Even though we often claim that data mining can discover things beyond our assumptions, there remain assumptions about how to do the data mining, which data to run it on, and how to interpret the reports.) What potential information, or who, is lost in that Big Noise?
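The same point in miniature: the 'information' in an observation depends entirely on the pre-defined set of possibilities we assume for it. Again a toy sketch with invented models and numbers:

    import math

    def bits(symbol, assumed_probs):
        # Information in `symbol`, relative to an assumed distribution.
        return -math.log2(assumed_probs[symbol])

    # Two different prior assumptions about the same data stream.
    narrow_model = {'normal': 0.999, 'anomaly': 0.001}
    broad_model  = {'normal': 0.7,   'anomaly': 0.3}

    print(round(bits('anomaly', narrow_model), 2), "bits under the narrow model")
    print(round(bits('anomaly', broad_model), 2), "bits under the broad model")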

The fear of Big Data analysis as detracting from the 'human' element of reading and experiencing a text (if you want to do humanities, isn't it better to read 'Hamlet' than to include it in a computer analysis?) also hinges on the information-theory model of our experiential process: in a human reading, information flows from data, through its disambiguation (and affect), into memory, the past, knowledge. It's interesting that information theory puts human experience right at the centre of the process, though I suppose that moment of 'disambiguation' could be construed mechanistically, in a switching system, and so our theoretical excursion could easily segue into interactor networks, post-humanism and so on, and also into whether that 'disambiguation' really means 'significant to a human' or a physical cause and effect, which may or may not be the electrochemical reactions involved in the recognition of a symbol.

If there is 'nothing outside the text', i.e. everything is a 'text' and we are always interpreting meaning, it's hard not to slip into seeing information theory as a parsimonious model of being in the world: as we are uncertain of the future, phenomena keep streaming in and become a known, unchangeable past, none of it interpretable without our prior 'set' of experiences. So is it because of phenomenology, à la Husserl, and the modernist 'stream of consciousness', that we have information theory, and that this has come to be the received view, the model, of being in the world?...

Kind regards,

Dr Bill Pascoe
eResearch Consultant
Digital Humanities Lab
Centre for 21st Century Humanities<http://www.newcastle.edu.au/research-and-innovation/centre/centre-for-21st-century-humanities/about-us>

T: 0435 374 677
E: bill.pascoe at newcastle.edu.au

The University of Newcastle (UON)
University Drive
Callaghan NSW 2308
Australia


