[Humanist] 23.128 programming

Humanist Discussion Group willard.mccarty at mccarty.org.uk
Sat Jul 4 11:37:16 CEST 2009


                 Humanist Discussion Group, Vol. 23, No. 128.
         Centre for Computing in the Humanities, King's College London
                       www.digitalhumanities.org/humanist
                Submit to: humanist at lists.digitalhumanities.org

  [1]   From:    Joris van Zundert <joris.van.zundert at gmail.com>           (77)
        Subject: Re: [Humanist] 23.120 programming

  [2]   From:    James Rovira <jamesrovira at gmail.com>                      (29)
        Subject: Re: [Humanist] 23.125 programming


--[1]------------------------------------------------------------------------
        Date: Fri, 3 Jul 2009 10:11:21 +0200
        From: Joris van Zundert <joris.van.zundert at gmail.com>
        Subject: Re: [Humanist] 23.120 programming
        In-Reply-To: <20090702072632.9B28C2AB6E at woodward.joyent.us>


Hi all,

Jim R. wrote:

> Coding itself does not "pass along complex truths," and even if it
> did, no one would have access to these truths except programmers.
> Coding issues commands -to a machine-, which creates an interface with
> a user. Equating "coding" with "software" is a serious confusion of
> terms in this case.  Software is the interface, coding is what makes
> it work.  The interface itself may look no different from a literary
> product: every poem you read on a computer screen is there because of
> coding, but it's the poem, not the coding, that communicates complex
> truths.
>

Code is written in a computer language. It's called a computer *language*
because that's exactly what it is: language, semiotics, a bunch of signs to
convey meaning. As with any other language, you can make it stylish or raw,
have it express dumbness or elegance.

The one difference is that it has a strict grammar. (You can expand the
vocabulary, no difference there: making a new word is just defining another
function.) Strict grammar allows interpretation by a compiler or any other
formal interpreter, which may create worlds out of the code through
visualization, GUIs and so on. In those worlds anything we can imagine can
happen, because we can express it in code. Yes, ambiguity and uncertainty
can be *modeled* too. They are approximations, simulations of ambiguity,
but it would be hard, if not impossible, for humans to tell the difference
if they didn't know they were toying with a computer.
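
To make that concrete, here is a minimal sketch in Python of one way such a
simulation might look. All the names in it (AmbiguousWord, interpret(), the
two senses of 'bank', the weights) are hypothetical, invented purely for
illustration:

import random

class AmbiguousWord:
    """A token whose meaning is not fixed: each reading carries a weight."""

    def __init__(self, surface, readings):
        self.surface = surface    # the written form, e.g. "bank"
        self.readings = readings  # dict mapping sense -> weight

    def interpret(self):
        """Pick one sense at random, weighted: a simulation of ambiguity."""
        senses = list(self.readings)
        weights = list(self.readings.values())
        return random.choices(senses, weights=weights, k=1)[0]

bank = AmbiguousWord("bank", {"river edge": 0.4, "financial institution": 0.6})
print(bank.surface, "->", bank.interpret())  # the chosen sense may differ per run

From the outside, a reader of the output cannot tell whether the ambiguity
is 'genuine' or merely simulated by a weighted coin toss.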

This is what makes computer code/language a double treat to me. It may be
poetry in itself for the initiated (just as the beauty of an elegant
equation may exist only in the eye of the mathematician). As a language
interpreted by computers, it may convey or visualize poetry in other
(human) languages, simulate worlds, allow for creative human-to-human
interaction, inspire, provoke thought... really anything. So I'd say it's
actually more 'powerful' than human language.

I can see no logical reason why at least some parts of the complex truths
Jim is talking about shouldn't be exactly expressible in code. Apparently
Jim would find that kind of expression boring, but that's really a question
of taste, not of the capabilities and functions of code.

Just a thought.

Cheers,
Joris

PS I
'True ambiguity' in software code exists; it's just not generally loved by
coders, because in the end code is about clarity and logic, about
constructing and building. But yes, one can make a data class -bad idea,
but that's another discussion- called 'Circle'. That would be a highly
ambiguous class. It might contain the names of users that I trust, or it
might contain information on just a geometrical circle. I can imagine this
class having a function '.rotate()'. If the class was meant to represent a
'bunch' of users, this would mean the order changes; if it was a
geometrical circle, well, you know the routine. The ambiguity could go as
far as me, the coder, only realizing I had been fooled when calling the
function '.display()': only then would it become apparent that I was using
a class I didn't mean to use. Now, as said, coders generally don't like
this sort of unclearness, because it slows down their productivity. So they
would call the former 'CircleOfTrust', with functions like '.reorder()',
'.in( user )', '.out( user )', etc. The other one would probably be
'CircleXY( x, y, r )'.
And if you want to go *really* ambiguous: you can have the class CircleXY
generate visual rectangles, which the developers would only notice when
users started calling in to report that something strange was happening.
The point of doing that (apart from questionable levels of fun), however,
escapes me.
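
For the curious, a minimal sketch in Python of those two disambiguated
classes. Everything beyond the names already mentioned above (the member
list, the sample users, the bodies of the methods) is hypothetical, and
'.in( user )' / '.out( user )' are left out because 'in' is a reserved
word in Python:

import math

class CircleOfTrust:
    """A 'bunch' of users I trust; rotating it just changes their order."""

    def __init__(self, users):
        self.users = list(users)

    def rotate(self):
        # Move the last user to the front: reordering, nothing geometric.
        self.users.insert(0, self.users.pop())

    def display(self):
        return ", ".join(self.users)

class CircleXY:
    """A geometrical circle with centre (x, y) and radius r."""

    def __init__(self, x, y, r):
        self.x, self.y, self.r = x, y, r

    def rotate(self):
        # Rotating a perfect circle about its centre changes nothing visible.
        pass

    def display(self):
        return f"circle at ({self.x}, {self.y}), area {math.pi * self.r ** 2:.2f}"

# If both were simply called 'Circle', the mix-up would only surface here:
for circle in (CircleOfTrust(["ann", "bob", "eve"]), CircleXY(0, 0, 2)):
    circle.rotate()
    print(circle.display())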

PS II
Hard-core informatics people have been playing around with
ambiguity/uncertainty down to the hardware level. The famous bit can only
represent two values, but there are ways to hardwire uncertainty. One is
that flip-flops can have an ambiguous state (neither 0 nor 1, just
'uncertain'). As far as I know there's no practical application for that
yet - but I'm certainly no expert here. Other ambiguities come into play,
of course, in quantum computing.


--[2]------------------------------------------------------------------------
        Date: Fri, 3 Jul 2009 08:37:39 -0400
        From: James Rovira <jamesrovira at gmail.com>
        Subject: Re: [Humanist] 23.125 programming
        In-Reply-To: <20090703051513.253912EC33 at woodward.joyent.us>

Yes, the following paragraph does get to a root difference in our outlooks:

> I think the concern that I'm equivocating or confusing code with
> software comes from a basic prejudice on my part that the software
> user is interacting with code, that there is no difference between
> what the programmer wrote and what the user is interacting with,
> whether it went through a compiler or not.

I agree the software user is interacting with code when using
software.  The point is that s/he is not usually interacting
-directly- with code.  Most software users know nothing about code
and, as you say, the computer is a magic black box that just does
things when you type or touch the screen.

And yes, I probably am rambling, but I do mean this:

<<I believe you're rambling here, so I'll assume you didn't actually
mean that a majority of people alive have no need of computers.>>

The majority of people alive today do not live in fully industrialized
countries, do not have computers, and probably never will. They all have
language, whether it is written or not. In the history of technology,
written language was considered grossly subordinate to spoken language,
writing inferior to speech, and really unnecessary for communicating
meaningful truths.

But the parallel you attempt to draw between coding and language on
this point still doesn't work.  We don't think in code.  Our major
philosophical, religious, and literary texts were not written in code.
 Our major works of art are not painted in code.  Code itself does not
communicate the truths these works communicate.  It simply reproduces
a visual equivalent of these works in electronic form.  Saying code
communicates meaningful truth is roughly equivalent to saying pens and
paper communicate meaningful truth.

Jim R




