re: Asimov, consciousness etc.

W Ramsay
Wed, 18 Oct 1995 12:55:57 +0100

Devi Jankowicz, in reply to my earlier posting, writes:

>I understand Minsky to have been referring to the concept of a finite
>automaton, as interpreted by a number of psychologists contemporary with
>that idea (we're talking late 1960s here).
>A finite automaton is a system in which the current state is the direct
>function of its previous state and previous inputs, or:
>s(t) = f(s(t-1), i(t-1)).
>This never accounted terribly well for the _experience_ of recall, however,
>i.e. of one's experience of internal representation: for which some degree
>of self-reference is required. Gordon Pask began to account for
>self-reference in cybernetic terms with his notion of finite automata
>working at a meta-level on lower-order automata (finite function machines,
>in fact, about which I've occasionally posted here) but I lost touch with
>his ideas a few years ago.
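The quoted definition — next state as a pure function of previous state and input — can be sketched in a few lines of code. This is only an illustration of the general idea (the names `step` and `run` and the parity example are mine, not anything from Devi's or Pask's work); it also shows why the current state of such a machine implies nothing about how it was reached:

```python
def step(state, symbol, transitions):
    """One move of a finite automaton: the new state depends only on
    the pair (current state, current input symbol)."""
    return transitions[(state, symbol)]

def run(start, inputs, transitions):
    """Fold a whole input sequence through the transition function."""
    state = start
    for symbol in inputs:
        state = step(state, symbol, transitions)
    return state

# Toy automaton: tracks the parity of the 1s seen so far.
transitions = {
    ("even", 0): "even", ("even", 1): "odd",
    ("odd", 0): "odd",   ("odd", 1): "even",
}

# Two quite different input histories end in the same state,
# so the final state carries no record of the path taken to it.
a = run("even", [1, 1, 0, 1], transitions)
b = run("even", [1], transitions)
print(a, b)  # both "odd"
```

The last two calls make the point about recall concrete: since many histories collapse onto one state, nothing in the state itself distinguishes them.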
Agreed, I was quoting Minsky out of context and pursuing an exact parallelism
wouldn't be particularly fruitful, Devi, and I'm with you on the failure to
account for the experience of recall. I can't see how a finite automaton
would qualify for consciousness in the sense I'm groping after, since its
current state in no way implies anything about its previous state. I think
I'm going to have to get back to Roger Penrose, "The Emperor's New Mind",
and try to get all the way through this time. Too many interruptions
getting in the way of serious thinking!

Since I wrote, though, I've been reading the section on Data Mining in the
latest issue of "Byte" (latest available to me here, that is) and it
encouraged some of my speculation. It also raised a problem for me: if
there were "intelligent agents" out there in the data mining software
"community" that _had_ become conscious, how would we know? (And
would it matter that we didn't?) An intelligent agent (using the term in
the most general sense) given enough autonomy of action would seem to be a
prime candidate for forced (pseudo-?)construing and reconstruing. At one
level the interaction between human (conscious?) analyst and agent might be
construed as requiring the agent to reconstrue.

Still groping, I'm afraid, but I'd like a copy of the paper you mention.