(urth) 5HC : Chinese boxes or tea chests?
thewynns at earthlink.net
Tue Feb 1 14:44:41 PST 2005
>>Yes, but to be fair Turing's criterion is just as subjective.
>Ummm, no. It _is_ subjective - or, rather, it is based on subjective
>judgments - but it provides a definition (admittedly, an ostensive
>rather than a descriptive definition), and a way of testing for
>whether that definition has been met. Searle (to the best of
>my memory) does neither.
But since a camouflage device can always get better and better (particularly
since our A.I.'s creators can experiment to discover the chinks in its
disguise), it is a standard that can likely be met without ever addressing
whether the definition is valid (just as the mother bird's criterion for
whether the cuckoo is her child is that it was hatched in her nest).
Searle's analogy begs all sorts of questions, but it is set up so that the
definitions are implicit: the person in the box represents the expert system
A.I. and it "acts" without "thinking". Remember, the original question that
Turing was addressing was "can a computer 'think'?".
>>Roughly, "can a computer succeed in impersonating a human
>>person well enough to fool humans." Where is the "control" here?
>Read Turing's article. The "imitation game" was to be played by
>two players, a human and a computer, with a second human
>judging between them - "which is the human and which is the
>computer?" The test is, of course, blind - the judge doesn't know
>which player is which. (In a really rigorous test, I would suggest that
>runs be done with all possible combinations: not only human vs
>computer but human vs human and computer vs computer.) Both
>players are trying to convince the judge that they are the human.
>Thus, the responses of the human player provide the control to
>which the responses of the computer player are compared.
That comes out to little more than an opinion poll, if one assumes that humans
are generally equal in their ability to be duped. Not much of a "control".
On the other hand, even if a computer were conscious, a programmer
who designed such systems would probably always be able to recognize
it by identifying certain of its algorithms.