(urth) Re: urth-urth.net Digest, Vol 5, Issue 41
Dan'l Danehy-Oakes
danldo at gmail.com
Fri Jan 28 13:13:21 PST 2005
On Thu, 27 Jan 2005 18:16:28 -0800 (PST), turin <turin at hell.com> wrote:
> it is not enough for a machine to produce the same outputs as a person,
> it must arrive at its outputs in the same fashion, that is whatever the
> functional apparatus, it must have the same cognition and intentional
> and phenomenal states as a person.
How would you be able to demonstrate that it had these "states"? How
can you define them in humans? To the best of my knowledge, neither
of these questions has an empirically testable answer to date.
> the imitation game taken literally is the same kind of reductionism
> as behaviorism.
The imitation game stands as the only empirical test thus far
proposed for identifying a machine intelligence, should one arise.
It has several flaws; I find most significant among these its
failure to account for the probability that the experiential life
of a machine intelligence would differ in important ways from that
of a human. It would thus tend to reject a (hypothetical, as of
today) class of genuinely sentient machines that were not good
at imagining and empathizing with the experiential life of humans.
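For concreteness, the structure of the test can be sketched in a few
lines of code. This is only a toy model of Turing's setup, not his
actual protocol: the judge, the canned question list, and both
respondents below are invented placeholders, and the "pass"
criterion (judge accuracy near chance) is one common reading of the
game, not the only one.

```python
import random

def imitation_game(judge, human, machine, rounds=100):
    """Toy model of the imitation game.

    judge(question, answer) -> True if the answer seems human.
    human / machine each map a question string to an answer string.
    Returns the judge's identification accuracy; a value near 0.5
    (chance) means the machine has 'passed' this toy version.
    """
    questions = ["What is 7 times 8?", "Describe the smell of rain."]
    correct = 0
    for _ in range(rounds):
        q = random.choice(questions)
        # The judge interrogates one respondent, not knowing which.
        is_machine = random.random() < 0.5
        answer = machine(q) if is_machine else human(q)
        guessed_machine = not judge(q, answer)
        if guessed_machine == is_machine:
            correct += 1
    return correct / rounds

# Hypothetical respondents and judge, purely for illustration.
random.seed(0)
human_player = lambda q: "56" if "7 times 8" in q else "Earthy, like wet dust."
machine_player = lambda q: "56" if "7 times 8" in q else "Earthy, like wet dust."
naive_judge = lambda q, a: True  # always believes the answer is human

rate = imitation_game(naive_judge, human_player, machine_player)
print(round(rate, 2))
```

Note that the sketch makes the objection above concrete: nothing in
the loop measures the machine's inner states, only the judge's
inability to tell the transcripts apart.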
> the test itself was just the "chinese room",
You betray your prejudices by even mentioning Searle's quibble.
> for a machine to be a person it has to have, and in the 40s and
> 50s this was a taboo subject in psychology, a mind.
This grotesquely misrepresents the psychological institutions of
the period: this was the heyday of behaviorism, but behaviorism
was never the sole dominant school of psychology.
--Dan'l
--
"We're going to sit on Scorsese's head"
-- The Goodfeathers