(urth) Re: urth-urth.net Digest, Vol 5, Issue 41

maru marudubshinki at gmail.com
Fri Jan 28 15:27:09 PST 2005


That's not nearly certain: some think that emotions have a dual use. First,
they provide a priori biases in the decision-making process, to speed things
up (this view derives from certain psychological disorders in which emotions
are largely stripped out; the sufferers experienced great indecision and
often made very sub-optimal choices). Second, they serve as a
'super-rational', evolutionarily developed strategy: if everyone 'knows'
that you do not make rational choices and are entirely willing to escalate,
they will treat you with deference. Sorta like the Russian or N. Korean
nuclear strategies :)

~Maru

Dan'l Danehy-Oakes wrote:

>
>H'mmm. No. I'm suggesting that its experiential world might be far 
>more different from thine and mine than that of, say, a sociopath,
>or even a catatonic: at least as different as the experiential world
>of one of the "higher" animals. As far as I can tell, a sentient 
>machine would operate from at least some entirely different 
>_kinds_ of sensory input than humans do, for example; nor would it
>likely be bashed about by hormonally-induced "moods."
>
>>Labeling a machine as "insane" might be quite unfair, but pragmatically
>>would have to be done. A machine intelligence with perceptions and
>>motivations sufficiently different from our admittedly arbitrary norm would
>>be far more dangerous than any homicidal madman.
>
>Only if it also had the ability to act upon its motivations. I see no reason
>why such a machine should be granted, for example, arms or any other
>manipulating appendage.
>
>--Blattid