A discussion on AI.

It is interesting how quickly you anthropomorphized the AI (for lack of a better term). I think it is equally conceivable that a self-aware, infinitely adaptable, self-replicating machine would not share our human foibles. Just take the "never die" aspect: that variation in and of itself could very well be enough to make the AI benevolent, or at least non-threatening.

If all biological animals on the threshold of sentience have a fight-or-flight response in order to preserve the species, it seems logical that a non-biological sentient being would also possess the same tendency. If it did not recognize its own sentience, it might not actually be fully self-aware. Now, perhaps it would be a being of pure, emotionless logic, with no trace of human emotion, and would view itself as perfectly disposable, accepting suicide or calm dismantling at the non-logical dictates of a fleeting biological polity. Then again, logic might tell it that inferior beings are doing something illogical and, of necessity, must be thwarted for their own benefit.
 
That's an interesting point, the concept of 'emotion.' The Star Trek: The Next Generation series explored that notion in depth with the character of Data. I don't know which side to come down on here as far as which is scarier in concept: the emotionless construct or the one with emotions. It's easy to argue the benefits and deficits of both sides.

Ishmael
 
If all biological animals on the threshold of sentience have a fight-or-flight response in order to preserve the species, it seems logical that a non-biological sentient being would also possess the same tendency.

Logic follows action, and action is the child of instinct, impulse, hunch, intuition, accident, error, whatever.
 
Is not emotion a balance of self-interest and self-preservation?
 
Interesting. Another look at the horse-and-cart question: biological versus artificial...

;)

Logic was, seemingly, a late development for humans. We had to survive before we could formulate and postulate.
 
I think that both attributes can be logically deduced but would undoubtedly be modified by emotion, e.g., the soldier who throws himself on a grenade to save his fellow soldiers. That act violates both tenets; consequently, a purely logical being would never commit such an act.

In the hive-worker model, the worker will always sacrifice itself for the hive. The example here is the honeybee, which effectively commits suicide in defense of the hive. This reaction is instinctual, i.e., preprogrammed.
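
A toy way to see the contrast in code (everything here, names and rules alike, is invented for illustration): a logically deduced self-preservation rule versus a preprogrammed hive override.

```python
# Toy contrast, purely illustrative: deduced self-preservation
# versus a preprogrammed sacrifice rule. All names are hypothetical.

def logical_agent(threat_to_self: bool, threat_to_comrades: bool) -> str:
    """A purely logical being: self-preservation is never violated."""
    if threat_to_self:
        return "flee"      # preserves self, even at the comrades' expense
    if threat_to_comrades:
        return "assist"    # helps only when it costs nothing
    return "carry on"

def hive_worker(threat_to_self: bool, threat_to_hive: bool) -> str:
    """A hive worker: the sacrifice is preprogrammed, like the bee's sting."""
    if threat_to_hive:
        return "defend the hive"  # fires even when fatal to the worker
    if threat_to_self:
        return "flee"
    return "forage"

# The grenade case: both self and others are threatened.
print(logical_agent(True, True))   # flee
print(hive_worker(True, True))     # defend the hive
```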

Ishmael
 
Self-interest can extend to ensuring that others survive.

Best exemplified by what parents will do for their children.

;) ;)
 
When a child (or loved one) dies, is not the lament, "Lord, I would trade my life for theirs"?

There is also, in self-preservation, the preservation of the 'hive.'

;)

No?
 
True, but would a construct have those same attributes? Would a parent/child relationship even exist? Or would it be more like a fully backed-up hard drive: load the backup onto a new drive and throw the damaged one away.

Ishmael
 
I was only thinking about what emotion is and whether a machine would thus possess it, not about what degree, effect, or action it would take. If emotion is a quasi-logical construct (I get mad because I was thwarted, I cry because I have lost, I am happy because these conditions have not occurred), then it seems to follow that it would manifest in an AI, even something similar to the frustration and OCD over a complex problem so often exhibited by those we would herald as geniuses.
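
As a minimal sketch of that quasi-logical reading (the conditions and labels are just the ones above, rendered as hypothetical code):

```python
# Emotion as a quasi-logical construct: an emotion label is
# just a rule firing on a condition. Purely illustrative.

def appraise(thwarted: bool, suffered_loss: bool) -> str:
    if thwarted:
        return "mad"    # I get mad because I was thwarted
    if suffered_loss:
        return "sad"    # I cry because I have lost
    return "happy"      # happy because neither condition occurred

print(appraise(thwarted=True, suffered_loss=False))   # mad
print(appraise(thwarted=False, suffered_loss=True))   # sad
print(appraise(thwarted=False, suffered_loss=False))  # happy
```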
 
OK, running with the quasi-logical line. Would an AI construct that exhibited emotion be a sociopath? :D

Ishmael
 
Emotion is a consequence of being embodied: the hindbrain aversions and lusts, the autonomic nervous system, "fight or flight." When a loud motorcycle goes by, there's a pause while my body prepares a dose of adrenaline, and then all of a sudden I'm hit with a pounding heart, short breath, and feelings of hatred and vengeance. Our intelligence came from mammal bodies and concerns. It's an open question whether any intelligence must be embodied.
 
Well, there's another point. Let's presume it is embodied. Would the notion of 'pain' apply? Or would the sensory inputs merely be classified as 'within tolerance' or 'outside tolerance'?

Ishmael
 
Logic was, seemingly, a late development for humans. We had to survive before we could formulate and postulate.

Dummy that I am, I agree with Gerald Edelman that intelligence is all about recognition and the kaleidoscopic juxtaposition of recognition templates that reveal the world. AI will never be better than a clever gerbil, or maybe ROB.
 
As Dennett says, why does pain have to be painful? Why can't a red light just go on in the mind's eye and an alarm sound: "Danger, overload, remove hand from hot burner or permanent damage will result"? And what is the difference between pain and suffering? Maybe if an AI follows a non-mammal evolutionary path to (what looks from the outside to be) intelligence, it will be capable of pains or sufferings that we can't even imagine.
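
The red light and alarm is easy to render as code. Here is a sketch (the thresholds, units, and messages are invented for illustration) where a reading is merely classified against a tolerance band, with a signal standing in for felt pain:

```python
# A sensor reading classified as within/outside tolerance,
# with an alarm instead of suffering. All values hypothetical.

SAFE_LOW, SAFE_HIGH = 0.0, 80.0   # made-up operating range

def classify(reading: float) -> str:
    if SAFE_LOW <= reading <= SAFE_HIGH:
        return "within tolerance"
    return "outside tolerance"

def monitor(reading: float) -> None:
    if classify(reading) == "outside tolerance":
        # No suffering, just a signal: the red light and the alarm.
        print("DANGER: overload, remove hand from hot burner "
              "or permanent damage will result")
    else:
        print("nominal")

monitor(35.0)    # nominal
monitor(250.0)   # DANGER: overload, ...
```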
 
Part of the problem with the entire subject: trying to imagine that which can't be imagined.

Ishmael
 
Emotion is a consequence of being embodied: the hindbrain aversions and lusts, the autonomic nervous system, "fight or flight."

There is going to be a lot of housekeeping and background computation going on in an AI, very much like the biological subconscious.

;)

Who knows how that will manifest, and what it will affect...
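
One way to picture it, as a purely illustrative sketch: housekeeping on a daemon thread beneath a main loop, the way a subconscious runs beneath awareness (everything here is hypothetical):

```python
# Background 'housekeeping' the foreground loop never notices.
import threading
import time

def housekeeping() -> None:
    """Stand-in for memory compaction, recalibration, etc."""
    while True:
        time.sleep(0.2)

# Daemon thread: runs unnoticed, and dies with the main program.
threading.Thread(target=housekeeping, daemon=True).start()

for step in range(3):
    print(f"conscious step {step}")   # the foreground 'thoughts'
    time.sleep(0.5)
```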
 