A philosophic question

prof

Enlightened
Joined
Sep 2, 2005
Messages
463
Location
Western TN
When the artificial intelligence is packaged like this, you will feel love.

[attached image: summer.jpg]

Yes, but will it? And what is love? Or did you have something else in mind, Lux?

Remember, the board is g-rated!:grouphug:
 

prof

Enlightened
Joined
Sep 2, 2005
Messages
463
Location
Western TN
You might try looking up Alan Turing. He did a lot of pioneering work in AI--specifically in determining the difference between a computer programmed to appear intelligent and a truly intelligent computer. The so-called Turing Test can be tricked, however.

Good luck.
 

LuxLuthor

Flashaholic
Joined
Nov 5, 2005
Messages
10,654
Location
MS
Yes, but will it? And what is love? Or did you have something else in mind, Lux?

Remember, the board is g-rated!:grouphug:

My answer was correlated with a nonsensical premise since there is no such thing as "love" the way it is being presented. It depends on how ontologically deep you want to get, but things such as love are beyond the realm of past-based representational language, even if arranged in supremely elegant patterns of bits & bytes.

You can start with Wittgenstein's Tractatus: "Whereof one cannot speak, thereof one must be silent." Language at best can point to the thing (i.e. "love"), but the linguistic pointing is not the thing itself. The proposition is a dressed-up koan.

Love is not an emotion or a feeling, although emotions & feelings may be evoked in the presence of love. It is the conundrum that the Sci-Fi Battlestar Galactica Cylons wrestle with. Love does not exist in the domain of programming language.

Nonetheless, it is delicious to consider the possibilities with the Terminator Cameron, Cylon Number Six, and the Borg's 7 of 9. Of the three, the last has the greatest chance of fulfilling the thread's proposition, since she started out as human.

[attached images: trisha.jpg, 7of9.jpg]
 

js

Flashlight Enthusiast
Joined
Aug 2, 2003
Messages
5,793
Location
Upstate New York
I believe that we can speak intelligently and profitably about love and sentience and artificial intelligence. I have been musing over the topic and replies in this thread since it was first posted, but have refrained from posting until such time as I could compose a coherent reply. Part of the problem, I suspect, is that we have multiple questions and considerations all conflated together: What is love? Will AI ever achieve love? If what you love is so fundamentally different from your perception and conception of it, can you truly be said to love that thing? And so on.

Not only should we not refrain from talking about love, I believe we should learn to talk more precisely about love. As made well known in one of Martin Luther King Jr.'s speeches, the Greeks actually had three words for love. One word, "eros", indicated romantic love. Another, "philia", indicated brotherly love, or Platonic love. And the third, "agape", indicated the love of a mother for her child, or of God for us, in a Christian context. This basic, and essential, refinement of meanings is only the start of a consideration of "love", but it's a good start.

As for AI, some very good points have already been made, and they are the same ones that Penrose made in "The Emperor's New Mind". For myself, I think the key difference between decision and calculation lies in the self-reflective nature of human thought and decision. We think about our thoughts. In computer terms, our program programs itself, overwrites itself, constantly. Currently, no programs work this way, no computer algorithms do this (except abortively, when the Windows OS overwrites important system memory and crashes. hehe).
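
Just to make concrete what I mean by a program overwriting itself, here is a toy sketch of my own (plain Python; nothing any real AI actually does): a function that rebuilds its own definition at run time. It only gestures at the idea and is obviously nowhere near genuine self-reflection.

# Toy sketch: a program that rewrites one of its own functions while running.
# Purely illustrative; the "self-overwriting" here is trivial, not self-aware.

def greet():
    return "hello"

def rewrite_greet():
    # Build new source text for greet() and swap it in for the old definition.
    new_source = 'def greet():\n    return "hello, I have rewritten myself"\n'
    exec(new_source, globals())

print(greet())    # prints: hello
rewrite_greet()
print(greet())    # prints: hello, I have rewritten myself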

But even when (if?) programming such as this emerges--and I believe it is fundamentally different from a computer neural network--it is still far from clear that feeling and emotion will arise. Certainly, I believe, self-awareness and cognition will arise from a computer program that is self-referential in this way, but feeling and emotion are not based entirely in self-awareness, to say the least. Animals feel and have emotion, and yet do not think about their thoughts, as far as we can tell (most of them, anyway--I'll leave chimps and dolphins etc. out of this statement). This suggests that emotion is fundamentally a consequence of our biology and incarnation in these bodies and brains that have evolved from simple single-celled organisms in the struggle for survival. Someone who is "in love"--the deep, rose-colored-glasses, stay-up-all-night, ecstatic, romantic state--has fundamentally and profoundly altered brain chemistry and activity relative to someone who is not "in love". The same cannot be said of the same person's brain when he or she has a deep and profound brotherly friendship with a friend or family member. Further, the sexual, physical bond that is an essential part of romantic/erotic love is an additional dimension--an entire world--that is (at least potentially) part of being a sexual, incarnate being.

A robot or computer with AI would lack these two dimensions, unless they were specifically provided for, or unless the scenario were different. For example, the cylons in BSG are really human-cylon hybrids--they are cyborgs. They do have the physical dimension because they stole it from humans. As well, 7 of 9 has a body and brain that started out as fully human. Cameron, on the other hand, lacks this dimension, despite the AI on the mental side.

When we ask "can a robot love?" it really depends on what type of love we are talking about, and on whether or not we have sentience, feeling, and emotion in mind. Love as a great system of values, within which the loved person or object fits, does not require emotion or feeling. In human love, there are many, many cases where people do things out of "love" for another, quite against the inclination of their feelings and emotions. This is agape love, Christian love. We do things for others because it is the right, the brotherly, the "Christian" thing to do. Or because we care for and value the one for whom we do it. This is the very opposite of the feeling-high love of eros.

I can easily conceive of an AI entity, such as a robot with a computer and programming, developing and elaborating such a system of values. Data of Star Trek TNG is a perfect example of this. But he has no emotion, no "feeling" per se (at least until much later!).

Anyway, I think you can see where I am going with this. Can a robot "love"? Well, it depends on what you mean by love. A pure robot would not be able to experience romantic love, but could conceivably manage agape or philia.

Now, any sufficiently sophisticated artificial person could outwardly project a semblance of romantic love, could act the part, but I personally doubt that this seeming would pass for the real thing for very long. Perhaps I'm exhibiting how supremely non-cynical I am in saying this, and I readily admit I may be wrong--it's total speculation here--but even so, I stand by it. I think that without a real core of emotion and physical being and feeling, the illusion would not stand for long. When we love romantically, we are concerned with the inner state of the other person, above all else. In a thousand ways we don't even consciously recognize and conceptualize, we are human mirroring human, human loving human.

A con man (or woman), on the other hand, does have a real mirror of inner to outer. To be successful, the con man or spy or deceiver doesn't just act the part: he or she becomes the part. There is a heart and a heart--a double heart--an analogy for deceit as old as the Psalms. The problem is that other heart, those other motives, those deeper mirrors. So, it's not that the con man doesn't love the woman he is swindling. He does "love" her, if you mean feeling and emotion. He creates those within himself, as all good actors and actresses do. That's totally necessary to the deception! There's a joke that goes like this: one old car salesman says to a young one, "Son, let me tell you the secret to sales. Sincerity. Yes. Sincerity. When you can fake that, you've got it made." LOL!

A con man can do this because of the potential for love, romantic love, that exists within him. A robot would not have this.

But, let's suppose that I'm wrong and a robot could indeed mimic a real lover so well--could act the part so well--that the other person would never know.

What then? Is the love "real"?

Certainly, the lover would have a violent reaction to the revelation that his or her partner was a robot and had no emotions or feelings, but was only playing a part. The lover would feel the lie, the deception, and would very likely think that the love wasn't real, was a lie, a fraud.

And, on further reflection and consideration, I suspect this assessment would remain more or less unchanged, since the love we are talking about here is romantic love.

On the other hand, it is entirely possible that a feeling of brotherly love, of friendship and affection would remain, or reassert itself. If the robot were self-aware and sentient, that would be almost unavoidable, in general. We long for it, even. This is part of the appeal of AI. It'd be great fun to have friends that were programs, I think.

OK. Well, that's all the thoughts I have on the subject, FWIW.
 