WendyDarling wrote:I'm not following.
Fuse, two things. One, you're absolutely right; let's all get in the spirit of extrapolating, and not merely getting hung up on the likely fleeting failures of today. Flying machines crashed until the Wright brothers came along. In science, thinking big is equal to thinking rationally.
Gamer wrote:Once we KNOW this isn't happening in the bot, it becomes hard to pretend that this connection is legitimate.
Whereas it's much easier to pretend the connection is legitimate with an actual person.
Gamer wrote:Now let's get even scarier. What if, suddenly, I'm no longer needed for sex, love, affection, friendship? That's a problem. Because while half of my being is all about RECEIVING those things,
an equally important half is about feeling NEEDED for those things.
Gamer wrote:I'm using the word NEED, and it's a carefully chosen word. When a far-superior replacement becomes a genuine option, you are no longer NEEDED. It's quite possible to still be WANTED. But that's a precarious status. When that happens, there won't be a human being on earth who needs you for anything. They may still care if you live or die, but not for any reason other than quaint sentiment. And even that will dissipate after a while.
Gamer wrote:Imagine a world where you only ever come into contact with AI, and no actual human NEEDS you. That's where we're heading. That's what sucks about AI and Her. It's sad.
WendyDarling wrote:Is Satori remembering one's essence?
So it is not a re-construction, not a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.
WendyDarling wrote:So it is not a re-construction, not a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.
Understanding what to be what?
Fuse wrote:Will AI just take this insecurity to the extreme? Or are we talking about another category of emotional trauma?
Fuse wrote:Is it inevitable?
I read everything I can on skin hunger. There isn't a lot out there; no one is studying it, which I think is a mistake.
A big thing I feel as a result of skin hunger is a sense of being unlovable. I associate contact with a certain sense of concern/love/intimacy; go long enough without being touched and you start to feel unworthy of these things. No one loves you or is concerned about you; no one wants to be close to you. This is my main problem. I feel "less than," and it does a number on my mental state (depression) and my self-confidence/image.
I think the worst part about skin hunger is that there is no viable quick fix. Nowadays touch requires a long-term investment in a relationship, which may be difficult to bring about with a diminished level of confidence. Simply put, if I feel worthless it's hard to build a touch relationship with a friend/partner that could help resolve those feelings.
I've considered going to an "escort" to fulfill my touch needs. I never did it, first because of the illegality and second because it would be fake. Touch without emotional connection feels like pity.
You could also see a sex-surrogate therapist, but that is expensive.
I just don't know. As a man it may be unstereotypical, but I have this intense need to touch and be touched. Every day I wake up feeling worthless, unenthusiastic, and depressed, and because men are supposed to be tough, I don't have anyone I can talk about these feelings with safely and comfortably.
Bottom line: skin hunger sucks.
Skin hunger does suck
Submitted by Kory Floyd Ph.D. on September 1, 2013 - 11:55am
Thank you so much for writing. I certainly empathize with how you feel; this is exactly what skin hunger is. You're not atypical at all; millions of Americans feel what you feel to some degree (and men report more skin hunger than women do, on average). I've spent my whole career studying affection, so I know how important it is and how detrimental it feels when you don't get enough.
You're right that there aren't quick fixes, but that doesn't mean there aren't solutions. If you stay tuned to my blog, I'll be discussing in the coming weeks what people who are hungry for affection can do. For now, know that you're not alone and that your need for touch and affection is not only normal but healthy.
Erik_ wrote:What do you think about the concept of an artificial intelligence that is designed to be a significant other?
Gamer wrote:It's not only very possible for humanlike AI to be designed, but plausible that humans will adopt it to meet all manner of needs, physical, emotional, etc, to varying degrees of success.
We turn to actual humans to "varying degrees of success," too.
It can and will happen. I can support that with clear enough arguments but won't do that here. It's a great philosophical question, whether this SHOULD happen.
Like any question on the ought side of the is/ought divide, it helps to add the word "if."
We ought to do X, if we want Y.
Well, it comes down to what people will want, and like anything else, there are healthy wants, unhealthy wants, and health-neutral wants that come down to taste.
I can make a clear argument that a large part of the spectrum of AI stand-ins for human relationships will fall under the "health-neutral" category.
All that's left is for a third party to subjectively judge if the dynamic is "sad," "fine," "repugnant," "acceptable," "beautiful," etc.
Within dreams, I sometimes meet people who don't exist, and I feel an emotional bond with that entity while I'm dreaming. I don't find this sad until I wake up.
On reflection, the entity was real: it was a vestige of some inner component of my mind that for a moment was separate and discrete from my conscious first-person awareness.
When we project our desires and perceptions onto an AI, we may be doing something similar, combined with the fact that an AI can be an extension of human traits, and is thus
a way to connect to humanity, albeit indirectly, through a substrate.
How often have you felt intimacy with your favorite author? We connect to souls and ethos through artificial substrates all the time.
I understand the informal fallacy that kicks in when we deny the possibility of lifelike AI that we can fall in love with, etc. We are afraid because it's weird, grotesque. We render ourselves instant fools. Lunatics talking to dolls. There's something we naturally find pathetic about this. But so much of the human condition is already weird and grotesque, and pathetic. Consider that the only people you ever know are actually projections in your mind, reconstructions of only a tiny part of the reality of the source being, assuming a source being even exists that's in any way similar to what you think it is or want it to be. At least with AI we
gain some measure of control.
Humans often have illusions and world views foisted upon them. They are blind to the origins of their epistemology. It's a philosopher's job to knowingly choose her illusion & embrace it in the spirit of a Gamer. We do it all the time. Some of us will indeed choose to love AI, in the way Berkeley, Wittgenstein, Sisyphus, analytic philosophers and existentialists, or any of the great solipsists choose to play along, feel, live, and love, and somehow get by as normal people in a sea of abstraction. Just as some of us who know better can choose, like Kierkegaard, Tolstoy, William James, etc., to love God. I don't know where I'll wind up, but I'm not naive about the eventuality of a genuine choice heading my way, and yours.
It is quite difficult to have an AI behave exactly like a human, though.
Gamer wrote:It is quite difficult to have an AI behave exactly like a human, though.
It is right now, but it's getting easier every day.
Humans don't behave so great. Go grocery shopping and see how people behave.
I'm not knocking people, they're FINE.
But the Luddite statement that it's hard RIGHT NOW forgets that what's possible is a rapidly moving target.
The future has a way of slowly erupting into the present. Reverse engineering human behavior is not only hard, it's stupid.
We'd want to engineer better-than-human behavior, because otherwise, we'd just keep dealing with each other, which clearly is a failing enterprise.
We pass by each other, millions of us, and instead of smiling and hugging, we ignore each other with sad, solemn looks on our faces.
That's how humans generally behave, and there are worse examples, much worse.