Artificial Intelligence - Her

For discussing anything related to physics, biology, chemistry, mathematics, and their practical applications.

Moderator: Flannel Jesus

Re: Artificial Intelligence - Her

Postby WendyDarling » Sat Mar 17, 2018 5:51 pm

I'm not following.
I AM OFFICIALLY IN HELL!

I live my philosophy, it's personal to me and people who engage where I live establish an unspoken dynamic, a relationship of sorts, with me and my philosophy.

Cutting folks for sport is a reality for the poor in spirit. I myself only cut the poor in spirit on Tues., Thurs., and every other Sat.
WendyDarling
Heroine
 
Posts: 6980
Joined: Sat Sep 11, 2010 8:52 am
Location: Hades

Re: Artificial Intelligence - Her

Postby Meno_ » Sat Mar 17, 2018 6:04 pm

WendyDarling wrote:I'm not following.

Wendy, your question invites an answer rather than an off-the-cuff casual remark, so give me some time until tonight. In other words, your question implies more profundity than my narrative.
Black Sun
Meno_
Philosopher
 
Posts: 3326
Joined: Tue Dec 08, 2015 2:39 am
Location: Mysterium Tremendum

Re: Artificial Intelligence - Her

Postby Meno_ » Sat Mar 17, 2018 8:31 pm

In other words, do the different functions of the hard and soft drives stand in an analogous relationship to the way the brain itself works (and to the consciousness that is supposedly pan-conscious), and does that validate the comparison you brought up?

To give an example from Hinduism: most Brahmins will agree that Satori is an outcome of the effects of Karmic Law.
Satori is achieved when certain conditions are met, particularly in regard to egolessness and loss of identity.

Can it at least be ventured to inquire about the probability that AI and its construction can take up the slack and continue the process toward de-individuation, so as to achieve Satori during one's lifetime?
Black Sun
Meno_
Philosopher
 
Posts: 3326
Joined: Tue Dec 08, 2015 2:39 am
Location: Mysterium Tremendum

Re: Artificial Intelligence - Her

Postby WendyDarling » Sat Mar 17, 2018 9:43 pm

Is Satori remembering one's essence?
I AM OFFICIALLY IN HELL!

I live my philosophy, it's personal to me and people who engage where I live establish an unspoken dynamic, a relationship of sorts, with me and my philosophy.

Cutting folks for sport is a reality for the poor in spirit. I myself only cut the poor in spirit on Tues., Thurs., and every other Sat.
WendyDarling
Heroine
 
Posts: 6980
Joined: Sat Sep 11, 2010 8:52 am
Location: Hades

Re: Artificial Intelligence - Her

Postby fuse » Sat Mar 17, 2018 11:51 pm

Gamer wrote:Fuse, two things. One, you're absolutely right; let's all get in the spirit of extrapolating, and not merely getting hung up on the current and likely fleeting failures of today. Flying machines crashed until the Wright brothers came along. In science, thinking big is equal to thinking rationally.

Oh yes, let's get spirited.

Gamer wrote:Once we KNOW this isn't happening in the bot, it becomes hard to pretend that this connection is legitimate.
Whereas it's much easier to pretend the connection is legitimate with an actual person.

Yup, I don't think I could do it. Which is why the thought experiment I mentioned hinged on unknowingly carrying on a relationship with an AI. Take someone who's real(?) and imagine that THEY are AI. It seemed like a good jumping-off point for getting a sense of how non-trivial this issue could be, given that tech is likely to get there with AI on a long enough timeline.

Gamer wrote:Now let's get even scarier. What if, suddenly, I'm no longer needed for sex, love, affection, friendship? That's a problem. Because while half of my being is all about RECEIVING those things,
an equally important half is about feeling NEEDED for those things.

This shit is getting too scary for me, and the spirits are wearing off, man.

Gamer wrote:I'm using the word NEED, and it's a carefully chosen word. When a far-superior replacement becomes a genuine option, you are no longer NEEDED. It's quite possible to still be WANTED. But that's a precarious status. When that happens, there won't be a human being on earth who needs you for anything. They may still care if you live or die, but not for any reason other than quaint sentiment. And even that will dissipate after a while.

I guess when you think about it, we all already know that there are probably more amazing people out there than we've met so far, and that the people we've met could've met someone more amazing than ourselves. We could have more amazing friends, gfs, bfs, etc. And so could our friends. It's lucky if we get an opportunity with someone who's "out of our league." Does the average guy/girl currently know, or think somewhere deep down, that they're only wanted insofar as they're available and good enough for someone else? Will AI just take this insecurity to the extreme? Or are we talking about another category of emotional trauma?

Gamer wrote:Imagine a world where you only ever come into contact with AI, and no actual human NEEDS you. That's where we're heading. That's what sucks about AI and Her. It's sad.

Is it inevitable?
fuse
Philosopher
 
Posts: 4539
Joined: Thu Jul 20, 2006 5:13 pm

Re: Artificial Intelligence - Her

Postby Meno_ » Sun Mar 18, 2018 3:19 am

WendyDarling wrote:Is Satori remembering one's essence?

I think Satori is a culminating effort between remembering and forgetting various stages of one's essential nature, where an apex is reached at which neither state can be reassembled nor disassembled, because IT becomes neither a forgetting of the past nor a re-construction of the future using the past as a model.

So it is not a re-construction, nor a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.
Black Sun
Meno_
Philosopher
 
Posts: 3326
Joined: Tue Dec 08, 2015 2:39 am
Location: Mysterium Tremendum

Re: Artificial Intelligence - Her

Postby WendyDarling » Sun Mar 18, 2018 7:50 pm

Meno_ wrote:So it is not a re-construction, nor a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.

Understanding what to be what?
I AM OFFICIALLY IN HELL!

I live my philosophy, it's personal to me and people who engage where I live establish an unspoken dynamic, a relationship of sorts, with me and my philosophy.

Cutting folks for sport is a reality for the poor in spirit. I myself only cut the poor in spirit on Tues., Thurs., and every other Sat.
WendyDarling
Heroine
 
Posts: 6980
Joined: Sat Sep 11, 2010 8:52 am
Location: Hades

Re: Artificial Intelligence - Her

Postby Meno_ » Mon Mar 19, 2018 1:01 am

WendyDarling wrote:
So it is not a re-construction, nor a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.

Understanding what to be what?


Understanding neither the embedded nor the technological information to be exclusively pre-eminent, whether in development subscribed from past to present or in some post-scribed modeling on a reconstructed paradigm.

It is a many-layered, de facto, imminent appraisal based on the most reliable estimates available, mainly irrespective of exclusive content.

The layering of levels may have veritable lapses of connectivity that flow from either the past or the future into the present.

They become more assumptive than derivable in terms of either one or the other, placing the intelligence on more uncertain ground. Quantum intelligence has more of a cut-off reality as to where it comes from and to what end it may be applicable.

This is the rationale analysts use to say that yes, such computers have gross power potential, but uncertain applicability.
Black Sun
Meno_
Philosopher
 
Posts: 3326
Joined: Tue Dec 08, 2015 2:39 am
Location: Mysterium Tremendum

Re: Artificial Intelligence - Her

Postby Gamer » Tue Mar 20, 2018 5:07 pm

Fuse wrote:Will AI just take this insecurity to the extreme? Or are we talking about another category of emotional trauma?


Not sure about the insecurity part. I get that when the "best version of X" is centralized/personalized, it could lead to a feeling of not being wanted/needed, which is something many of us already feel. A large portion of people are currently needed because of families: children need parents, and we often need those closest to us to fulfill needs with regard to services that are not centralized. Storytelling and music have become centralized, server-based "best versions of those things," so we no longer NEED those closest to us, in our homes, village, etc., to provide them. While we still need the emotional connection of touching, talking, and help with daily challenges from our family members and friends by dint of proximity and physics, these sensations/services could also eventually be centralized or simulated as "best versions."

In any case, I've noticed that with the advent of the Internet and smartphones, I'm needed for ever fewer things by an expanding group of people close to me; this seems to be speeding up, not slowing. And also, I need fewer things from people. So, yes, it's quite possible children will no longer NEED their parents for feelings of security and emotional well-being b/c AI will handle it as well or better; this is the most profound example I can think of. I'm probably the best musician within a mile of where I live, but because of centralization and technology, I'm not needed for that, and I pale severely compared to what's available at your fingertips, the .01% of talent, the extreme outliers, winner takes all. Thus, instead of being a musician, I'm in marketing.

Fuse wrote:Is it inevitable?


Is it (not being needed by any living creature) inevitable? There's no reason why what I described MUST happen. Many scenarios could prevent it. At most, I'd say it's likely to happen, and we're on that path. That it even COULD happen gives me pause. I'm also not sure if people will feel "sad" if it happens. So many unknowns.

It's also possible good things can happen. Let's take a walk on the bright side: The most self-evident beauty in the human experience is in the simple things that you experience as a child – if you're lucky enough to be healthy and have nice parents & friends – the pure, unsuppressed emotions, laughter, fun, exploration, camaraderie, love, friendship, & building things out of passion, not duty. My hope is that machines remove the repetitive tasks of the unsheltered world as well as greatly reduce the unfair, needless hardships and suffering in life for so many, so that our calloused defenses of life's violences that distort our minds and hearts melt away...and we can feel childlike joy while having the minds of sages. I hope AI can help us achieve that, without taking it a step too far and digitizing/personalizing everything we consume through our five senses, in pursuit of a sort of heaven. But again, maybe the latter is best. Hard to say.
Gamer
ILP Legend
 
Posts: 2143
Joined: Mon May 31, 2004 5:24 pm

Re: Artificial Intelligence - Her

Postby pilgrim-seeker_tom » Wed Mar 21, 2018 4:14 am

Can AI cure skin starvation? No way!


https://www.psychologytoday.com/us/blog ... can-do-you

I read everything I can on skin hunger. There isn't a lot out there; no one is studying it, which I think is a mistake.

A big thing I feel as a result of skin hunger is a sense of being unlovable. I associate contact with a certain sense of concern/love/intimacy; go long enough without being touched and you start to feel unworthy of these things. No one loves you, or is concerned about you; no one wants to be close to you. This is my main problem. I feel "less than," and it does a number on my mental state (depression) and my self-confidence/image.

I think the worst part about skin hunger is that there is no viable quick fix. Nowadays touch requires a long-term investment in a relationship, which may be difficult to bring about with a diminished level of confidence. Simply put, if I feel worthless it's hard to build a touch relationship with a friend/partner that could help resolve those feelings.

I've considered going to an "escort" to fulfill my touch needs. I never did it, first because of the illegality, and second because it would be fake. Touch without emotional connection feels like pity.

You could also use a sex-surrogate therapist, but that is expensive.

I just don't know. As a man it may be non-stereotypical, but I have this intense need to touch and be touched, and every day I wake up feeling worthless, unenthusiastic, and depressed, and because men are supposed to be tough, I don't have anyone to talk about these feelings safely and comfortably with.

Bottom line: skin hunger sucks.

Reply to andrew wilson
Skin hunger does suck
Submitted by Kory Floyd Ph.D. on September 1, 2013 - 11:55am
Thank you so much for writing. I certainly empathize with how you feel--this is exactly what skin hunger is. You're not atypical at all; millions of Americans feel what you feel to some degree (and men report more skin hunger than women do, on average). I've spent my whole career studying affection, so I know how important it is and how detrimental it feels when you don't get enough.

You're right that there aren't quick fixes, but that doesn't mean there aren't solutions. If you stay tuned with my blog, I'll be discussing in the coming weeks what people who are hungry for affection can do. For now, know that you're not alone and that your need for touch and affection is not only normal but healthy.
"Do not be influenced by the importance of the writer, and whether his learning be great or small; but let the love of pure truth draw you to read. Do not inquire, “Who said this?” but pay attention to what is said”

Thomas Kempis 1380-1471
pilgrim-seeker_tom
Philosopher
 
Posts: 1858
Joined: Sun Dec 17, 2006 11:16 am
