Artificial Intelligence - Her

As far as I’m concerned, everything is energy of varying types and speeds, agreed. I find truth in all the religions, and if all that truth were properly retrieved and combined, a complete picture of us might emerge.

A soul is like a PC hard drive here in this dimension; the brain is like the processor and RAM; the peripherals, such as the keyboard, monitor, and speakers, are like the senses of our fleshy bodies, relaying information from one part to another. In the dimensional array I mentioned, our souls operate outside of our fleshy body without needing it at all; that array is a whole n’other ballpark.

The question is: if the soul operates outside the realm of the fleshy body, what is the relation of that outside soul to consciousness? Is the difference enough to legitimize the distinction made between consciousness and soul?

What makes for the semantic distinction? The American Indians believe in the Big Soul in the sky, a belief in which this distinctive feature is mitigated, since there is no dual aspect between inside and outside the fleshy body. Is this all about semantics?

The point of the computer analogy is well taken, but isn’t that an example of tying up a post-inductive realization with a former undifferentiated archetype? Does the simulated analogy of using a computer as an extension of the brain not directly imply that intelligence and artificial intelligence are not merely analogous (in the sense of hard and soft drive features), but more probably identical aspects of the same thing?

The soul houses consciousness and all memories. Is that what you’re asking? There is no consciousness outside of a soul, concerning us as beings.

Partly, yes. But more to the point: if that’s the case, then why does the content change when it is outside its individual embodied form? For isn’t this the sticking point: that after embodiment, the exact arrangement breaks up and disperses randomly?

I’m not following.

Wendy, your question invites an answer more considered than an off-the-cuff, impromptu casual remark. So give me some time until tonight. In other words, your question implies more profundity than my narrative.

In other words, does the analogous relationship between the different functions of the hard and soft drives and the way the brain itself works (and the consciousness that is supposedly pan-conscious) validate the comparison that you brought up?

To give an example from Hinduism: most Brahmins will agree on Satori being an outcome of the effects of Karmic Law.
Satori is achieved when certain conditions are met, particularly in regard to egolessness and loss of identity.

Can it at least be ventured to inquire about the probability that AI and its construction can take up the slack and continue the process toward de-individuation, so as to achieve Satori during one’s lifetime?

Is Satori remembering one’s essence?

Oh yes, let’s get spirited.

Yup, I don’t think I could do it. Which is why the thought experiment I mentioned hinged on unknowingly carrying on a relationship with an AI. Take someone who’s real(?), imagine that THEY are AI. It seemed like a good jumping off point to get a sense of how non-trivial this issue could be - given that tech is likely to get there with AI on a long enough timeline.

This shit is getting too scary for me, and the spirits are wearing off, man.

I guess when you think about it, we all already know that there are probably more amazing people out there than we’ve met so far - that the people you’ve met could’ve met someone else more amazing than yourself. We could have more amazing friends, gfs, bfs, etc. And so could our friends. It’s lucky if we get an opportunity with someone who’s “out of our league.” Does the average guy/girl currently know or think somewhere deep down that they’re only wanted in so far as they’re available and good enough for someone else? Will AI just take this insecurity to the extreme? Or are we talking about another category of emotional trauma?

Is it inevitable?

I think Satori is a culminating effort between remembering and forgetting various stages of one’s essential nature, where an apex is reached at which neither state can be reassembled nor disassembled, because IT becomes neither a forgetting of the past nor a re-construction of the future using the past as a model.

So it is not a re-construction, not a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains Itself in time.

Understanding what to be what?

Understanding that neither the embedded nor the technological information is exclusively pre-eminent, whether in the subscribed past-to-present development or in some post-scribed modeling on a reconstructed paradigm.

It is a many-layered, de facto, imminent appraisal based on the most reliable estimates available, largely irrespective of exclusive content.

The layering of levels may have veritable lapses of connectivity that flow from either the past or the future into the present.

They become more assumptive than derivable in terms of either one or the other, placing the intelligence on more uncertain ground. Quantum intelligence has more of a cut-off reality as to where it comes from and to what end it may be applicable.

This is the rationale used by analysts to be able to say that yes, such computers have gross power potential, but uncertain applicability.

Not sure about the insecurity part. I get that when the “best version of X” is centralized/personalized, it could lead to a feeling of not being wanted/needed, which is something many of us already feel. A large portion of people are currently needed because of families. Children needing parents; we often need those closest to us to fulfill needs, with regard to services that are not centralized. Storytelling and music are already centralized or server-based as “the best versions of those things,” so we no longer NEED those closest to us, in our homes, village, etc., to provide those things. While we still need the emotional connection of touching, talking, and help with daily challenges from our family members and friends by dint of proximity and physics, these sensations/services could also eventually be centralized or simulated as “best versions.”

In any case, I’ve noticed that with the advent of the Internet and smartphones, I’m needed for ever fewer things by an expanding group of people close to me; this seems to be speeding up, not slowing. And also, I need fewer things from people. So, yes, it’s quite possible children will no longer NEED their parents for feelings of security and emotional well-being b/c AI will handle it as well or better; this is the most profound example I can think of. I’m probably the best musician within a mile of where I live, but because of centralization and technology, I’m not needed for that, and I pale severely in comparison to what’s available at your fingertips, the .01% of talent, the extreme outliers, winner takes all. Thus, instead of being a musician, I’m in marketing.

Is it (not being needed by any living creature) inevitable? There’s no reason why what I described MUST happen. Many scenarios could prevent it. At most, I’d say it’s likely to happen, and we’re on that path. That it even COULD happen gives me pause. I’m also not sure if people will feel “sad” if it happens. So many unknowns.

It’s also possible good things can happen. Let’s take a walk on the bright side: The most self-evident beauty in the human experience is in the simple things that you experience as a child – if you’re lucky enough to be healthy and have nice parents & friends – the pure, unsuppressed emotions, laughter, fun, exploration, camaraderie, love, friendship, & building things out of passion, not duty. My hope is that machines remove the repetitive tasks of the unsheltered world as well as greatly reduce the unfair, needless hardships and suffering in life for so many, so that our calloused defenses of life’s violences that distort our minds and hearts melt away…and we can feel childlike joy while having the minds of sages. I hope AI can help us achieve that, without taking it a step too far and digitizing/personalizing everything we consume through our five senses, in pursuit of a sort of heaven. But again, maybe the latter is best. Hard to say.

Can AI cure skin starvation? No way!

psychologytoday.com/us/blog … can-do-you


I wouldn’t classify it as significant unless it has a reproductive system. With that said, it will only be on par with an advanced cyber pet: comfort brought on by entertainment, and a convenient distraction away from the real, replacing that which is lacking with a plug that maintains no life-affirming objective. For those who seek not to reproduce, I personally don’t see it as a problem as such, because those who do seek to reproduce will never resort to AI…

If it improves people’s living standards before they inevitably pass…

why not, who cares?

You cannot have sex with A.I.; sexual intercourse is between two biologically compatible beings, resulting in reproduction. At the very most, when it comes to AI, sex would be an artificial act that “sexually” stimulates the human body into an aimless release.

If the human is not fit enough to bring forth itself within the world then it will only be exploited by the technologies of other men.

Look how peaceful this A.I. is compared to the human.

[youtube]https://www.youtube.com/watch?v=gv5nBLcbqzs[/youtube]

How typical.
We will be irrelevant.

I just found I missed the answer to this question.

I think Satori is not remembering your particular essence, but being able to let go of it.

Even to the point of sustaining a hope that even a permanent record or memory is forgotten.

Remember Saint James? He proved that there is always an identical You to every being. I didn’t really understand it, PM’d him, and he proved it mathematically; I was going to involve him more, because the math was too difficult. I would like to dig it up with hours of searching, but that came to no end.

His point was that you can’t ever really let go because there is always a copy.

Yes, I agree. It is quite difficult to have an AI behave exactly like a human though.

It is right now, but it’s getting easier every day.

Humans don’t behave so great. Go grocery shopping and see how people behave.

I’m not knocking people, they’re FINE.

But the Luddite statement that it’s hard RIGHT NOW is forgetting that trying to figure out what’s possible is a rapidly moving target.
The future has a way of slowly erupting into the present. Reverse engineering human behavior is not only hard, it’s stupid.
We’d want to engineer better-than-human behavior, because otherwise, we’d just keep dealing with each other, which clearly is a failing enterprise.
We pass by each other, millions of us, and instead of smiling and hugging, we ignore each other, with a sad solemn look on our faces.
That’s how humans generally behave, and there are worse examples, much worse.

Yes, but the opposite is true as well. We pass by each other with impassive simulations of self-pity, until the energy, or lack of it, bursts the dams of understanding, and a war breaks out.
First a war of words, then, when that doesn’t work out, actual wars.
After that, a peace seems appropriate, installing the necessity of smiling faces and a happy society, meaning: dare not pass your fellow man with a drawn face, for that would be a disservice to the innocents who posed as heroes and sacrificed so much?

The sacrifice, of course, is an instituted code of internalization of the transformatively processed chance of loss of control, where the leaders see higher value in social control from the top than the uniform code of military justice appears to embrace, to enhance the value of victims of those who need to enforce them.

The reverse, indicated by frowns in anonymous situations, breaks up into an immediate adoption of smiling faces, as if out of the need to show graces.

That kind of justice begs the underpinnings of military justice and pits it against social injustice.

The romantic idiom and its existential meltdown into bad faith is merely a contraption to bottle that up, and to keep it high above a reality whose soul the machine extracted via a Faustian trick.

Good Show!

That the show seems to get better through better adoption of higher machined forms of simulation goes with the argument for, and not against, the one-dimensional man:

Alfred E. Neuman need not worry, for he has faith in Big Brother, and as its similitude nears the real thing, the difference will become unnoticeable and the Faustian bargain will become perfected. No way to wiggle out, and no need, since the product and its means of production will become invariably tied on a planetary stage.

PS: sometimes you have to say progressively more to try to glean reductively less meaning out of it.